The Crypto Comms #49: Notes and Other Stuff

Chris on Crypto
Apr 11, 2023

Artificial intelligence is all the rage nowadays. Arguments tend to range from ‘everyone’s going to be jobless’ to dismissive commentary about generative language models along the lines of ‘it makes mistakes, therefore it’s useless!’

There’s little to no editorial oversight in this write-up, hence the multiplicity of topics. If there’s one point I’d like to get across, it’s this: step outside your echo chamber and look around every so often.

In this issue

  • Digital scarcity
  • AI perspectives
  • Government overreach
  • Bitcoin update
  • Listening material

Digital scarcity

Extreme takes tend to attract more scrutiny, and rightly so, since they’re often incorrect. After all, scrutiny is the mechanism by which we discern whether something is true or false. For example, hard-line bitcoiners believe there will only be one winner in a highly inflationary fiat world, not a basket of hard assets such as Litecoin, gold and silver. ‘You can’t create digital scarcity twice’, they say. This stance is popular, yet it falls apart under modest scrutiny, as it doesn’t take into account historical phenomena such as the gold-silver relationship, or the fact that people store value in playing cards, collectibles, classic cars, property and a whole list of other items. While there are certainly arguments for certain stores of wealth over others, it’s not as cut and dried as some would argue.

Gold & Silver surge in tandem throughout history

After all, scarcity markets exist across multiple assets in the physical domain, so why would virtual assets (Counter-Strike skins, for instance) be an exception? Once you’ve meditated on this question, you’ll notice that the main difference in the virtual domain is largely bearer-asset transaction speed, for which Litecoin ironically wins out, since its blocks are produced every two and a half minutes instead of every ten. Regardless, that’s not the point; the point is that both historical precedent and human nature trend towards multiple winners with exceptional fundamentals, not one.
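For a rough sense of what that block-time difference means in practice, here’s a minimal Python sketch using the target intervals mentioned above. The six-confirmation depth is just a common convention I’ve assumed for illustration, and real intervals drift with hashrate and difficulty adjustments.

```python
# Back-of-the-envelope comparison of average block cadence, using the target
# intervals mentioned above (Bitcoin ~10 minutes, Litecoin ~2.5 minutes).
# Real intervals drift with hashrate and difficulty; this is illustrative only.

BLOCK_INTERVAL_MIN = {"bitcoin": 10.0, "litecoin": 2.5}

def blocks_per_hour(chain: str) -> float:
    """Average number of blocks produced per hour for a given chain."""
    return 60.0 / BLOCK_INTERVAL_MIN[chain]

def expected_wait_minutes(chain: str, confirmations: int) -> float:
    """Rough expected wait, in minutes, for a given confirmation depth."""
    return confirmations * BLOCK_INTERVAL_MIN[chain]

for chain in BLOCK_INTERVAL_MIN:
    print(f"{chain}: ~{blocks_per_hour(chain):.0f} blocks/hour, "
          f"~{expected_wait_minutes(chain, 6):.0f} min for 6 confirmations")
```

Bitcoin works out to roughly 6 blocks an hour against Litecoin’s 24, which is all the ‘transaction speed’ point above amounts to.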

AI perspectives

Absolutist narratives are also present in the emerging AI landscape. Over-the-top claims about ‘artificial general intelligence’ (AGI) are generally fan service aimed at garnering the attention of people who haven’t read a single informative article about AI, yet have seen the Terminator films. Over the years, I’ve learned that the middle ground tends to be an accurate position to take when new technology enters mainstream awareness. The expectation of rampant job loss due to ‘automation’ is a tale as old as time, recycled with each innovation: the combine harvester, the car, the printing press, the internet, and so on and so forth.

Real progress, which doesn’t involve intermittent venture-capital pump-and-dump schemes, is generally two-fold: on the one hand it renders redundant the basic jobs that people would rather not do, and on the other it creates new jobs and avenues where people can learn new skills and improve their value proposition to society. After all, you can’t go wrong with investing in yourself. Innovation offers this opportunity.

For centuries, we’ve deployed automation and thrived as a consequence. The proliferation and integration of AI language models into society at large will be no different, probably. Still, this time-tested truism hasn’t stopped the same dinosaur organisations from predicting the imminent obsolescence of human beings.

In 2016, PricewaterhouseCoopers (PwC) said the following in a report:

[T]he estimated proportion of existing jobs at high risk of automation by the early 2030s varies significantly by country. These estimates range from only around 20–25% in some East Asian and Nordic economies with relatively high average education levels, to over 40% in Eastern European economies where industrial production, which tends to be easier to automate, still accounts for a relatively high share of total employment.

The now-infamous World Ecommunist Forum had a similar take at the time, saying that 40% of jobs were at risk due to automation.

The question that arises is: what does automation actually mean, and why are global mega-organisations all too eager to jump on the doomsday bandwagon time and again?

  • Does it mean that a human is replaced by a machine and must now be on welfare (UBI)?
  • Does it mean that this human must find a new job with the same employer?
  • Does it mean that the human’s job is now augmented with machine tasks, and that he can learn and develop new skills?

These are pertinent questions which are never really addressed, except with the catch-all phrase of ‘automation’. These reports and studies do not define what automation means for aggregated labour markets. Instead, they simply push the idea that ‘you are now obsolete and will be on food stamps soon’, which is never really the case. Similar to claims of AGI, AI automation can mean anything depending on researcher subjectivity, and therefore nothing. Doom studies amount to researchers eyeballing what they think can and cannot be automated, based on their own subjective take and without any consideration for human agency.

But it’s not all bad; far from it. Often enough, the researchers’ tone gives clues as to whether a report is worth considering. For example, this Goldman Sachs report avoids hysterics and takes the reasonable position that new technology leads to job creation.

Technology can replace some tasks, but it can also make us more productive performing other tasks, and create new tasks — and new jobs. GS cites research that finds “60% of workers today are employed in occupations that didn’t exist in 1940, implying that over 85% of employment growth over the last 80 years is explained by the technology-driven creation of new positions.” The GS operating assumption here is that GenAI will substitute for 7 percent of current US employment, complement 63 percent, and leave 30 percent unaffected.

Note the radical difference between this report and the viewpoints of the WEF and co. on the emergence of new AI constructs in relation to the job market.
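To put that GS split in rough headcount terms, here’s a back-of-the-envelope Python sketch. The figure of roughly 160 million employed Americans is my own assumption for illustration and does not come from the report; only the 7/63/30 percentages are quoted above.

```python
# Illustrative breakdown of the GS operating assumption quoted above:
# 7% of current US employment substituted, 63% complemented, 30% unaffected.
# The total headcount below is an assumed round figure, not from the report.

total_us_employment = 160_000_000  # assumed, for illustration only

shares = {
    "substituted by GenAI": 0.07,
    "complemented by GenAI": 0.63,
    "unaffected": 0.30,
}

# Sanity check: the three shares should cover the whole labour market.
assert abs(sum(shares.values()) - 1.0) < 1e-9

for label, share in shares.items():
    workers = share * total_us_employment
    print(f"{label}: ~{workers / 1e6:.0f} million workers")
```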

Government overreach

As with all things, it boils down to motives: who is developing what technology, and for what purpose. For instance, while ChatGPT is both a playful and useful iteration of generative AI technology for creatives, content creators, programmers and so on, in the hands of today’s overreaching governments it will likely be used as a tool to enforce government-stamped narratives.

Government-controlled AI is a problem since governments have been at odds with those they’re sworn to protect and serve for at least three years by my count.

Indeed, there has yet to be a baseline course correction following the human rights abuses of the 2020s, committed under the pretext of a pandemic that was akin to a bad flu season. While some corrections have started in the Netherlands, New Zealand, France and elsewhere, far more is needed to restore faith in governments, if that’s even possible. On a positive note, a White House press release on April 10th officially ended the fake ‘national emergency’ in the United States.

But this is not enough.

Normal people endured three years of totalitarian hell: they lost jobs, were barred from restaurants and supermarkets, were encouraged to spy and rat on one another, were treated as pariahs, mocked and ridiculed, were obliged under threat of law enforcement to attend court summons for not wearing a piece of cloth on their faces, were robbed of their right to free assembly, and were flatly discriminated against for exercising their right to bodily autonomy. Not to mention the $6 trillion-plus in cash handouts which created the biggest bubble ever assembled in the history of corporatism. Meanwhile, those politically connected to the money printer made billions on the back of this misery.

Tell me, is deep distrust of government and corporations paranoia, or is it born out of real concerns and an unwillingness to simply ‘forget’ what was done to us? Think about it: if there is no tangible course correction, what’s to stop these bad decision-makers from doing it all over again? Absolutely nothing.

Lucius Annaeus Seneca famously said that ‘to err is human, but to persist is diabolical’.

Bitcoin update

At the time of writing, Bitcoin/Dollar has reached the $30,000 mark, a price not seen since June 2022, when the Luna Ponzi scheme blew up. Bitcoin has officially entered its prior bull-market trading range after successfully back-testing its long-term cost basis.

Bitcoin has established a higher-low, higher-high price structure in what has become a textbook bull market. The above context is not gospel truth, and price can revisit sub-$30,000 levels at any moment. Considering that this asset is moving higher against a backdrop of uncharted macro-economic territory, high-conviction, time-based price predictions are challenging at the moment. Bitcoin is up 94% from the lows, and it’s prudent to wait for a period of consolidation before penning a medium-term flight trajectory.
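As a quick sanity check on that 94% figure, here’s a minimal Python sketch. The cycle low of roughly $15,500 from late 2022 is my own assumption for illustration; the text above only cites the $30,000 level and the percentage gain.

```python
# Rough check of the "up 94% from the lows" figure. The cycle low used here
# (~$15,500, late 2022) is an assumption for illustration; only the $30,000
# level and the 94% gain are cited in the text above.

assumed_cycle_low_usd = 15_500
current_price_usd = 30_000

gain_pct = (current_price_usd / assumed_cycle_low_usd - 1) * 100
print(f"Gain from assumed cycle low: {gain_pct:.0f}%")  # ~94%
```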

That said, I’m fairly confident that a low-six-figure bitcoin and a low-four-figure Litecoin are likely conservative estimates for this bull market cycle.

Cheers.

Listening material

Dear readers,

The purpose of this newsletter analysis is to provide context on current events and cryptocurrency markets. It is released every Tuesday. I am not perfect and this is not a science; nor is this newsletter a signals service or financial advice. While I cannot promise perfection, I do my best to be honest and transparent.

Thank you for reading.

Contact

Contact me with feedback at contact@chrisoncrypto.com.
Join the Telegram channel for live updates.
Follow me on Twitter.

Get in touch. Let’s work together.
Telegram: @chrisoncrypto

Support

You can show your support through the Bitcoin Lightning network. Kindly consider sharing this write-up if you find it useful.

BTC LN:

lnbc1p3l890tpp5yfnjh5pkk00k8n383895x4c6hffq94uu8gmaz89jvzlxnrLNURL1DP68GURN8GHJ7AMPD3KX2AR0VEEKZAR0WD5XJTNRDAKJ7TNHV4KXCTTTDEHHWM30D3H82UNVWQHHYMM5W3JKU6N0W4EXUETEXCCS6PKLQF


Chris on Crypto

Journalist turned crypto-writer & analyst; forging the narrative, stacking sats.