the day of the @@@morphic words

“I clarify: that was truth, not humor. The GPT setup is precisely isomorphic to training a (huge) neural net to compress online text to ever-smaller strings, then using the trained net to decompress random bytes.”

isomorphic: a mapping between two structures that preserves the structure and can be reversed.

” as organisms become more and more complex through evolution, they need to model reality with increasing accuracy to stay fit. At all times, their representation of reality must be homomorphic with reality itself. Or in other words, the true structure of our world must be preserved when converted into your brain’s representation of it.”

from here

A homomorphism in algebra is a structure-preserving map between two algebraic structures of the same type (not necessarily reversible, unlike an isomorphism).
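To make the distinction concrete, here is a minimal sketch of my own (an illustration, not from the quoted sources), using the logarithm, a classic homomorphism from the positive reals under multiplication to the reals under addition:

```python
import math

# log is a structure-preserving map (homomorphism) from
# (positive reals, *) to (reals, +):  log(a * b) == log(a) + log(b)
a, b = 3.0, 7.0
assert math.isclose(math.log(a * b), math.log(a) + math.log(b))

# It is also reversible (exp undoes it), which upgrades it
# to an isomorphism:  exp(log(x)) == x
assert math.isclose(math.exp(math.log(a)), a)
```

The structure (multiplication) survives the trip into the other representation (addition), which is exactly the "brain as homomorphic model of reality" point in the quote above.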

Brain links, 16 Sep 2020

Joscha Bach on GPT-3: it moves in a semantic space and masters relations between words, but can only deepfake understanding

Joscha built the MicroPsi architecture based on PSI Theory

Curious Wavefunction traces the history of information and thermodynamics

and ends with the idea that our brain is a mixture of digital and analog processes, as posited by von Neumann

Sav Sidorov another Joscha Bach video with highlights

“Some people think that a simulation can’t be conscious, and only a physical system can, but they got it completely backwards. A physical system cannot be conscious, only a simulation could be conscious.”

The connection between information and entropy is born: “When Shannon showed his result to the famous mathematician John von Neumann, von Neumann with his well-known lightning-fast ability to connect disparate ideas, immediately saw what it was: “You should call your function ‘entropy’”, he said, “firstly because that is what it looks like in thermodynamics, and secondly because nobody really knows what entropy is, so in a debate you will always have the upper hand.” Thus was born the connection between information and entropy.”

AI = general methods + the power of computers,

like Communism was electrification + the power of the Soviets.

The Bitter Lesson by Rich Sutton


from Gwern May newsletter on scaling and metalearning:

“The scaling hypothesis regards the blessings of scale as the secret of AGI: intelligence is ‘just’ simple neural units & learning algorithms applied to diverse experiences at a (currently) unreachable scale.”

This is related somehow, distributed intelligence and fungi from The Curious Wavefunction, Life Distributed

Scaling hypothesis in AI

start from Gwern here

parameter scaling in GPT-3 runs into neither merely linear performance gains nor diminishing returns; rather, it shows metalearning enhancing the performance

It was forecast by Moravec, and since we are in a fat-tail phenomenon this holds true: “the scaling hypothesis is so unpopular an idea, and difficult to prove in advance rather than as a fait accompli“. Before GPT-3, another epiphany on scaling was the Google cat moment, which started the deep learning craze

Another idea I like is that models like GPT-3 are definitely cheap: if they show superlinear gains, it is a no-brainer to go for bigger and more complex models. It is a long way before matching the billions in expenses for CERN or nuclear fusion.

Craig Venter’s synthetic bacteria project cost us $40 million; groundbreaking projects costing so little should not be foregone

BTW, to grasp how there could be a scaling benefit in growing deep learning model sizes, look no further than a simple, unfounded but suggestive analogy with Metcalfe’s law of networks: a network’s value grows with the square of its nodes.
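A toy sketch of that analogy (purely illustrative, and just as unfounded as the analogy itself; nothing here is a claim about actual neural network scaling):

```python
# Metcalfe's law: a network's value scales with the number of possible
# pairwise links among n nodes, n * (n - 1) / 2, i.e. roughly with n^2.
def metcalfe_value(n: int) -> int:
    return n * (n - 1) // 2

# Doubling the nodes roughly quadruples the value:
print(metcalfe_value(100))   # 4950 links
print(metcalfe_value(200))   # 19900 links
```

Linear growth in nodes, superlinear growth in connections: that is the shape of the intuition, nothing more.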

Dick joke

“A man is at the doctor’s office, and the doctor tells him, “I’ve got some good news and some bad news for you.” The man says, “Well, I can’t take the bad news right now, so give me the good news first.”

The doctor says, “Well, the good news is that you have an 18-inch penis.”

The man looks stunned for a moment, and then asks, “What’s the bad news?”

The doctor says, “Your brain’s in your dick.”

A dick joke? Rather, a doctor & dick joke, on LinkedIn. Why?

If it made you laugh, consider that it was generated by GPT-3, the text generation model developed by OpenAI, successor to the much-publicized GPT-2, over which it shows a 117x increase in size: 175 billion parameters
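The arithmetic behind that ratio, using the commonly reported parameter counts (1.5 billion for GPT-2, 175 billion for GPT-3):

```python
# Commonly reported model sizes
gpt2_params = 1.5e9    # GPT-2: 1.5 billion parameters
gpt3_params = 175e9    # GPT-3: 175 billion parameters

# The size ratio behind the "117x" figure
ratio = gpt3_params / gpt2_params
print(round(ratio))   # 117
```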

the growth curve of model parameter counts still follows a power curve, on an exponential growth path

with increasing size, metalearning and stability also improve; scale is still a winning strategy for the performance of neural network models

GPT-3 manages to create surprising texts from a prompt, and even to invent jokes. Let’s say, though, that in the model there is no correspondence between text and reality: GPT-3 is an idiot savant, but a very wise one. You can read this from Gwern

Put to the test with math and chess too, it manages a decent performance; you could read about this from Scott Alexander of Slate Star Codex, a neo-rationalist blog and community, the best blog in town. Except that the New York Times threatened to publish the blogger’s real name, exposing him to professional damage and danger, so Scott took the site offline

what a world: AIs make dick jokes and newspapers do doxxing

Deep learning and organic chemistry

there are things so specific that you never get to know them, and if you do, you cannot retrieve them unless you have noted down their name or some keyword

“neural networks applied to computed molecular fingerprints or expert-crafted descriptors and graph convolutional neural networks that construct a learned molecular representation by operating on the graph structure of the molecule.”

from here

more on AI and drug discovery, though

the idea that AI reinforces totalitarian weaknesses

Fear not China: Seeing Like a Finite State Machine

“The theory behind this is one of strength reinforcing strength – the strengths of ubiquitous data gathering and analysis reinforcing the strengths of authoritarian repression to create an unstoppable juggernaut of nearly perfectly efficient oppression. Yet there is another story to be told – of weakness reinforcing weakness. Authoritarian states were always particularly prone to the deficiencies identified in James Scott’s Seeing Like a State – the desire to make citizens and their doings legible to the state, by standardizing and categorizing them, and reorganizing collective life in simplified ways, for example by remaking cities so that they were not organic structures that emerged from the doings of their citizens, but instead grand chessboards with ordered squares and boulevards, reducing all complexities to a square of planed wood”

The latter link is where Italo Calvino can be used as a perfect metaphor for epistemic confrontation in the field of politics

AI took over in the ’80s

If you are worried now that AI will take over the world and take away our jobs, think again: AI took over the world in the ’80s

for example, one sign of the AIs’ rule is that they push humans into lower-productivity jobs, exactly what happened in the ’80s when the first robot arms appeared in factories

Maybe you did not notice: Trump’s successful 2016 campaign was bankrolled by the Mercer family, billionaires since they built a robot-trading fund in the ’80s, which has netted $100 billion in profits since 1990.

Maybe you should also notice that the latest Tesla Cybertruck’s futuristic shape comes straight from a 1978 Giorgetto Giugiaro prototype; don’t be surprised if one day Musk is outed as an AI built to reenact Vincenzo Lancia’s genius


A single drop of blood from a finger prick: it’s not Theranos but Octopi, a cool medical innovation for the 3 billion people at risk of malaria: “a low-cost ($250-$500) automated imaging platform that can quantify malaria parasitemia by scanning 1.5 million red blood cells per minute.”

Malaria diagnosis today takes 30 minutes to 1 hour of a technician’s work on a manual microscope, limiting the capacity of diagnostic centers in poor countries. Octopi combines microscopy, spectroscopy and flow cytometry to deliver a result in a few minutes.

Octopi works off a phone charger. It analyzes slides 120 times faster than traditional microscopy. Weighing less than seven pounds, it’s portable, and it has a do-it-yourself cost of $250 to $500. Its modular architecture means that detecting other parasites takes only swapping the camera/imager. It’s open technology, both hardware and software

“We further implement a machine learning classifier and obtain anticipated performance of higher than 90% specificity and sensitivity for parasitemia of 50 parasites per µl and 100% sensitivity and specificity for parasitemia of 150 parasites per µl. Our results suggest that low-cost automated multimodal microscopy combined with machine learning tools have the potential to address the unmet needs for diagnosis of malaria and many other diseases.”
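For reference, sensitivity and specificity are simple ratios over a confusion matrix; a minimal sketch (the counts below are made up for illustration, they are not Octopi’s data):

```python
def sensitivity(tp: int, fn: int) -> float:
    # true positive rate: infected samples correctly flagged
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    # true negative rate: clean samples correctly cleared
    return tn / (tn + fp)

# Hypothetical counts: 95 of 100 infected slides caught,
# 92 of 100 clean slides correctly cleared.
print(sensitivity(tp=95, fn=5))    # 0.95
print(specificity(tn=92, fp=8))    # 0.92
```

The paper’s “>90% specificity and sensitivity at 50 parasites/µl” means both ratios stay above 0.9 even at low parasite densities.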