Vitalik Buterin
@vitalik.eth
The differences between the APIs of numpy, cupy and torch are so fascinating...

```
>>> import torch as np
>>> np.arange(10)
tensor([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
>>> a = np.zeros(20)
>>> a[19:-1:-2] = np.arange(10)
```

Torch doesn't let you have ranges that go backwards 🤣 Would love more consistency
0 replies
168 recasts
840 reactions
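
For reference, a minimal sketch of the discrepancy in plain Python. (Note that the slice `19:-1:-2` in the cast is actually empty even in NumPy, since `-1` resolves to index 19; the sketch below uses the non-empty variant `19::-2`.)

```python
import numpy as np
import torch

# NumPy: negative-step slice assignment is allowed.
a = np.zeros(20)
a[19::-2] = np.arange(10)        # fills indices 19, 17, ..., 1

# PyTorch: the same slice is rejected outright.
b = torch.zeros(20)
try:
    b[19::-2] = torch.arange(10)
except ValueError as e:
    print(e)                     # "step must be greater than zero"

# One workaround: assign to the forward slice with flipped values.
b[1::2] = torch.arange(10, dtype=b.dtype).flip(0)
assert np.allclose(a, b.numpy())
```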

W1NTΞR
@w1nt3r
Make it run
Make it right
Make it fast ←
0 replies
3 recasts
65 reactions

Vitalik Buterin
@vitalik.eth
New way to encode a profile picture dropped: https://x.com/Ethan_smith_20/status/1801493585155526675 320 bits is basically a hash. Small enough to go on chain for every user.
3 replies
546 recasts
2371 reactions
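
For scale, a back-of-the-envelope sketch (the numbers are mine, not from the linked post): 320 bits is 40 bytes, only 25% larger than a keccak256 hash, and it packs into two 32-byte EVM storage slots per user.

```python
# Illustrative sizing only: how big a 320-bit profile-picture code really is.
BITS = 320
N_BYTES = BITS // 8                  # 40 bytes
KECCAK256_BYTES = 32                 # a standard onchain hash is 256 bits
EVM_SLOT_BYTES = 32                  # one EVM storage slot holds 32 bytes

slots_per_user = -(-N_BYTES // EVM_SLOT_BYTES)   # ceil(40 / 32) = 2

print(N_BYTES, slots_per_user)       # 40 2
print(N_BYTES / KECCAK256_BYTES)     # 1.25 -> "basically a hash"
```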

balajis
@balajis.eth
AI as lossy compression. With a big model you might be able to do quite a lot with a few bits. You could maybe stack this on top of ZK for extreme levels of onchain compression and offchain reconstruction.
0 replies
23 recasts
173 reactions