
Ankur Goyal

@ankrgyl

51 Following
66 Followers


Ankur Goyal
@ankrgyl
What do folks here think of mastodon?
3 replies
0 recasts
0 reactions

Ankur Goyal
@ankrgyl
Prompts and Programs: how do compilers designed with LLMs in mind change the future of programming? https://basecase.vc/blog/prompts-programs
Covers 5 use cases:
- No Code 3.0
- LLM libraries
- Mimicking compilers
- Optimizing code
- LLM databases
w/ links to research and repos. Appreciate feedback!
1 reply
0 recasts
0 reactions

Ankur Goyal
@ankrgyl
Is anyone thinking about the intersection of blockchain + OSS monetization? I'm curious about the latest & greatest thinking.
0 replies
0 recasts
0 reactions

Ankur Goyal
@ankrgyl
I'm thinking about writing a new programming language: an ANSI SQL-compatible language that natively interoperates with real code (Python, TypeScript, Rust, etc.) and lets you manage your data model, views, etc. as first-class programming constructs.
3 replies
0 recasts
0 reactions
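A rough sense of what that interop might feel like, approximated today with Python's stdlib sqlite3 — purely an illustrative sketch, not the proposed language. The hypothetical language would make the schema and view below first-class constructs rather than SQL strings:

```python
# Illustrative only: approximating "SQL that interoperates with real code"
# using Python's stdlib sqlite3.
import sqlite3
from dataclasses import dataclass

@dataclass
class ActiveUser:          # the "real code" side of the interop
    name: str
    logins: int

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (name TEXT, logins INTEGER);
    INSERT INTO users VALUES ('ada', 12), ('bob', 0);
    -- a view managed alongside the code, as first-class as it gets today
    CREATE VIEW active_users AS
        SELECT name, logins FROM users WHERE logins > 0;
""")

# Query results flow straight into typed Python objects.
rows = [ActiveUser(*r) for r in conn.execute("SELECT * FROM active_users")]
print(rows)  # [ActiveUser(name='ada', logins=12)]
```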

Ankur Goyal
@ankrgyl
Thoughts on GPT-3/LLMs as a "better database," from someone who has worked on relational databases for over a decade and AI for half. tl;dr I think they have the potential to be (1/n)
1 reply
0 recasts
0 reactions

Ankur Goyal
@ankrgyl
I can't find any libraries that let you run SQL queries on native data structures (e.g. vectors). Am I missing something?
1 reply
0 recasts
0 reactions
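For what it's worth, one stdlib-only way to get close: stage a plain Python list in an in-memory SQLite table and query it. This is just a minimal sketch of the idea, not an endorsement of any particular library:

```python
# Minimal sketch: running SQL over a native Python list ("vector")
# by staging it in an in-memory SQLite table via the stdlib.
import sqlite3

prices = [9.99, 24.50, 3.10, 24.50]  # the native data structure

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (value REAL)")
conn.executemany("INSERT INTO prices VALUES (?)", [(p,) for p in prices])

total, distinct = conn.execute(
    "SELECT SUM(value), COUNT(DISTINCT value) FROM prices"
).fetchone()
print(total, distinct)
```

The copy into a table is the awkward part, which is presumably what a purpose-built library would eliminate.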

Ankur Goyal
@ankrgyl
Most exciting thing I've seen in a while! Instruction finetuning has the potential to turbocharge smaller, more practical models (namely T5) for lots of use cases that required big, proprietary GPT-3. https://arxiv.org/abs/2210.11416
0 replies
0 recasts
0 reactions

Ankur Goyal
@ankrgyl
The "AI is not good at math" argument seems pointless. Who cares? Humans suck at math too (w/out crazy Chain-of-Thought hoops in our brains). LLMs are great at generating code, which is what actually matters.
0 replies
0 recasts
0 reactions
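A toy illustration of the point: rather than asking a model to do the arithmetic, have it emit code and execute that. The generated string below stands in for hypothetical LLM output; only the execution pattern is the point:

```python
# Toy sketch of "let the LLM write code instead of doing the math".
# The string stands in for model output; a real system would get it
# from an API call and would sandbox the execution.
llm_generated = "result = sum(i * i for i in range(1, 101))"

scope = {}
exec(llm_generated, scope)   # run the generated program, not the model's mental math
print(scope["result"])       # 338350
```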

Ankur Goyal
@ankrgyl
"Compression" in the broad sense (reducing # parameters, smaller prompts, etc.) is one of the most important areas in LLMs right now. Smaller models => predictability, lower cost, openness, runtime performance, and developer productivity.
1 reply
0 recasts
0 reactions

Ankur Goyal
@ankrgyl
Anyone gone deep with Tapas/Tapex/Tableformer? They look very neat but I can't see how they'd scale beyond a table with <= 512 cells.
0 replies
0 recasts
0 reactions

Ankur Goyal
@ankrgyl
Is anyone thinking about the interaction of LLMs and compilers?
1 reply
0 recasts
0 reactions