rish
@rish
(niche feature, bit more iykyk) For those streaming raw events straight from a hub gRPC, you know the events need to be massaged and joined with other data before they can be used in your app. To solve for that, we now have a real time Kafka stream where developers can consume fully hydrated events and use them directly in their applications.

Screenshot below shows raw hub event on the left and Kafka stream event on the right, you'll notice even the embed metadata is hydrated in Kafka. The Kafka pipeline stores the events for longer so if your app missed something, you can pick back up from where you left off. Overall, should make development a lot easier compared to wrangling raw hub streams.

Reach out if interested. There is a node example you can plug and play to get started: https://github.com/neynarxyz/farcaster-examples/tree/main/neynar-webhook-kafka-consumer

h/t @flashprofits.eth @shreyas-chorge for the work 💪 https://docs.neynar.com/docs/from-kafka-stream
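The hydrated-event flow described above can be sketched in Node like this. This is a minimal sketch, not Neynar's actual schema: the field names (`author`, `text`, `embeds`) and the `kafkajs` wiring shown in comments are assumptions for illustration; see the linked docs and the example repo for the real setup.

```javascript
// Handle one hydrated event from the Kafka stream.
// NOTE: field names here are illustrative assumptions, not Neynar's
// documented schema -- check https://docs.neynar.com/docs/from-kafka-stream.
function handleHydratedCast(rawValue) {
  const event = JSON.parse(rawValue);
  // With raw hub events you would have to join this data yourself;
  // in the hydrated stream the author profile and embed metadata
  // are assumed to arrive already attached to the event.
  return {
    author: event.author?.username ?? null,
    text: event.text ?? "",
    embedUrls: (event.embeds ?? []).map((e) => e.url),
  };
}

// In a real consumer (e.g. with the `kafkajs` package) you would wire
// this up roughly as:
//   await consumer.run({
//     eachMessage: async ({ message }) =>
//       console.log(handleHydratedCast(message.value.toString())),
//   });

// Local demo with a made-up payload:
const demo = handleHydratedCast(JSON.stringify({
  author: { username: "rish" },
  text: "hello",
  embeds: [{ url: "https://example.com", metadata: { title: "Example" } }],
}));
console.log(demo.author, demo.embedUrls.length);
```

Because the pipeline retains events longer than a raw hub stream, a consumer like this can resume from its last committed offset after downtime instead of missing events.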
6 replies
14 recasts
81 reactions
Matthew Fox
@matthewfox
awh god damnit guess I have to stop making kafka jokes and actually learn what it is
0 reply
0 recast
4 reactions
Andrei O.
@andrei0x309
Looks interesting for apps that process events. It would take some effort to convert from the old data format that comes from processing hub events, but after that, performance should be better in theory.
0 reply
0 recast
1 reaction
downshift
@downshift.eth
awesome work
0 reply
0 recast
1 reaction
Jai
@jaivsdawrld
Basically just helps in streamlining my app development. I'll reach out!
0 reply
0 recast
2 reactions
Ryan J. Shaw
@rjs
Wow! This might be what I've been dreaming about building, need to dig in! 🫡
0 reply
0 recast
1 reaction
law
@traguy.eth
Okay, I read this over and over and have just one question: how does the Kafka stream event differ from the raw hub event, and what benefits does the Kafka stream provide for developers?
0 reply
0 recast
0 reaction