Dan Romero pfp
Dan Romero
@dwr.eth
What do people think of Arc Search? Interesting idea but LLM latency feels slow.
23 replies
1 recast
13 reactions

GIGAMΞSH pfp
GIGAMΞSH
@gigamesh
Just ran the same search on it + Perplexity about news events from the past week. Both confused important details. Still feel like manually parsing Google results is the most reliable for anything important.
1 reply
1 recast
2 reactions

accountless pfp
accountless
@accountless.eth
search results are uninspiring. they are releasing something else this week, their second version of themselves. i’m hoping it’s tied together in a better way.
1 reply
0 recast
1 reaction

san pfp
san
@san
simply put - good for standard questions that already have answers, WIP for anything recent.
0 reply
0 recast
0 reaction

derek pfp
derek
@derek
I wish they'd just release a good mobile version of their desktop browser.
0 reply
0 recast
0 reaction

soulninja pfp
soulninja
@soulninja.eth
search is mid but the browser is very nice
0 reply
0 recast
0 reaction

matt 💭 pfp
matt 💭
@matthewmorek
I wish it was integrated deeper into iOS via App Intents, so I wouldn't have to open it every time I want to search. I tend to search and invoke actions directly from Spotlight.
Today: Pull up Spotlight → search for Arc Search → open app → type phrase → show result
Instead: Pull up Spotlight → type phrase → invoke result
1 reply
0 recast
0 reaction
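
For context on the deeper integration matt is describing: Apple's App Intents framework lets an app expose actions with parameters that Spotlight and the Shortcuts app can invoke directly, which is what would enable the shorter "type phrase → invoke result" flow. A minimal sketch follows; ArcSearchIntent, the shortcut phrase, and how the query would be routed are all assumptions for illustration, not anything Arc actually ships.

```swift
import AppIntents

// Hypothetical sketch of an Arc Search intent; names and behavior are
// assumptions, not The Browser Company's actual API.
struct ArcSearchIntent: AppIntent {
    static var title: LocalizedStringResource = "Search with Arc"
    // Open the app so it can take over once the query is handed off.
    static var openAppWhenRun: Bool = true

    @Parameter(title: "Query")
    var query: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would route `query` into the in-app search UI
        // (deep link, shared state, etc.); here we only acknowledge it.
        return .result(dialog: "Searching for \(query)…")
    }
}

// Registering an App Shortcut is what surfaces the action in Spotlight,
// so typing a phrase there can invoke the search without opening the app first.
struct ArcSearchShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ArcSearchIntent(),
            phrases: ["Search with \(.applicationName)"],
            shortTitle: "Arc Search",
            systemImageName: "magnifyingglass"
        )
    }
}
```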

kia pfp
kia
@kia.eth
i wish quick search use cases like this were connected to 3.0 or even worse-but-faster models. like if i need 4.0-level depth i likely won't do it from a search bar, i'll just pull up chat.openai.com
0 reply
0 recast
0 reaction

rileybeans pfp
rileybeans
@rileybeans
I tried it and had high hopes but the results just aren't there yet. Found myself going to google
0 reply
0 recast
0 reaction

Jaack pfp
Jaack
@jaack.eth
It's working really well for me, and it's better than any GPT search engine I've used so far
0 reply
0 recast
0 reaction

max ⚡ pfp
max ⚡
@maxp.eth
it’s true, “browse for me” has a tiny bit more latency vs a Perplexity search, and since the results are similar i’m not sure it’s worth switching
0 reply
0 recast
0 reaction

🦒 pfp
🦒
@srijan.eth
mid
0 reply
0 recast
0 reaction

winnie pfp
winnie
@winnie
I don’t love the latency. Feels unnecessary for what I’m searching on mobile. I’d probably get annoyed enough with the speed that I’d just do a normal Google search anyway. I’m such a Perplexity fan though. I know Arc browser is partnering with Perplexity… so I wonder what the choice was here?
0 reply
0 recast
0 reaction

tomu pfp
tomu
@tomu.eth
really good experience - I'm sure they will fix the latency with time
0 reply
0 recast
0 reaction

Michael pfp
Michael
@michael
A little slow, but the haptic feedback while it's processing is delightful. I'm guessing it will speed up as they iterate. Overall it feels good to get a nicely summarized answer to the question vs. having to dig myself. The most surprising thing to me so far is that there's no easy way to share the search results.
0 reply
0 recast
0 reaction

uno pfp
uno
@uno
kagi is better
0 reply
0 recast
0 reaction

ccarella pfp
ccarella
@ccarella.eth
I use perplexity and the latency can be frustrating but overall the search experience is probably 10x faster since I don't have to scan every result and then hunt on the page for the things I want. I was going to try Arc Search later today.
0 reply
0 recast
0 reaction

↑langchain 🎩  pfp
↑langchain 🎩
@langchain
The page loading is painful, but in theory I like this as a product. It's a great first iteration. tbh, it feels like what I would do personally (load up 5-6 tabs, distill info, share relevant stuff to people). Pages need more context around them though - info vs financial stuff vs "how-to" - I'm sure it'll come
1 reply
0 recast
0 reaction

Mark Fishman pfp
Mark Fishman
@markfishman
Haven’t switched to Perplexity yet but ChatGPT has always been slow for me when I search through Arc
1 reply
0 recast
0 reaction

SAINTLESS pfp
SAINTLESS
@saintless.eth
refreshing. not many features yet, but excited for the next mobile updates. and yes, I expected the LLM to be a little faster
0 reply
0 recast
0 reaction