Sept 2023, https://blog.sbensu.com/
For more like this one, see Friction logs
Summary
- Trustworthiness
- “Info quality may vary” is in every response, which helps me be cautious about facts and figures.
- Having external references for every sentence is very reassuring and their UI treatment didn’t get in the way.
- AFAICT, the system never hallucinated, which leaves me trusting it much more than the other systems.
- They use a hybrid chat interface (direct answers with sources in one section, alternating with regular Google Search results).
- I never clicked on the Google Search results. It is much easier to pick which source to drill down on than to sift through the search results.
- The chat interface implies that I can “talk to” this system but I can’t really. It always responds with the same information and doesn’t go deeper.
- Despite the chat interface, it still feels like Google Search.
- The UI had all the elements of a chat session, but it didn’t communicate whether the session would be saved anywhere. I am low-key anxious about closing it.
- Their initial suggestions for how to use the app were more evocative than those of similar products.
Stream of thought
Setup
I am coming in as somebody who:
- is interested in learning more about a subject but doesn’t have a perfectly crisp agenda (“how would a dollarization in Argentina play out?”)
- knows introductory material about the subject but is not an expert
- is not looking to produce any specific artifacts (documents, presentations, summaries, etc.)
Start from Google Search with a question as usual:
[Screenshot: Google Search results page offering an “AI-powered overview”]
“AI-powered overview”? This doesn’t evoke the right feeling. “Get a quick answer generated by an AI?”
[Screenshot: the generated overview with the “Info quality may vary” notice]
“Info quality may vary” is an interesting way of caveating hallucinations. It doesn’t prepare me for “this might make things up”. But maybe this system doesn’t?