this post was submitted on 07 Nov 2023
142 points (81.7% liked)
Technology
I do a lot of writing of various kinds, and I could not disagree more strongly. Writing is a part of thinking. Thoughts are fuzzy, interconnected, nebulous things, impossible to communicate in their entirety. When you write, the real labor is converting that murky thought-stuff into something precise. It's not uncommon in writing to have an idea all at once that takes many hours and thousands of words to communicate. How is an LLM supposed to help you with that? The LLM doesn't know what's in your head; using it is diluting your thought with statistically generated bullshit. If what you're trying to communicate can withstand being diluted like that without losing value, then whatever it is probably isn't meaningfully worth reading. If you use LLMs to help you write stuff, you are wasting everyone else's time.
Amen. In fact, I wrote a whole thing about exactly this -- without an LLM! Like most things I write, it took me many hours and evolved many times, but I take pleasure in communicating something to the reader, in the same way that I take pleasure in learning interesting things reading other people's writing.
I have it read and review a couple of paragraphs of a research article, many times over, to create a distribution of what was likely said in those paragraphs, in a tabular format. I'll also work with it to create an outline of an idea I'm working on, to keep me focused and to help develop my research plan. I'll then ask it to drill down into each sub-point and give me granular points to focus on. Obviously I'm steering, but it's not too difficult to use it in such a way that it creates a scaffolding for you to work from.
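The repeated-review idea above can be sketched in a few lines. This is only an illustration, not the commenter's actual setup: `ask_llm` is a hypothetical stand-in for whatever chat-API call is used, and the prompt wording is invented. The point is just that sampling the same question many times and tallying the answers approximates a distribution over readings of the paragraph:

```python
from collections import Counter

def review_distribution(paragraph, ask_llm, n=20):
    """Ask the model the same question about one paragraph n times
    and tally the answers, approximating a distribution over what
    the paragraph likely says. ask_llm(prompt) -> str is supplied
    by the caller (a real chat-API call, or a stub for testing)."""
    prompt = (
        "In one sentence, what claim does this paragraph make?\n\n"
        + paragraph
    )
    return Counter(ask_llm(prompt) for _ in range(n))

# Example with a deterministic stand-in for a real model call:
fake = lambda prompt: "The authors report a 12% speedup."
dist = review_distribution("(paragraph text here)", fake, n=5)
# dist.most_common(1) -> [('The authors report a 12% speedup.', 5)]
```

With a real model behind `ask_llm`, the counts would spread across several paraphrases, and the most common readings are what would go into the tabular summary.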
If you aren't using LLMs to help you write stuff, you are wasting your own time.
I don't think that sounds like a good way to make a good paper that effectively communicates something complex, for the reasons in my previous comment.