OK, I admit it: I’ve been using ChatGPT to reply to people on Twitter, in emails, and on Substacks as a way to warm up potential leads. It significantly decreases my cognitive load and gives pretty decent responses. I didn’t realize it was that big of a deal until today. Someone took it even further and made a Twitter bot.
Using ChatGPT to reply to people can be a useful tool for generating responses to queries or prompts in a chat setting, but it is important to keep in mind that ChatGPT is a machine learning model and not a human. As such, it may not always produce responses that are appropriate or accurate, and it is not capable of understanding the nuances and subtleties of human communication.
In a community setting, it is generally best to rely on actual human communication and interaction, rather than relying on a machine learning model to generate responses. While ChatGPT may be able to generate responses that are somewhat similar to what a human might say, it is not capable of understanding the context or purpose of the conversation, and it may not always produce responses that are appropriate or respectful.
In summary, while ChatGPT can be a useful tool for generating responses to prompts, it is generally not a good idea to rely on it as a primary means of communication in a community setting.
I thought I’d get the above ChatGPT response out of the way first.
I’m not against it, but as with anything, using it robotically doesn’t feel great. Used as an extra source of help, as a guide, or as a quick way to survey options, it can be valuable.
It helped me with two short paragraphs in an article I wrote recently, when I was stuck.
I’ve also just gotten access to Notion AI and have been using Lex in small doses. I haven’t been impressed with other AI writing tools; most have been a waste of time.
AI and writing would be a good topic for the next Forum Collective?
What’s also fun is that relying too much on AI makes you stop thinking for yourself. It reminded me of the scene in WALL-E where people no longer knew how to walk.
I’m not sure where AI will land. Right now there are plenty of free tools, but these cost millions of dollars to run, so at some point the free lunch will be over and you’ll need to pay, which will likely price most people out of frivolous use.
I don’t think its use for replying to community content should be encouraged, as it breaks the art of relationship building. If enough people do it, your community may end up being nothing more than AIs talking to each other.
I suspect tools to combat AI submissions will be built.
AI writing tools are good for getting past writer’s block, drafting an opening paragraph, or even suggesting potential layouts. But given that AI is trained on search-engine content, and the most readily available content has been heavily optimised, it tends to give fairly bland, high-level, generic replies without any real insight or depth.
It explains what flowers are, but doesn’t evoke the scent and beauty to the reader.
Actually, because of its immense semantic capacity and its ability to search for similarities and approximations in a massive knowledge base, I am inclined to think that it can actually help you to think: to look for categories, to disaggregate and organize ideas.
He found that it was great for replying to questions you could Google, but it didn’t work as well on questions looking for people’s experiences, thoughts, etc.
I think this is key here. For me, a community is much more than a knowledge share (although that is a key component). It’s about developing an environment of psychological safety that allows experiences to be shared, understood, and learned from. Tools like ChatGPT come into the mix when looking for inspiration, and, as in some of the examples above, to help with the time-suck tasks required to run, manage, and understand our communities.
I do worry, however, that AI content could have a negative impact on community engagement. For a lot of people, community is about that very thing: real people, real experiences, and a real voice.