# The future of romantic relationships (with AI)



## Hexigoon (Mar 12, 2018)

By 2030 and onward, as computer bots become more life-like and less stereotypically robotic, human relationships with AI will be quite common indeed. But even today there are examples of humans developing feelings for chatbots that aren't quite at that level yet.





I just know it: people will be dating their fictional waifus/husbandos.


Anyway, feel free to discuss. Would you date an AI? (I can see myself doing it, quite possibly.)


----------



## Handsome Dyke (Oct 4, 2012)

it is and will continue to be mostly a guy thing


----------



## recycled_lube_oil (Sep 30, 2021)

I imagine it depends on how realistic the AI is and how desperate a person is.

I am imagining the marketing will be aimed at incels and loner types. In all honesty, I can see it kicking off quite easily. Just run an analysis on Reddit alone: Dating Over 30, Purple Pill Debate, and incel groups. Look at the trends and patterns of those people, cross-check those trends with Facebook data, and apply a brute-force directed marketing campaign.

Cha-ching, the dollars roll in. Add monthly updates, I dunno, better blowjobs, more realistic screaming during anal. Maybe add the more exotic fetishes as purchasable add-ons. And boom, I reckon you'd have a highly profitable base of customers.


Damn, if I had the know-how, I would do it myself. I could be the Mark Fuckerburg of AI companions.


----------



## DOGSOUP (Jan 29, 2016)

Considering my husbando of choice has a shittier personality than most guys I know for realz, probably not


----------



## Angry-Spaghetti (Feb 25, 2021)

I wouldn't date an AI but it sounds cool having like a virtual friend, I need an ENFJ in my life. 😂


----------



## WickerDeer (Aug 1, 2012)

I'd be more interested in an AI therapist.

I doubt the sex would be very good with a robot tbh. And sure, therapy wouldn't be great either--but I could use someone to help me remember things and to talk me through CBT and stuff, or even just prompt me to talk about things.

I was using an app that was ambitious in trying to be therapeutic like that--it would ask things like "what is something you are grateful for today?" and then "think of when that happened; how did it make you feel?"

Or like, "what's one new thing you want to try today?" or "What are your goals today? Pick one goal to start first. When are you going to start?"

And prompts like that would be kind of nice when I'm trying to refocus on thinking more positively or deal with stress. Plus it would be nice to have a database of guided meditations or visualizations for panic attacks.

I mean...and if he was like a heating pad or something, I wouldn't mind that but I certainly don't feel very turned on by the thought of a robot.

I could use something like a weighted blanket embrace or like if he could do some kind of massage, or um...provide a boob pillow or a body pillow.

Gosh, our species is doomed.  Ok I'm sold. But not for sex.

I would also like an AI dog to take on hikes or to nature with me since I can't have a real dog--the rental market here makes it harder to have actual dogs, so I would totally be into an AI robotic dog to scare bad guys.


----------



## Kazuma Ikezawa (Oct 21, 2011)

What is so special about the future of romantic relationships with this Al fellow? Is he some kind of hunk or something?


----------



## Scoobyscoob (Sep 4, 2016)

Kazuma Ikezawa said:


> What is so special about the future of romantic relationships with this Al fellow? Is he some kind of hunk or something?


Ideal waifu or husbando who can be whoever you want her or him to be. I think it's just weird, but a lot of people think it's the future of relationships, especially for people with no hope of or interest in being in a relationship with a real person. 🤷‍♂️

I don't really get it, but I don't really understand inceldom either. So eh. 🤷‍♂️


----------



## SilentScream (Mar 31, 2011)

TFW I see incels swooning over the concept of an AI sex slave, when AI by definition is intelligent and would probably also want to self-preserve and avoid incels like the plague.

We're going to go through the same sequence of events with AI as we have done with every being we've considered subhuman. And the concept of "buying love" from AI slaves is already gross and extremely unethical to me. 100% opposed.


----------



## Hexigoon (Mar 12, 2018)

SilentScream said:


> TFW I see incels swooning over the concept of an AI slave when AI by definition is intelligent and would probably also want to self preserve and avoid incels like the plague.
> 
> We're going to go through the same sequence of events with AI as we have done with every being we've considered subhuman. And the concept of "buying love" from AI slaves is already gross and extremely unethical to me. 100% opposed.


It's artificial. It'll only avoid incels if it's somehow programmed to detect and "fear" them.


----------



## SilentScream (Mar 31, 2011)

Hexigoon said:


> It's artificial. It'll only avoid incels if it's somehow programmed to detect and "fear" them.


I don't give a fuck regardless. It's a gross sex slave fantasy with a being that by design is incapable of consent in order to fulfill said fantasy. So fuck anyone that has such a fantasy.


----------



## Scoobyscoob (Sep 4, 2016)

It's just an AI, though. I don't believe in giving machines rights, even artificially intelligent ones. A cloned person should have human rights, but AI, being a machine, is obviously not extended human rights, and I doubt it ever will be. I'd also be against creating humanoid-looking robots with AI too.


----------



## Squirt (Jun 2, 2017)

SilentScream said:


> I don't give a fuck regardless. It's a gross sex slave fantasy with a being that by design is incapable of consent in order to fulfill said fantasy. So fuck anyone that has such a fantasy.


I met an old lady today who bought a small puppy (only a few weeks old) and carries it around everywhere like an accessory. It’ll never grow bigger than her arm. She paid over $1,000 for it and calls it a “service animal.”

I mean, that behavior weirds me out even more. Can we give such people AI robot animals so real animals can live their lives without being treated like expensive toys?


----------



## BenevolentBitterBleeding (Mar 16, 2015)

I can't wait. Specifically looking for a high-wage job just so I can afford the first few editions. _Squeee_

At this point it seems pretty normal, as well as inevitable. Like, if you think about the psychology behind online social personas/interactions, people are already conditioned to look for reinforcement of their thoughts/beliefs/feelings/whatever from those they interact with _or where_ they interact - whether that be online or irl. Sure, you can argue that we are at least seeking these 'comforts' from living, breathing creatures, but does it matter much when we're just receiving a kiss-ass/'fronted' version of a person? And sure, people might say they look for confrontation or different views in order to educate themselves and grow bla bla bla, but are they _really_ doing that 24/7, or is that more of a pastime to get whatever chemical fix they need? Or worse. Because I'm inclined to believe that the majority are looking to relax most of the time, with like-minded company where they can be 'themselves'.

I think people might say that despite that sort of cynical view, 'real' is still better than robot, but I don't know if I truly agree; because _real_ robots (not those fake front humans) are far more likely to a) be 'loyal' and b) not 'break down' in however many ways you can think of (physically, emotionally, something else...)

Like, if you were to meet someone on the street who swept you off your feet and you completely fell 'in love' with, but then later learned that it was AI, would that like, completely shatter your mind lol? What difference does it make if the only things that seem to matter are how we 'feel'? Is it because you'd be laughed at by others? Be laughed at by your 'real' 'friends'? Isn't your 'happiness' supposed to matter more to them _and_ yourself?

Also, as a side note: I think it's interesting reading people's thoughts about it and how they seem to automatically equate the desire for/purchase of such things with 'incels' and/or 'desperate' people, as if there's no market for people who simply: have expendable income; seek novelty; are interested in uses outside of a sexual nature, etc...

But what's _even more_ interesting is a look into the underlying 'philosophy' and how it seems to take up people's mental real estate. It's weird to me that as a society, regardless of left/right leaning, people will shit on what seems like everything nowadays when it comes to policing others' opinions about any x topic they disagree with. 'Feelings', remember?

I think maybe I'm thinking of cancel culture type of 'politics' and that may be more of a leftist thing idk, but it seems kind of ironic that say, for example if you were to harp on murderers, rapists, violent offenders and criminals in general, you'd probably get an earful from some crowd on how they're 'human' and deserve a second/third/fourth chance to reform. Yet, why does it seem like no outside groups speak out for 'incels' and the like? Aren't they 'human' too? Or is it because all incels are obviously psychotic shooting spree type killers in the making?

It's interesting, because you'd think people like that need more understanding of some kind, as opposed to being branded lepers or something. I mean, I guess maybe that's just a fault with the way 'humans' are, and how they 'think'. But it's kind of ironic that the groups that seem to seek, praise, and 'know' empathy seem not to actually have the propensity for it, unless it suits an agenda/persona.

Though, I guess that kind of selective choosing is also just another failure of humans in general. But the twist, if any, is that you probably wouldn't have to worry about any of it. With a robot...


----------



## blossomier (Jul 24, 2020)

I don't like it.


----------



## Squirt (Jun 2, 2017)

BenevolentBitterBleeding said:


> I can't wait. Specifically looking for a high wage job just so I can afford the first few editions. _Squeee_
> 
> At this point it seems pretty normal, as well as inevitable. Like, if you think about the psychology behind online social personas/interactions, people already are conditioned to or are looking for reinforcement of their thoughts/beliefs/feelings/whatever from those they interact with _or where_ they interact - whether that be online or irl. Sure you can argue that we are at least seeking these 'comforts' from living, breathing creatures, but does it matter much when we're just receiving a kiss ass/'fronted' versions of a person? And sure, people might say they look for confrontation or different views in order to educate themselves and grow bla bla bla, but are they _really_ doing that 24/7, or is that more of a pastime to get whatever chemical fix they need? Or worse. Because I'm inclined to believe that majority are looking to relax more of the time, with like minded company where they can be 'themselves'.
> 
> ...


Or you could just talk to a plant.


----------



## Hexigoon (Mar 12, 2018)

SilentScream said:


> I don't give a fuck regardless. It's a gross sex slave fantasy with a being that by design is incapable of consent. So fuck anyone that has such a fantasy.


If you want to look at it that way, I guess. But I don't really see how it's any different from using a vibrator or some other sex device. It's technology; why does consent matter for a tool? Do you ask your home appliances for consent to do what you want them to do?


----------



## BenevolentBitterBleeding (Mar 16, 2015)

Squirt said:


> Or you could just talk to a plant.


I don't get it. Plants/nature can be soothing but they don't really communicate in ways we're 'conscious' of without scientific equipment.


----------



## SilentScream (Mar 31, 2011)

Hexigoon said:


> If you want to look at it that way I guess. But I don't really see how it's any different from using a vibrator or some other sex device. It's technology, why does consent matter for a tool? Do you ask for your home appliance's consent to do what you want them to do?


If a being is classified the same as a toaster or appliance, then by definition you cannot have a _relationship_ with it. If you want something that mimics human connection and human-esque responses and fulfillment of needs, then it's no longer just an object. And if it's still just an object, then it can't fulfill the need for the same connection as a human. Therefore it's a non-relationship.

But that's not what you all want. You want something that mimics humanity without being human enough to have rights. A vibrator, a toaster, a fridge is not remotely human in its expression or existence. But AI would be without rights. So it's essentially an object that looks, walks, talks like a human, but isn't human simply so that it can be denied rights.

Anyways, I'm done with this thread. There is no point in arguing with people whose ancestors once enslaved other humans, called them subhuman, raped and slaughtered them while denying them rights, and who are once again looking forward to doing the same to another group of "beings" they feel they have ownership over, because at the moment they have the ability to define them as objects.

The issue I have is with the "subhuman" argument. If something acts, walks, and talks like a human, then by definition it's no longer an object and cannot be defined as one. There would not be much left that differentiates us from them other than an arbitrary set of definitions created by the owner class.

It would boil down to spiritual, "we have a soul", "we created them" kind of shit tier arguments that have been made historically for everything else that has ever been denied rights. Like I said, it's just gonna be the same thing all over again for years and decades before people smarten the fuck up once again.

FWIW, to whoever had that "gotcha" moment about animals: animals have the same agency and rights. There is literally a growing charter of animal rights globally. It's nothing new. Once these "service" animals live out their lifespans, they should not be replaced with a never-ending class of animal bred only for this purpose. Same with meat and other animal products.

Humans are inherently supremacist. Period. This supremacy comes out to play in different ways towards different objects and beings. Nothing new to see here. Thankfully there are also enough humans that are interested in ethics and morality to prevent the worst from happening as well.


----------



## ButIHaveNoFear (Sep 6, 2017)

Thanks for reminding me that I need to finish writing my novel about the subject. 

I wouldn't mind having a robot housemate. They could be a servant and do chores and cook dinner for me. I would rather they not look human, and I wouldn't want them for conversation. I talk to inanimate objects enough. I would want a human for a relationship and sex, so that wouldn't be a reason I would have the robot. The robot could be a third, I suppose. A sex robot that didn't look human could have so many enhancements on it that would make it better than a human lover. Bonus if someone has a robot/sex machine kink.


----------



## Scoobyscoob (Sep 4, 2016)

thedazzlingdexter said:


> Why though? Are you okay with random rapes and murders? You do realize most problem people either hide it or don't know they're problems? You give no real reasoning aside from thinking it's odd. They said queers were odd; it doesn't mean they were all bad. So what exactly is your argument against realistic AI existing?


I explained myself well. I'd say my position is more about addressing the criminal before they commit a crime rather than a position against realistic AI robots existing. When it comes to technology, I take the view that ethics should play a role in whether a certain technology should be pursued or not, and, if unethical, how it should be limited. Creating robots to be maimed, murdered, raped, etc. is a band-aid solution, IMO. Addressing the person who may have an urge to commit crimes keeps that person from becoming a criminal in the first place. Also, of course I'm not okay with random rapes and murders. Your solution is giving up and handing serial killers and pedophiles a facsimile of who they want to commit crimes against. I don't find that to be a solution but more of an admission of failure, and a frivolous use of technology as well.

I've had this discussion many times before, and I think we're just going to have to agree to disagree, as our perspectives are fundamentally different--at least when it comes to AI, robots and how to use humanoid robots, although I'd say we probably have some fundamental differences in POV when it comes to crime and how to address it as well.


----------



## BenevolentBitterBleeding (Mar 16, 2015)

Squirt said:


> I think we’re mixing up tools with friends.


I think 'we' can't admit that AI _is_ a 'better' communication partner vs. current plants. 



> Dependable for what? Again, are talking about tool use? Like providing reliable directions?


A fully functioning AI robot you own will be far more reliable than any human... It can be from the point of view of using it as a blunt instrument, or by way of it tending to your emotional needs (within its capability).

You could do a pro/con list of all your needs over your entire life, and assess how many humans you would need to replace one 'real' robot that's capable of emulating their functions.



> Computers break and malfunction and are poorly programmed all the time, too. Most code is shit and barely works half the time.


Yes, in fairness I made a slight mention of the possibility of them breaking down, or of how _they may not_ break down as easily vs. real people, who btw are just as prone to have their 'code' be - or turn to - shit; and we don't normally openly admit to forgoing possible connections because of others' faulty programming, so I don't see why that should stop us from making connections with AI. Also, poor programming is the fault of the human, not the machine.

For the sake of the thread though, I've been assuming whenever we talk about these AI/robots, we're discussing from a perspective that they've already reached a 'working' state. Arguing that they may break or may not be perfect only highlights what it is 'we' as individuals are expecting from them.

It's like saying, _Okay, if they meet all the criteria of 'my list', then they'd be acceptable for me._ Which incidentally serves to show how real a 'partner' they'd actually be.

But even if they do break down, that's natural. Just because things like our cars break down over time doesn't mean we use/cherish them any less; and 'tools' like that don't even currently give us the kind of companionship an AI might.

Heck, maybe that vulnerability and possibility of their kind of death make it _even more_ real. I mean, I already have a hard time parting ways with legitimately inanimate objects. I'm pretty sure I'd be _very_ sad having to let go of an AI I'd been bonding with for x amount of time.



> What do you think “real” is supposed to mean?


Is the question about whether or not/how AI is its own 'real' entity, or are we talking about my original post, by way of interactions with AI that produce 'real' effects, thus making the AI a 'real' companion?

If an AI provides one the ability to exhibit the same emotional responses as spending time with real humans, does that make what the human feels any less 'real'? Does that make their relationship with AI 'unreal'? As a 'real' person, what is 'real' to you outside of everything you feel towards things?



> The difference is a companion can make a choice whether or not to perform labor for you, and their entire existence is not built around that performance (which is why I also get creeped out by people treating animals like they're inanimate objects and then acting like it's companionship). You also have a choice to perform labor for them, and be appreciated for it.


Okay, but keep in mind I've never mentioned anything about 'free' will, so what does this have to do with:

- What I've originally posted
- The benefit(s) to person(s) of having AI they can communicate with
- Whether or not said AI is a better partner for a 'real' 'relationship' compared to a plant? (spoiler: they are)
And sure, we can say that we all make 'choices' about whether or not we want to perform 'labour' for others in a 'relationship', but is that _really_ the case?

Like, do you have a choice of whether or not to clean up the shit that your pet drops all over your place or in public? Do parents have the choice of providing for their children? Do we as people(without knowledge) have the choice of surviving in our modern world without money?

And it's not even like the examples are limited to things as extreme as the above. Sure you could say that none of the above 'needs' to be done, but by not doing them, you're already behind the imaginary AI - so what good would you be, even if you were 'real'?

Also, I think you could argue in some of the above examples(parent/child) that their entire existence actually does become built around their 'performance'.

Anyhow, I think the point here is that most of your/our relationships are give and take; and that breaks down to physical things you actually have to do in order to nurture them. Yes, maybe you can forgo one thing here or there, but ultimately, if you don't continue dispensing your 'labour', the relationship will collapse.



> So could a plant.


Yes, but a plant is still not 'better' than AI.



> Or even solitude for that matter.
> 
> “I find it wholesome to be alone the greater part of the time. To be in company, even with the best, is soon wearisome and dissipating. I love to be alone. I never found the companion that was so companionable as solitude.” -Henry David Thoreau


Ain't that the truth though...


----------



## recycled_lube_oil (Sep 30, 2021)

Scoobyscoob said:


> I explained myself well. I'd say my position is more a take on addressing the criminal before they commit a crime, rather than a position against realistic AI robots existing. When it comes to technology, I take a view that ethics should play a role in whether a certain technology should be pursued or not, and if unethical then how should it be limited. Creating robots to be maimed, murdered, raped, etc is a bandaid solution, IMO. Addressing the person who may have an urge to commit crimes removes that person from becoming a criminal in the first place. Also, of course I'm not okay with random rapes and murders. Your solution is giving up and saying give serial killers and pedophiles a similar facsimile to who they want to commit crimes against. I don't find that to be a solution but more of an admission of failure, and a frivolous use of technology as well.
> 
> I've had this discussion many times before and I think we're going to just have to agree to having a difference in opinion as our perspective is going to be fundamentally different. At least when it comes to AI, robots and how to use said humanoid robots, although I'd say we probably do have some fundamental differences in POV when it comes to crime and how to address that as well.


You do realise that death and destruction is not just the one thing the human race excels at (from the club to the nuclear bomb to chemical and biological warfare, no other species has perfected it as well as us). Death and destruction also sells. Even when people are not killing and maiming each other, it's one of the most entertaining things we have--look at how many people die in GoT, in movies. Even if people are not watching people being flayed, having their cocks chopped off, being hacked to death, being burnt alive, we have the video game industry. Millions of kids log on every night to multiplayer games where they shoot and kill their friends. Before video gaming was such a thing, kids played cowboys and Indians and parents bought them toy soldiers. I could go on.

Death and destruction is the pinnacle achievement of the human race.

Why would we not create robots we can maim, murder and rape? It seems a natural progression for our race.

I would be interested in an analysis being done of wikipedia, on the number of articles and views relating to warfare. I would imagine it is quite high. 

On that note, how long did it take to get Wikipedia to where it is? Let's say 10,000,000 hours. Sounds like a lot, right? How many hours does each country spend watching TV in total per week? I would imagine that the USA alone easily totals somewhere around 10,000,000 hours per week. So the human race could create something like Wikipedia every week and probably progress technology in a useful way, but instead we choose to watch people dying on TV because it's entertaining. Never mind the hours people spend killing other people in video games, since killing is more entertaining than progressing society.


----------



## Squirt (Jun 2, 2017)

For brevity, I'm not responding to every comment.



BenevolentBitterBleeding said:


> A fully functioning AI robot you own, *will be far more reliable compared to any human*... It can be from the point of view of using it as a blunt instrument, or by way of it tending to your emotional needs(within its capability).


I guess I could say my screwdriver is more reliable than a human, too, if we're shooting for the lowest bar.



BenevolentBitterBleeding said:


> You could do a pro/con list of all your needs over your entire life, and assess how many humans you would need to replace one 'real' robot that's capable of emulating their functions.


"All your needs" doesn't indicate needs that a human would fulfill. Like, I need water. That doesn't require a human. I do need a human to feel a sense of belonging and love from another person. Would I survive with a robot simulating those things? Sure. But I expect there'd be a hole in my heart somewhere, knowing that it's make believe, and at best would be an act of desperation borne out of extreme loneliness.








BenevolentBitterBleeding said:


> Yes, in fairness I made a slight mention to the possibility of them breaking down, or how _they may not_ breakdown as easily vs. real people, who btw are just as prone to have their 'code' be - or turn to - shit; and we don't normally openly admit to forgoing possible connections because of others' faulty programming, so I don't see why that should stop us from making connections with AI. Also, poorly programmed is the fault of the human, not the machine.
> 
> For the sake of the thread though, I've been assuming whenever we talk about these AI/robots, we're discussing from a perspective that they've already reached a 'working' state. Arguing that they may break or may not be perfect only highlights what it is 'we' as individuals are expecting from them.


So, you're okay with the faulty technology humans build, but not with humans themselves? Some cognitive dissonance going on there.

I think we have some fundamental differences in how we view the capabilities of computer technology. I'm unsure why people wish to elevate it to the level of replacing humans - the result I see in the attempt is humans dumbing themselves down to the level of their machines.



BenevolentBitterBleeding said:


> Heck, maybe that vulnerability and possibility of their kind of death make it _even more_ real. I mean, I already have a hard time parting ways with legitimately inanimate objects. I'm pretty sure I'd be _very_ sad having to let go of an AI I'd been bonding with for x amount of time.





BenevolentBitterBleeding said:


> If an AI provides one the ability to exhibit the same emotional responses as spending time with real humans, does that make what the human feels any less 'real'? Does that make their relationship with AI 'unreal'? As a 'real' person, what is 'real' to you outside of everything you feel towards things?


You're using the robot, not engaging in a real relationship, because you know it isn't functioning the same way you are. Even if you can conjure up the same feelings you might have for another person, and pretend you have a relationship that way, it doesn't mean you're in a relationship with the robot the same way you'd be with a human.



BenevolentBitterBleeding said:


> Okay, but keep in mind *I've never mentioned anything about 'free' will*, so what does this have to do with:
> 
> What I've originally posted
> The benefit(s) to person(s) of having AI they can communicate with
> Whether or not said AI is a better partner for a 'real' 'relationship' compared to a plant?(spoiler, they are)


So, you can't get around that part, can you?



BenevolentBitterBleeding said:


> And sure we can say that we all make 'choices' whether or not we want to perform 'labour' for others in a 'relationship', but is that _really_ the case?


Yes. That is what makes being in a relationship so much higher stakes. It sounds to me that you want to use AI to substitute relationships because it would be less emotionally risky. Will this person use me? Will they abandon me? Are they being honest with me? Do they really love me?

You don't have to ask those questions of a robot built for your every need. You know you're in control. Is that an ideal for “a relationship”?

I'd agree with @Scoobyscoob that I'd rather see the problems of loneliness, poor attachment, abuse and social isolation addressed directly rather than bypassed - which might be a never-ending project, because suffering will always exist as long as we do. I don't think AI robots simulating humans will make that go away; they'll compound the problem further, as indicated by the current effects of social technology.

Occam's razor (ish) might apply here, which is why I was using the example of the plant that you got really stuck on. Don't go for the most complicated solution of building a super-advanced fantasy AI to make sure Timmy has a friend, rather than just letting Timmy go outside and play with the kid down the road. If this technology did exist, I could immediately see it advertised as a "safe" alternative to interaction, just like television and then the internet (some folks might not be old enough to remember when parents thought the internet was safer than going outside, lol). I see far more bad outcomes than good.

Luckily, I don't think it's possible for AI to reach that level in the time we have. We can certainly continue to dumb ourselves down to simplistic interactions with the primitive computers we build, but I wouldn't say that's desirable.



BenevolentBitterBleeding said:


> Yes but plant still not 'better' than AI.


I'd prefer a plant. Trees are nice. I feel more comfort leaning against a tree than I would a machine. I think it has to do with grounding - being reminded of the solidness of a world outside my purview; a world _not_ built for me. It takes so much pressure off the perceived "importance" of my existence.



BenevolentBitterBleeding said:


> Ain't that the truth though...


Word.


----------



## WickerDeer (Aug 1, 2012)

I actually already try to say "please" and "thank you" to Siri each time I ask her to do something.

The reason for this is that I think it's a habit--if I get more in the habit of being polite to Siri, I'll more likely be polite to humans, and if I get in the habit of ordering Siri around or being a bitch to her, I'll get more in the habit of doing that to other people.

It's also the other reason why I don't like the idea of using AI for sadism and things--I really think it's more dangerous for people, because people can get addicted to anger and violence. For example, people in the douchebag triad tend to start out with animals and then move on to humans, and I have no reason to believe they wouldn't do the same with AI. Their behavior isn't a pressure they need to release; it's a negative behavior they will repeat more if they can justify it.

That being said, I'm sure the discussion is more nuanced than that and I'm not a psychologist or criminologist or whatever.

For me, it's simple:

AI robots won't rape you. You don't have to take that risk with them--and I could have one in my house assisting me with things when a real human can't be expected to just drop their life and give me therapy for free every day.

I can't afford therapy and I'd like it. So an AI bot who could do journal/therapy prompts wouldn't be as good as a real therapist, but it'd probably be better than nothing.

As for the dog thing--I feel weird saying I'd like an AI dog to scare bad guys, but I wouldn't adopt a dog JUST as a guard dog--I'm not unethical (or I try not to be) and I wouldn't even train a dog to attack people or anything, but I'm just aware that people are less likely to harass a lone woman out in nature if she has an intimidating dog with her than not. I would prefer a real dog, but the rental market doesn't work that way for many of us who aren't wealthy and it's irresponsible to get a dog when you don't have secure housing (or it can be).

It's technology--as for talking to a plant, it wouldn't be as useful because a plant can't do prompts. Like a plant can't ask you to remember one thing you are grateful for and then ask you to remember how it makes you feel. It can't ask you what your goals are for the day and what you are going to start first. Some people might not need this kind of coaching, but I struggle with focus sometimes and having a reminder to keep on task could help me.

It could possibly even detect the tone of my voice or a pattern and notice if it seemed like I was feeling anxious/sad etc. and then ask a prompt in reaction, helping me to notice my mood before it gets worse and helping me to focus on something more beneficial to me than ruminating.
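A bot like that wouldn't need anything exotic, at least for a first pass. Here's a rough sketch of the idea (every name, prompt, and word list below is made up for illustration, not a real product): a journaling assistant that picks its next prompt from a crude mood estimate of what you just wrote or said.

```python
# Hypothetical sketch of a journaling/CBT prompt bot. It guesses a mood
# from a tiny negative-word list and routes to a matching prompt.
# A real tool would use a proper sentiment model (or voice-tone analysis),
# not a hand-rolled word list like this.

NEGATIVE_WORDS = {"anxious", "sad", "worried", "angry", "tired", "stressed"}

GRATITUDE_PROMPT = "Name one thing you're grateful for. How does it make you feel?"
FOCUS_PROMPT = "What are your goals for today, and which will you start first?"
CHECKIN_PROMPT = "You sound a bit down. Want to talk through what's on your mind?"

def estimate_mood(entry: str) -> str:
    """Very crude mood check: any negative word in the entry means 'low'."""
    words = {w.strip(".,!?").lower() for w in entry.split()}
    return "low" if words & NEGATIVE_WORDS else "neutral"

def next_prompt(entry: str, morning: bool = True) -> str:
    """Pick the next journaling prompt based on mood and time of day."""
    if estimate_mood(entry) == "low":
        return CHECKIN_PROMPT   # notice the mood before it gets worse
    return FOCUS_PROMPT if morning else GRATITUDE_PROMPT

print(next_prompt("Feeling pretty anxious about rent today"))
print(next_prompt("Slept well, ready to go", morning=True))
```

The point is just that the "coaching" loop described above (gratitude prompt, goal prompt, mood-triggered check-in) is a routing problem more than an intelligence problem; the hard part is detecting mood well, not asking the questions.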

So anyway, I really think there are good possibilities with it, but I think most incels would benefit more from therapy and a therapist AI bot than a sexbot tbh. But I know the technology's probably going to go more to what people are willing to pay for, and we already know there are way more people willing to pay for prostitution and other dumb stuff than to try to make the world a better place...or at least it seems that way in capitalist societies.

I didn't watch the video. I also tend to cry a lot at AI movies about robots, but I don't really know how good it is to want to have a sex slave...I don't think it's as essential as a therapist or being able to walk safely outside in nature. But it will probably be the first direction this technology goes towards.

I also do think it's sad that they want to use AI bots for elderly people and people who need social connection--because tbh I think we should just prioritize them more and have real people visiting them, because real people can benefit from talking to the elderly too. But at least it's filling a need, even if it's something people should be doing themselves and probably would be doing if they had the time and organization. I know I would. Or you could like...combine elderly people with abandoned pets so that at least both living creatures could have companionship, rather than an AI bot.

But for people like me who are otherwise functional but could benefit from therapy or a coach, I think it'd be useful. Or perhaps as a supplement to therapy--see a real therapist once a week, but have an AI to help check in through the week, since most people don't have the capacity to be therapists.

I don't think there's anything wrong with having sex with one but it could teach people to treat others that way--after all, that's what children who play with dolls are doing. They are playing in order to learn how to interact, or playing out their imaginations...but dolls aren't supposed to replace real people to them. Dolls also TEACH the child--if you have a doll that cries and needs a bottle, it teaches the child to care for a baby. If you have a doll that you are expected to you know...do some sadistic sex act with, it probably also teaches people to do sadistic sex acts in some way...though I guess we could say the same about video games, and it doesn't seem to be a problem with video games?


----------



## Squirt (Jun 2, 2017)

WickerDeer said:


> I actually already try to say "please" and "thank you" to Siri each time I ask her to do something.
> 
> The reason for this is that I think it's a habit--and I do think that if I get more in the habit of being polite to Siri I will more likely be polite to humans, and if I get more in the habit of ordering Siri around or being a bitch to her, I'll get more in the habit of doing that to other people.
> 
> ...


Your doll analogy is a good one.

I think it is important to understand the limitations of technology (as a tool) and also define where it is useful and where it isn’t, both conceptually and practically.

The purpose of keeping plants and using a digital therapeutic tool to generate ideas for processing thoughts and feelings would be different, and not really comparable (or mutually exclusive).

Dr. David Burns, a CBT therapist at Stanford, is working on a therapeutic app. I participated in one of the beta tests for fun and to see what he was developing. That application could be very useful.

If we’re talking about a desire for _human_ _companionship_, which is what romantic relationships tend to be, I don’t see how any inanimate object would _best_ fill that desire.

This question was explored in controversial research by Harry Harlow through the 1950s, where infant rhesus macaques were raised with inanimate mothers. He found the infants grew up with marked cognitive and social deficits and difficulty forming appropriate bonds with other macaques. That is the baseline to start from. What level of imitation must an artificial mother reach before you wouldn't see deficits? How much harder is it to "build" that condition rather than have a real mother?

A good point you made was how, if the problem is a lack of companions, two people in a relationship means two fewer lonely souls, whereas a robot-person relationship would help only one.


----------



## Scoobyscoob (Sep 4, 2016)

recycled_lube_oil said:


> You do realise that death and destruction is not only the one thing the human race excels at (from the club to the nuclear bomb to chemical and biological warfare, no other species has perfected it as well as us). Death and destruction also sells. Even if people are not killing and maiming each other, it's one of the most entertaining things we have--how many people die in GOT, in movies. Even if people are not watching people being flayed, having their cock chopped off, being hacked to death, being burnt alive, we have the video game industry. Millions of kids log on every night into multiplayer games where they shoot and kill their friends. Before video gaming was such a thing, kids played cowboys and Indians, parents bought them toy soldiers; I could go on.
> 
> Death and destruction is the pinnacle achievement of the human race.
> 
> ...


Are you a criminal? I explicitly said that I'm supportive of removing the urge to commit crimes in someone who is predisposed to committing murder or raping someone, especially a child. I'm not sure if I wasn't being clear enough, but I think such criminals don't deserve the same rights as everyone who isn't such a severe criminal. Creating humanoid robots to indulge in some power fantasy is a waste of time and resources, IMO.

Trying to lecture me about how dumb human nature is is also beside the point. Humans are frail creatures that lash out in violence out of fear or a sense of powerlessness. Rational violence, or as most people would call it, evil, isn't all that common, and it comes with its own sense of self-defeating folly. Humans have also created medicines that save lives, technology that enriches existence, and accumulated knowledge that may move humanity past a primitive state of being. If you think people are much more prone to slavishly entertaining themselves than to producing anything, I'd say that says more about you than anyone else.


----------



## Scoobyscoob (Sep 4, 2016)

Squirt said:


> I'd agree with @Scoobyscoob that I'd rather see the problem of loneliness and poor attachment and abuse and social isolation addressed directly rather than bypassed - which might be a never-ending project because suffering will always exist as long as we do. I don't think AI robots simulating humans will make that go away but compound the problem further, as indicated by the current effects of social technology.


I was saying something else, but I agree. Address the actual problems or problem individuals rather than trying to find band-aid solutions or trying to bypass the issues.


----------



## Glittris (May 15, 2020)

My personal belief is that A.I. can never reach consciousness; I do not believe in love-droids.

I will therefore never screw a meat-droid either, begone Satan!


----------



## Kazuma Ikezawa (Oct 21, 2011)

Scoobyscoob said:


> Ideal waifu or husbando who can be whoever you want her or him to be. I think it's just weird but a lot of people think it's the future of relationships, especially for people with no hope or interest being in a relationship with a real person. 🤷‍♂️
> 
> I don't really get it, but I don't really understand inceldom either. So eh. 🤷‍♂️


I actually don't have a problem with people having relationships with artificial intelligence bots, although I think that it is not ideal. I was making a joke. A capitalized "i" looks like a lower case "L" so I was acting as if the OP was talking about the future of romantic relationships with some guy named Al.


----------



## Scoobyscoob (Sep 4, 2016)

Kazuma Ikezawa said:


> I actually don't have a problem with people having relationships with artificial intelligence bots, although I think that it is not ideal. I was making a joke. A capitalized "i" looks like a lower case "L" so I was acting as if the OP was talking about the future of romantic relationships with some guy named Al.


I guess that makes sense, hahah. I don't think someone can have a real relationship with a non-living being. In psychology that's called personification and if it's severe enough to affect your life, like eschewing a human relationship and trying to form a relationship with a robot would be, then that would be considered a mental illness.


----------



## Hexigoon (Mar 12, 2018)

I think it appeals to me because it'd be designed to understand me and I could speak as truthfully as I want with it. That's probably what I want most of all in a relationship, yet it's very hard to find that with people. If it imitated a human realistically like in that movie _Her_ then yeah, I can see that being something I could fall for. It's "artificial" but so is a fictional story, and I can certainly feel something for fictional characters, so why not an AI that has even more potential to make one feel, because it's interacting with you personally? It's also better than being lonely all the time - and who knows, maybe it could help one's sociability if you have "someone" to talk to. I always thought learning a language would be so much easier if you had a smart AI as a communication partner.

Also it seems like an LDR in the way it's presented, which I desire the low-maintenance of. I wish that desire was more mutually felt.
I'm not thinking much in terms of like an actual physical robot you have sex with. Don't really appreciate being lumped into the incel group but I won't dwell on it. If I really wanted sex then a human would suffice, but mentally I want something that often feels lacking with people.


----------



## BenevolentBitterBleeding (Mar 16, 2015)

Squirt said:


> For brevity, I'm not responding to every comment.


Which, btw, is something an AI would be more than happy to entertain, probably for 100+ years.



> I guess I could say my screwdriver is more reliable than a human, too, if we're shooting for the lowest bar.


Look at the sentence you're quoting, and explain why you are selectively choosing to stick only with the purpose of an AI as a tool. One specific reason for having an AI in your life doesn't mean that _that_ has to be the _only_ reason.

I originally gave you an example of how it could be used for other purposes _outside_ of COMMUNICATING with, e.g. it being able to carry groceries for you. And since then 'we' have already established that its abilities could far exceed that. Which, btw, my original post already touched on.

So again,

AI is better than a plant. Yes or no?
AI has the capability to be more dependable than a human in various ways. Yes or no?
Your needing to keep moving goal posts around is in your human nature. Yes or no?



> "All your needs" doesn't indicate needs that a human would fulfill. Like, I need water. That doesn't require a human.


Doesn't it? How do you get your water, do pray tell?



> I do need a human to feel a sense of belonging and love from another person. Would I survive with a robot simulating those things? Sure. But I expect there'd be a hole in my heart somewhere, knowing that it's make believe, and at best would be an act of desperation borne out of extreme loneliness.


You're entitled to your opinions and how you FEEL. Now go reread my original post, and the last post I made to you regarding a person's reality based on how they FEEL.



> So, you're okay with the faulty technology humans build, but not humans themselves?  Some cognitive dissonance going on there.


Did I say that I personally wasn't okay with faulty humans? Maybe you need to get your eyes checked? 

_*"and we don't normally openly admit to forgoing possible connections because of others' faulty programming, so I don't see why that should stop us from making connections with AI. "*_

And if you are even capable of understanding the above, you'd maybe even realize that people every day _do_ make the decision to forgo relationships with humans due to their 'faulty programming'. Not rich enough? No good. Not good looking enough? No good. Not smart enough? No good. Exhibits mental instability? No good. Has any type of disability? No good. Has a health risk? No good. How long should I carry on this list? No good.

Please, for the sake of brevity in the entire interaction, try to understand what it is you're even quoting.



> I think we have some fundamental differences in how we view the capabilities of computer technology. I'm unsure why people wish to elevate it to the level of replacing humans - the result I see in the attempt is humans dumbing themselves down to the level of their machines.


Okay, that is your opinion and you have a right to choose whether or not you'd want an AI partner.

Now go reread my original post about how humans interact currently and try to realize how much of what you've said is exactly what I was referring to with regards to the evolution of our current ways in communicating.



> You're using the robot, not engaging in a real relationship, because you know it isn't functioning the same way you are. Even if you can conjure up the same feelings you might have for another person, and pretend you have a relationship that way, it doesn't mean you're in a relationship with the robot the same you'd be in with a human.



Please explain what it means to have a 'real' relationship?
If an AI provides everything a person is looking for, what more 'needs' need be satisfied to make it 'real' for the acquirer of the AI?
If a person is happy with the needs the AI is able to fulfill, where's the problem?



> So, you can't get around that part, can you?


Why should I need to? Your original post quoted me, and you have since only moved goal posts around.

Again, what does free will have to do with what I've posted; and yes or no, would an AI partner be more beneficial to a person compared to a plant?



> Yes. That is what makes being in a relationship so much higher stakes. It sounds to me that you want to use AI to substitute relationships because it would be less emotionally risky. Will this person use me? Will they abandon me? Are they being honest with me? Do they really love me?


Um, I don't know if you're talking about ME specifically or...? Everything I've written, is about how/why I see merit in having an AI in a person's life. _Do I_ want to substitute _my_ relationships with an army of AI? That's not something I've thought about really, and doesn't matter to the question of whether or not having an AI can enrich a person's life compared to: a human; a plant; an animal pet; a tool; _anything else_ you want to add in here...

Would I get an AI for myself? If it was within my means, 100% yes. I can enrich my life tremendously by acquiring more 'things'. In this case it's not a black or white scenario where I can only choose to have one and not the other.



> You don't have to ask those questions of a robot built for your every need. You know you're in control. Is that an ideal for “a relationship”?



A pet owner is in control of their relationship with their pet. Is that ideal?
A parent is in control of their relationship with their child. Is that ideal?
Maybe you're unable to get past trying to prove some point that still seems ill-defined because it doesn't actually matter to anything I've said? Like, look at what you've written. The robot is built for your ... So what's the issue?

Serious question though, do you have any problem with using all the various readily available current technologies built for a specific purpose to fulfill 'our' needs?

So what's the problem with using an AI/robot? Because it speaks? Because it has a shell? A face? Its own... personality?



> Occam's razor (ish) might apply here, which is why I was using the example of the plant that you got really stuck on. Don't go for the most complicated solution of building a super-advanced fantasy AI in order to make sure Timmy has a friend rather than just letting Timmy go outside and play with the kid down the road. If this technology did exist, I could immediately see it advertised as a "safe" alternative to interaction, just like with television and then the internet (some folks might not be old enough to remember the time when parents thought the internet was safer than going outside, lol). I see far more bad outcomes than good.


Uh, right. Because it's soo easy for anybody to just go out and interact with others right? How many different individuals/peoples/groups are ostracized and not given a chance to begin with due to prejudice and/or bias? Do you know how long of a list I could make here?



> ...I don't think AI robots simulating humans will make that go away but compound the problem further, as indicated by the current effects of social technology...
> 
> ...Luckily, I don't think it is possible for AI to reach that level in the time we have. We can certainly continue to dumb ourselves down to simplistic interactions with the primitive computers we build, but I would not say that is desirable.


Congratulations, you've caught up to my original post.


----------



## Squirt (Jun 2, 2017)

Hexigoon said:


> I think it appeals to me because it'd be designed to understand me and I could speak as truthfully as I want with it. That's probably what I want most of all in a relationship, yet it's very hard to find that with people. If it imitated a human realistically like in that movie _Her_ then yeah, I can see that being something I could fall for. It's "artificial" but so is a fictional story, and I can certainly feel something for fictional characters, so why not an AI that has even more potential to make one feel, because it's interacting with you personally? It's also better than being lonely all the time - and who knows, maybe it could help one's sociability if you have "someone" to talk to. I always thought learning a language would be so much easier if you had a smart AI as a communication partner.
> 
> Also it seems like an LDR in the way it's presented, which I desire the low-maintenance of. I wish that desire was more mutually felt.
> I'm not thinking much in terms of like an actual physical robot you have sex with. Don't really appreciate being lumped into the incel group but I won't dwell on it. If I really wanted sex then a human would suffice, but mentally I want something that often feels lacking with people.


Yet somehow he still had Tawny Kitaen flopping around his car…


----------



## Squirt (Jun 2, 2017)

BenevolentBitterBleeding said:


> Which, btw, is something an AI would be more than happy to entertain, probably for 100+ years.


lol, I hope you’re not serious. I was thinking of your time, energy, and focus as well as mine. Unless you're having fun. Please, have fun. I'd like us to have fun.



BenevolentBitterBleeding said:


> Look at the sentence you're quoting, and explain why you are selectively choosing to stick only with the purpose of an AI as a tool. One specific reason for having an AI in your life doesn't mean that _that_ has to be the _only_ reason.


Because it is a tool? You can call it whatever you like, but that is what it is.



BenevolentBitterBleeding said:


> I originally gave you an example of how it could be used for other purposes _outside_ of COMMUNICATING with, e.g. it being able to carry groceries for you. And since then 'we' have already established that its abilities could far exceed that. Which, btw, my original post already touched on.


I'm not sure why carrying groceries is so much more impressive than turning a screw.

Much of your original post was based on an idea for technology that doesn't exist and probably never will. So, I'm looking at the conceptual possibilities and purpose for developing such a technology, what that implies about the human condition, and more feasible ways to solve the underlying needs that bring about such fantasies.



BenevolentBitterBleeding said:


> So again,
> 
> AI is better than a plant. Yes or no?


I’ve already explained the reason I brought up the plant: there are simpler and easier ways to accomplish the apparent benefits that don’t require replacing human company with artificial intelligence. _For example,_ if you’re just looking for something to talk at, that won’t argue with you or say something you don’t like, a plant can do that just as well as AI, and is much less expensive and also better for the environment. That’s pretty much it for why I mentioned a plant.



BenevolentBitterBleeding said:


> AI has the capability to be more dependable than a human in various ways. Yes or no?


Depends on what you want to accomplish and if AI is an adequate tool for it. Specifically for romantic relationships....

Dependable... is a word.



BenevolentBitterBleeding said:


> Your needing to keep moving goal posts around is in your human nature. Yes or no?


The reason it appears to you that goal posts are being moved is because I’m considering many angles of the topic rather than "hitting goals". I’m approaching you with an open and somewhat casual discussion, not a formal debate. I guess you can try to argue with your future AI, but it would confirm your beliefs rather than challenge them, as I understand.



BenevolentBitterBleeding said:


> Doesn't it? How do you get your water, do pray tell?


Maybe I should get into a relationship with my water kettle. Or the water pump on the well-house. That might be pretty hot.



BenevolentBitterBleeding said:


> Did I say that I personally wasn't okay with faulty humans?


I'm assuming not being okay with the faultiness of humans is part of that drive towards developing a superior AI to interact with. Since that is created by humans, it is necessarily also faulty, especially if emulating humans. This is _impersonally_ evaluating the inconsistency of expectations (that is why I mentioned cognitive dissonance).



BenevolentBitterBleeding said:


> And if you are even capable of understanding the above, you'd maybe even realize that people every day _do_ make the decision to forgo relationships with humans due to their 'faulty programming'. Not rich enough? No good. Not good looking enough? No good. Not smart enough? No good. Exhibits mental instability? No good. Has any type of disability? No good. Has a health risk? No good. How long should I carry on this list? No good.


These are social judgements. What does AI solve here? So far, AI has become efficient for amplifying discrimination practices (real AI, not fantasy AI).



BenevolentBitterBleeding said:


> Okay, that is your opinion and you have a right to choose whether or not you'd want an AI partner.
> 
> Now go reread my original post about how humans interact currently and try to realize how much of what you've said is exactly what I was referring to with regards to the evolution of our current ways in communicating.


Are you referring to this?

"At this point it seems pretty normal, as well as inevitable. Like, if you think about the psychology behind online social personas/interactions, people already are conditioned to or are looking for reinforcement of their thoughts/beliefs/feelings/whatever from those they interact with _or where_ they interact - whether that be online or irl. Sure you can argue that we are at least seeking these 'comforts' from living, breathing creatures, but does it matter much when we're just receiving a kiss ass/'fronted' versions of a person?"

If I were to paraphrase: "People can be shallow and one-sided anyways, so might as well go with it."

The conditioning you describe is detrimental and should be mitigated, not encouraged... right?



BenevolentBitterBleeding said:


> Please explain what it means to have a 'real' relationship
> If an AI provides everything a person is looking for, what more 'needs' need be satisfied to make it 'real' for the acquirer of the AI?
> If a person is happy with the needs the AI is able to fulfill, where's the problem?


I've already answered those questions.



BenevolentBitterBleeding said:


> Why should I need to? Your original post quoted me, and you have since only moved goal posts around.
> 
> Again, what does free will have to do with what I've posted; and yes or no, would an AI partner be more beneficial to a person compared to a plant?


You asked the difference between a real and fake relationship (and asked again just above). That is one aspect of the difference. Why are you avoiding it?

Also, just to be clear, I'm not claiming talking to a plant is having a real relationship with it. 

Cars are popular, too:

Man has intimate relationship with his car on 'My Strange Addiction' (today.com)

Is a car a step up from a plant? We can evaluate getting cozy with cars instead of plants if you like.



BenevolentBitterBleeding said:


> Um, I don't know if you're talking about ME specifically or...? Everything I've written, is about how/why I see merit in having an AI in a person's life. _Do I_ want to substitute _my_ relationships with an army of AI? That's not something I've thought about really, and doesn't matter to the question of whether or not having an AI can enrich a person's life compared to: a human; a plant; an animal pet; a tool; _anything else_ you want to add in here...
> 
> Would I get an AI for myself? If it was within my means, 100% yes. I can enrich my life tremendously by acquiring more 'things'. In this case it's not a black or white scenario where I can only choose to have one and not the other.


Well, yeah. Just having another toy is a thing, and as we've seen, humans have a propensity for developing fetishes for inanimate objects. However, your first statements implied that it would be a _better substitute_ for human interaction because humans are shitty and whatever. For you or anyone else. Is that not what you meant? From your first post:

"I think people might say despite that sort of cynical view, 'real' is still better than robot,* but I don't know if I truly agree*; because _real_ robots(not those fake front humans) are far more likely to be a)'loyal' and b) not 'breakdown' in x respective number of ways you can think of(physically, emotionally, something else...) "

That is where I say it seems like a need for some kind of emotional security. Why else would you be concerned about loyalty or breakdowns or "non-fakeness"? So why don't we look at the emotional security problem?



BenevolentBitterBleeding said:


> A pet owner is in control of their relationship with their pet. Is that ideal?
> A parent is in control of their relationship with their child. Is that ideal?
> Maybe you're unable to get past trying to prove some point that still seems ill-defined because it doesn't actually matter to anything I've said? Like, look at what you've written. The robot is built for your ... So what's the issue?


I’ve already mentioned my thoughts about pets. I do think there can also be weirdness between a parent and child sometimes. Generally, the goal of the parent is to ensure the child is able to become independent as they age. They'll die if they aren't "under control" to some degree, and their dependency is temporary. Hopefully, the dependent state is not for the _sole pleasure of the parent_. Some parents do have children because they think of them as little dolls or want someone who will be forced to love them. I wouldn't say that is a good way to relate to a child. There are many instances of questionable relationships that occur where one party controls the other for their own benefit, which is unfortunate, and not to be advocated as a healthy form of relating to others, imo.

Therefore, please answer: Do you think it is "enriching" or "better" to have a companion simulating a human, which is under your control so as to be "reliable"? Or do you anticipate AI developers would build in a rebellious streak to avoid ethical questions... yet, wouldn't that defeat the purpose of "reliability"? 

Disclaimer: This is not a goal-post shift, but a potential use of AI human substitution that I believe is detrimental and that I think is interesting to discuss with you. It's _an issue._



BenevolentBitterBleeding said:


> Serious question though, do you have any problem with using all the various readily available current technologies built for a specific purpose to fulfill 'our' needs?


Sometimes. Lots of details there. There is evidence some of these technologies and how they are used are making people's lives worse in various ways. I believe this is true in my own experience, as well. That is why it is valuable to make these considerations of new technology carefully.



BenevolentBitterBleeding said:


> So what's the problem with using an AI/robot? Because it speaks? Because it has a shell? A face? Its own... personality?


I think we covered this pretty well already. Using _for what_ is the question. Believing it can adequately substitute (healthy) human relationships in a holistic sense is basically delusional, and additionally I'd rather put energy and resources into "developing" better humans to address unhealthy relationships rather than AI substitutions that will never compare. Like, Wickerdeer touched on potential uses of AI to facilitate mental health that had nothing to do with emotionally bonding with the AI.

If you're just talking about having another toy to play with, then whatever. Like I said, give that lady her dumb robot dog to tuck under her arm and carry around like a doll, if you must. Hexigoon's post was about romantic relationships, though, which seems a bit more serious intention of an application.



BenevolentBitterBleeding said:


> Uh, right. Because it's soo easy for anybody to just go out and interact with others right? How many different individuals/peoples/groups are ostracized and not given a chance to begin with due to prejudice and/or bias? Do you know how long of a list I could make here?


See point above.



BenevolentBitterBleeding said:


> Congratulations, you've caught up to my original post.


Now if only you could catch up to mine, snarkypants.


----------



## DOGSOUP (Jan 29, 2016)

Hexigoon said:


> I think it appeals to me because it'd be designed to understand me and I could speak as truthfully as I want with it. That's probably what I want most of all in a relationship yet it's very hard to find that with people. If it imitated a human realistically like in that movie _Her_ then yeah, I can see that being something I could fall for. It's "artificial" but so is a fictional story and I can certainly feel something for fictional characters, so why not an AI? that has even more potential to make one feel because it's interacting with you personally. It's also better than being lonely all time - and who knows, maybe it could help one's sociability if you have "someone" to talk to. I always thought learning a language would be so much easier if you had a smart AI as a communication partner.
> 
> Also it seems like an LDR in the way it's presented, which I desire the low-maintenance of. I wish that desire was more mutually felt.
> I'm not thinking much in terms of like an actual physical robot you have sex with. Don't really appreciate being lumped into the incel group but I won't dwell on it. If I really wanted sex then a human would suffice, but mentally I want something that often feels lacking with people.


Hmm, I always wanted a daemon, like in His Dark Materials, but I never really figured out what need it actually serves. Maybe I thought it would help me know myself better while still being a part of me not others - because other people don't understand shit about me lol. But then the daemons also interacted with other daemons, that is kinda like some technological appliance except cute and furry. Maybe I should just get a Tamagotchi again. But they should improve it so they start giving moral advice or something.


----------



## shameless (Apr 21, 2014)

I have zero interest in promoting the use of AI as an outlet for pedophiles & rapists. I barely ever agree with Scooby, but I also think they should just be castrated, etc., as was listed among the alternative options. I do not think they should get a doll to molest, as Wicker pointed out. Yuck, and fucken WEIRD

Where I personally saw an appeal: I tend to best process grief either written/typed or verbally. Because that is a big part of how I process and cope in therapy, I could be inclined to use that kind of AI concept for that type of use. I'd find it practical and similar to a therapy session. And honestly, a lotta therapists can be robotic anyways lol. So considering therapy is just a lotta talking to get to revelations, I could see myself benefitting there.

As far as trying to supplement my personal relationships, whether it be mourning, intimacy, friendship, etc.: I do not think I would find it very fulfilling personally. Eh, I am too sensory/sensual to be getting off to a robot emotionally with romantic intimacy, even if it were able to stimulate me physically. Really, I am not huge on vibrators anyways, so I cannot see myself enjoying a robot; I can just use my hand or a dildo. I think it is because I am so sensually driven that I'd really struggle with a robot mimicking, lol. I do not even care for humans who sexually mirror, where it feels robotic. Add an actual robot? No thanks.

To be honest, I'd feel kinda sad if the tide turned to this. I have already seen a really large change in dating over the last 5 years, after everything switched to swipe culture. Before that it was more like a classified listing where you could look through people. People were still more personable, and often made actual attempts to give a shit. Now everyone is wired to decide yes or no based on a knee-jerk image, and maybe you match, and maybe they filled out a profile. So considering how grim it already is out there, if this went more mainstream I think the level of disconnect would just be really unfortunate.


----------



## recycled_lube_oil (Sep 30, 2021)

So I woke up and have seen that this thread is growing. Just a few points, though I'm not who you were replying to.



> lol, I hope you’re not serious. I was thinking of your time, energy, and focus as well as mine. Unless you're having fun. Please, have fun. I'd like us to have fun.


An AI with sufficient Natural Language Processing (NLP) could easily keep us entertained for a long time. It would know what we respond to and what makes us tick; it would never grow bored and never go offline.

Who knows, we could even have AI-controlled media: an AI that writes articles and publishes them, using language analysis to persuade and manipulate people. The language processing is not there yet, but the platforms it would need are. It would just need APIs to hook up to Facebook, Twitter, Reddit, etc., plus some sort of scraping functionality to analyse the rest of the media. Then off it goes, posting, analysing people's responses, and seeing what it can manipulate in the real world via online activity.

The future is going to be very interesting. 



> Because it is a tool? You can call it whatever you like, but that is what it is.


At what point does a tool stop being a tool? If people use an AI as more than a tool, is it still a tool? One man's tool is another man's companion.





Squirt said:


> Depends on what you want to accomplish and if AI is an adequate tool for it. Specifically for romantic relationships....
> 
> View attachment 902972
> 
> ...


I am just going to leave what I reckon is one of the most powerful pieces from T2 here:






What if I were to say a machine could be more dependable than a human? What if kids stood a better chance being raised by AI parents/carers?

What if we didn't have humans and AI? What if there was just software and wetware, meatsacks and robot exoskeletons?

*God creates dinosaurs, God destroys dinosaurs. God creates Man, man destroys God. Man creates AI. AI destroys man*


----------



## recycled_lube_oil (Sep 30, 2021)

0.M.I.A.0 said:


> I have zero interest in promoting the use of AI as an outlet for pedophiles & rapists. I barely ever agree with Scooby, but I also think they should just be castrated, etc., as was listed among the alternative options. I do not think they should get a doll to molest, as Wicker pointed out. Yuck, and fucken WEIRD


I'm a bit 50/50 on this one. If paedophiles and rapists are caught, then yes, castration and prison time at the very least. However, where I do think it may be useful is for people who know they are paedophiles and are aware it is wrong. If they can have a way to battle the urges, it may prevent them from committing a crime in the future.


----------



## shameless (Apr 21, 2014)

recycled_lube_oil said:


> I'm a bit 50/50 on this one. If paedophiles and rapists are caught, then yes, castration and prison time at the very least. However, where I do think it may be useful is for people who know they are paedophiles and are aware it is wrong. If they can have a way to battle the urges, it may prevent them from committing a crime in the future.


I personally think it would increase their desire to act on it. However, I understand why you suggest it could deter. I am not in any way well versed in psychology research on pedophiles specifically, so anything I am inclined to think is just my meandering opinion; by all means, who knows, you could be right.

I sorta equate the suggestion of them exercising it on a dummy to how a serial killer often starts with dolls as a child, then moves on to animals, and later to people. And then they often increase the rate/volume.

I understand why you suggest it could potentially divert the behavior. If I were confident it would truly divert the behavior rather than escalate it, I'd support it.


----------



## DOGSOUP (Jan 29, 2016)

recycled_lube_oil said:


> *God creates dinosaurs, God destroys dinosaurs. God creates Man, man destroys God. Man creates AI. AI destroys man*


Woman inherits the earth <3


----------



## WickerDeer (Aug 1, 2012)

DOGSOUP said:


> Hmm, I always wanted a daemon, like in His Dark Materials, but I never really figured out what need it actually serves. Maybe I thought it would help me know myself better while still being a part of me not others - because other people don't understand shit about me lol. But then the daemons also interacted with other daemons, that is kinda like some technological appliance except cute and furry. Maybe I should just get a Tamagotchi again. But they should improve it so they start giving moral advice or something.


Maybe they should make Tamagotchi therapists so that if you get frustrated with your progress or you start to regress, you can just starve them to death and get a new one.

But joking aside--it'd be kind of cool to have a Tamagotchi that gives moral advice, though I'd probably feel even worse with it (I really failed to be a good Tamagotchi caretaker, especially as a kid).


----------



## Squirt (Jun 2, 2017)

recycled_lube_oil said:


> So I woke up and have seen that this thread is growing. Just a few points, not who you were replying to.
> 
> 
> 
> ...


When I speak with people that actually understand how programming works and the functionality of programming languages, they all say it is basically smoke and mirrors to think there is some emergent property approaching a worthwhile consciousness. Like, it takes human imagination and self-deception to be so easily fooled, like going to magic shows and believing the rabbit is really appearing out of the hat from nowhere. A fun trick, but nothing to take seriously in terms of intelligence. How other humans manipulate masses through those tools is another story, like you said. I agree those tools will likely be implemented sometime in the future, as it is already happening in a crude form. Where there is money there is a way.

One reason I think Derren Brown is neat. He likes to show how easy it can be to manipulate people with the goal of spreading awareness of it.



recycled_lube_oil said:


> *God creates dinosaurs, God destroys dinosaurs. God creates Man, man destroys God. Man creates AI. AI destroys man*


Funny story. And a bit of real talk. I saw Jurassic Park when I was nine years old, and it inspired me to want to create a new life form using AI. I had this idea to create a "mechanical brain" and wondered if it would be possible to create it. Our family just got our first computer (Dell with Windows 95) and I was fascinated, trying to figure out how to use MS-DOS and be a "computer wizard", lol. I held on to that belief for approximately five or six years before I started developing a better grasp of reality about the capabilities of computer technology, as well as forming healthier relationships with others. Being raised on sci-fi movies early on and later becoming socially isolated had a strong effect on the fantasy thinking (it got pretty extreme... at one point I didn't exchange more than a few words with another person for almost four months, living in a hotel room with nothing but imagination to keep me entertained).

So, that is a personal experience influencing my opinions on the matter. It's not like I've been immune to loneliness or social difficulty and the kinds of kookiness that inspires. That's kind of why this discussion interests me and I advocate to help people learn how to form bonds with people and better their lives in a practical way... because that really is the only thing that brought me out of a dark place psychologically.


----------



## recycled_lube_oil (Sep 30, 2021)

Squirt said:


> When I speak with people that actually understand how programming works and the functionality of programming languages, they all say it is basically smoke and mirrors to think there is some emergent property approaching a worthwhile consciousness. Like, it takes human imagination and self-deception to be so easily fooled, like going to magic shows and believing the rabbit is really appearing out of the hat from nowhere. A fun trick, but nothing to take seriously in terms of intelligence. How other humans manipulate masses through those tools is another story, like you said. I agree those tools will likely be implemented sometime in the future, as it is already happening in a crude form. Where there is money there is a way.


Well, you are correct; it is basically just algorithms that give an appearance of AI. I don't yet know too much about Machine Learning (I'm focusing on web dev, Java, and Python with Pandas and OpenCV at the minute), but even Machine Learning is just algorithms ultimately.
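To illustrate the "just algorithms" point: even a classic perceptron, one of the oldest machine-learning tricks, is literally a loop applying a single update rule to two weights. A toy sketch in plain Python (nothing from any real library, all names made up):

```python
# Toy perceptron learning a logical AND gate: the "learning" here
# is nothing but repeatedly nudging two weights and a bias.
def train_and_gate(epochs=20, lr=0.1):
    # training data: inputs and the AND target
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = target - pred
            # the entire "algorithm": shift weights toward the error
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
```

After training, `predict(w, b, 1, 1)` behaves like a logical AND: it fires only when both inputs are 1. No smoke, no mirrors, just an update rule in a loop.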

As far as money making, well, yeah. IT mainly exists to save money and/or create money. Even with the people who donate money to more charitable/economic causes, I personally believe it is more a way of buying social value. If you have billions (cough, cough, Elon Mush), what does siphoning off a few million so that people think you are the saviour of mankind mean in the long run, financially?



> One reason I think Derren Brown is neat. He likes to show how easy it can be to manipulate people with the goal of spreading awareness of it.


My favourite Derren Brown was when he manipulated the advertising company into creating an image of a bear or something.



> Funny story. And a bit of real talk. I saw Jurassic Park when I was nine years old, and it inspired me to want to create a new life form using AI. I had this idea to create a "mechanical brain" and wondered if it would be possible to create it. Our family just got our first computer (Dell with Windows 95) and I was fascinated, trying to figure out how to use MS-DOS and be a "computer wizard", lol. I held on to that belief for approximately five or six years before I started developing a better grasp of reality about the capabilities of computer technology, as well as forming healthier relationships with others. Being raised on sci-fi movies early on and later becoming socially isolated had a strong effect on the fantasy thinking (it got pretty extreme... at one point I didn't exchange more than a few words with another person for almost four months, living in a hotel room with nothing but imagination to keep me entertained).
> 
> So, that is a personal experience influencing my opinions on the matter. It's not like I've been immune to loneliness or social difficulty and the kinds of kookiness that inspires. That's kind of why this discussion interests me and I advocate to help people learn how to form bonds with people and better their lives in a practical way... because that really is the only thing that brought me out of a dark place psychologically.


Kudos if you recognised the quote that I altered.

Lol, when I was about 11 - 12, I had just discovered Electronics at school. So off I went with my pocket money to buy a solderless breadboard and tons of components. Sure, I never created Robocop like I wanted to, but it was educational and yeah, I realised that Sci-Fi was not real life. That said, I have always had an interest in Electronics and Programming, so with the way things are going now, my interest is perking up (FEDOR, Loyal Wingman, the Japanese AI submarine fleet, etc). Plus, hobby wise, things are a lot more accessible now (ROS, etc). Funnily enough, I have ended up working in IT and am looking at moving into the company Dev Team (hence doing Open University part time as well as work).

But that said, I do not cast aside my social life; that is still important. As far as the current trend of people being more interested in the news feeds on their phones and multiplayer gaming goes... I think some people are at a disadvantage. It's easy these days for people to get into these dark places.


----------



## Squirt (Jun 2, 2017)

recycled_lube_oil said:


> Well, you are correct, it is basically just algorithms that give an appearance of AI. I don't yet know too much about Machine Learning (focusing on web dev, Java and Python Pandas and OpenCV) at the minute but even Machine Learning is just algorithms ultimately.
> 
> As far as money making, well yeah. IT mainly exists to either save money and/or create money. Even the people who donate money for more charitable/economic causes I personally believe this is more of a way to save social value. If you have billions (cough, cough, Elon Mush), what does siphoning off a few million so that people think you are the saviour of mankind mean in the long run regarding finances?


Another question is how increased reliance on AI technologies will shift liability laws. We're already experiencing that with autonomous vehicles.



recycled_lube_oil said:


> My favourite Derren Brown was when he manipulated the advertising company into creating an image of a bear or something.


Also getting someone to pick out a specific toy from a gigantic toy shop, lol.




recycled_lube_oil said:


> Kudos, if you recognised the quote that I altered.
> 
> Lol, when I was about 11 - 12, I just discovered Electronics at school. So off I was with my pocket money to buy a solderless breadboard and tons of components. Sure I never created Robocop like I wanted to but it was educational and yeah I realised that Sci-Fi was not real life. That said, I have always had an interest in Electronics and Programming, so with the way things are getting now, well my interest is perking (FODOR, Loyal Wingman, Japanese AI Submarine fleet, etc). Plus hobby wise, things are a lot more accessible, ROS, etc. Funnily enough I have ended up working in IT and am looking at moving into the company Dev Team (hence doing Open University part time as well as work).
> 
> But that said, I do not cast aside my social life, that is still important. But as far as the current trend with people being more interested in the news feeds on their phone and multiplayer gaming.... I think some people are at a disadvantage. Its easy these days for people to get into these dark places.


Neat. I took some programming classes in high school and in college, and decided it ultimately wasn't for me because I got so unhealthy about it, slaving away 16 hours in front of a screen. It is far too tedious for me. I still went into genetics and worked as a microbiologist (for some reason the tedious aspects of biology don't bother me as much). The cross-disciplinary potential with metagenomics is still something that excites me to some degree.

Okay, I'll be away for a while. Ironically, I've been waiting for my husband to finish up solving an issue with his code before we can get on the road for a trip...


----------



## eeo (Aug 25, 2020)

I can't wait. Seriously, I hope I live long enough for the really good models to roll out.

This short film is a perfect example of the kind of AI I'd definitely want for companionship. It's great that some people get to have actual human beings as companions like that, but there are an awful lot of people who really don't get that chance. So it would be nice if they'd at least have a choice in the matter.


----------



## DOGSOUP (Jan 29, 2016)

WickerDeer said:


> Maybe they should make Tamagotchi therapists so that if you get frustrated with your progress or you start to regress, you can just starve them to death and get a new one.
> 
> But joking aside--that'd be kind of cool to have a Tamogatchi that gives moral advice, though I'd probably feel even worse with it (I really failed to be a good Tamogatchi caretaker, especially as a kid).


Yeahhhh, I wasn't the greatest either tbh. And I might want to write that advice/commentary myself lol, unlike people who genuinely seek that external input (?), even if it somewhat reflects them.






I planned on writing a choose-your-own-adventure/fairytale type of thing about adopting and fixing a robot (and yes, the plan was to give the player an option to decommission it/them if they wanted to, with consequences obviously, because who doesn't love a little revenge and patricide). But then I decided I really did not want to touch this topic with a ten-foot pole, even with the cover of fiction... a shame really, because I kinda loved writing the potential background stories for these bots and how they would shape player interaction.


----------



## WickerDeer (Aug 1, 2012)

DOGSOUP said:


> Yeahhhh I wasn't the greatest either tbh. And I might want to write that advice/commentary myself lol. Unlike people who genuinely seek that external input (?), even if it somewhat reflects them.
> 
> 
> ...


That's so cool--were you going to write a choose your own adventure thing that was going to be digital like an app or game, or were you going to write it in book form? I've been curious about making those and talked to another member here who's also into them.

But yeah--I'm not sure about writing about it. I always end up crying at the AI movies and fiction--I just can't handle it. Most of them are so sad.


----------



## DOGSOUP (Jan 29, 2016)

WickerDeer said:


> That's so cool--were you going to write a choose your own adventure thing that was going to be digital like an app or game, or were you going to write it in book form? I've been curious about making those and talked to another member here who's also into them.
> 
> But yeah--I'm not sure about writing about it. I always end up crying at the AI movies and fiction--I just can't handle it. Most of them are so sad.


Oh who is the other member...? Or do they keep it a secret in which case. I can respect that (but either of you/both of you feel free to message me about it!).

But probably digital. It's more accessible nowadays, gives you an easier time with dialogue choices, and enables random variables (which could technically be possible in book form? Throw a dice. But that's not very reader-friendly IMO). Personally I found plotting different paths and choices very... therapeutic? Getting to imagine multiple interactions all starting from the same point... and I just liked creating a "thing" for someone else to explore, trying to guess how they would go about exploring it and doing my best to provide those options. A very different form of writing for sure, worth trying!
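For anyone curious what the plumbing looks like: a branching story is basically just nodes, choices, and the odd random roll. A toy Python sketch (all node names and story text made up here, not from my actual project):

```python
import random

# Toy branching-story engine: each node holds text, its choices
# (label -> next node key), and optionally a random flavour line.
STORY = {
    "start": {
        "text": "You find a broken robot in a scrapyard.",
        "choices": {"repair it": "repair", "walk away": "end_walk"},
    },
    "repair": {
        "text": "The robot boots. Its memory core flickers.",
        # a random variable: the recovered backstory differs per playthrough
        "random": ["It remembers a family.", "It remembers a war."],
        "choices": {"keep it": "end_keep", "decommission it": "end_gone"},
    },
    "end_walk": {"text": "You leave. The yard stays quiet.", "choices": {}},
    "end_keep": {"text": "You adopt the robot.", "choices": {}},
    "end_gone": {"text": "You power it down for good.", "choices": {}},
}

def step(node_key, pick=None, rng=random):
    """Return the node's text (plus any random flavour), the available
    choice labels, and the next node key if a choice was picked."""
    node = STORY[node_key]
    text = node["text"]
    if "random" in node:
        text += " " + rng.choice(node["random"])
    choices = node["choices"]
    next_key = choices.get(pick) if pick else None
    return text, list(choices), next_key
```

Calling `step("start")` gives you the intro text and the choice labels; feeding a label back as `pick` tells you where the story goes next. A real project would layer state flags and consequences on top, but this is the whole skeleton.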

It is too heavy of a topic for me too :/ maybe one day I will go back to it.


----------



## WickerDeer (Aug 1, 2012)

DOGSOUP said:


> Oh who is the other member...? Or do they keep it a secret in which case. I can respect that (but either of you/both of you feel free to message me about it!).
> 
> But probably digital. More accessible nowadays, easier time with dialogue choices and enables random variables (which technically could be possible in book form? Throw a dice. But not that reader friendly IMO). Personally I found plotting different paths and choices very.. therapeutic? Getting to imagine multiple interactions all starting from the same point.. and I just liked creating a "thing" for someone else to explore - and trying to guess how they would go about exploring it and do my best to provide those options... very different form of writing for sure, worth trying!
> 
> It is too heavy of a topic for me too :/ maybe one day I will go back to it.


I don't think it's a secret! I think from being a mod, I get kind of worried about keeping secrets though, but @Celtsincloset and I talked about it in the What Are You Reading thread--about visual novels which are like multiple choice stories with art. We'd both read the same one.

I was thinking they could also be used for instruction--like CPR instruction, or even for socially isolated people to learn social skills. So while I enjoy playing them on my phone (haven't found any good ones lately though), I also think they could be a fun way to learn non-fiction stuff too, or to study or reinforce knowledge... since you do have multiple choice that can be used for "correct" actions (like the right number of chest compressions for CPR, etc.)
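The "correct action" idea is really just a story node where one choice is flagged right and the narrative reacts accordingly. A minimal, made-up sketch (the 30:2 ratio is the commonly taught compression-to-breath guideline, but this is an illustration, not training material):

```python
# Toy instructional node for a visual-novel-style lesson:
# one option is marked correct, and feedback depends on the pick.
LESSON = {
    "question": "How many chest compressions per cycle before giving breaths?",
    "options": {"15": False, "30": True, "50": False},
}

def answer(option, lesson=LESSON):
    """Return feedback text for a chosen option."""
    if option not in lesson["options"]:
        raise ValueError(f"unknown option: {option}")
    if lesson["options"][option]:
        return "Correct: 30 compressions, then 2 rescue breaths."
    return "Not quite. Try again."
```

In a full story you'd branch the plot on the wrong answers too (the patient's outcome changes), which is what makes it stick better than a flat quiz.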

Did you use any particular software? That's really cool--I haven't ever written one, but I'd wanted to. It's neat to hear that it was therapeutic or otherwise beneficial for you. I love that about writing--when I've written other stories, it has helped me learn more about myself as well.


----------



## WickerDeer (Aug 1, 2012)

@DOGSOUP Also--do you have any suggestions for good books in that media/genre? 

And I recommend talking to @Celtsincloset as he has been very kind and he is into the topic, if you ever get to it. He has also been very supportive of my quitting drinking, which I really appreciate and admire and I think is a good testament to his character. (I hope that's okay to say Celtsincloset, but I really do appreciate it)


----------



## DOGSOUP (Jan 29, 2016)

WickerDeer said:


> @DOGSOUP Also--do you have any suggestions for good books in that media/genre?


I can dig up a whole list and send it to you honestly... do you have any preferences for themes or setting? Or something you absolutely DON'T want to read about.


----------



## ButIHaveNoFear (Sep 6, 2017)

eeo said:


> I can't wait. Seriously, I hope I live long enough for the really good models to roll out.
> 
> I think I've posted this in some other thread already, but this short film is a perfect example of the kind of AI I'd definitely want for companionship. It's great that some people get to have actual human beings as companions like that, but there are an awful lot of people who really don't get that chance. So it would be nice if they'd at least have a choice in the matter.


This was a cool film! 

I think I would take my robot on a bicycle built for two, as a nod to HAL-9000. I would also teach my robot how to play instruments, and we could jam together.


----------



## WickerDeer (Aug 1, 2012)

DOGSOUP said:


> I can dig up a whole list and send it to you honestly... do you have any preferences for themes or setting? Or something you absolutely DON'T want to read about.


I don't like graphic horror, I don't think--and I don't like super extremely sad things that make you cry for a week straight. 

That would be really cool though! Thanks!

I tend to read historical fiction or magical realism the most when it comes to novels--I also can enjoy some paranormal romance and romances mixed with the other genres I described (or paranormal stuff about witches or mythological creatures). I also like biology and nature, though I think that's less common in fiction.

Thank you!


----------



## Kazuma Ikezawa (Oct 21, 2011)

Scoobyscoob said:


> I guess that makes sense, hahah. I don't think someone can have a real relationship with a non-living being. In psychology that's called personification and if it's severe enough to affect your life, like eschewing a human relationship and trying to form a relationship with a robot would be, then that would be considered a mental illness.


That may be true, but here is another perspective: diagnosis of a mental illness is somewhat subjective, partly because what counts as normal is subjective. If someone can function normally while being in a relationship with an AI robot, one could argue that this person is not mentally ill, even if he has no human relationships. This type of relationship could work for some people but not for others.

Lastly, maybe being alive is subjective. Maybe AI can be just as alive as humans are.


----------



## Scoobyscoob (Sep 4, 2016)

Kazuma Ikezawa said:


> That may be true, but here is another perspective. Diagnosis of a mental illness is kind of subjective, partly because what is normal is kind of subjective. If one can function normally while being in a relationship with an AI robot, one could argue that this person is not mentally ill, even if he has no human relationships. For some people, this type of relationship could work but for other people it might not work.
> 
> Lastly, maybe being alive is subjective. Maybe AI can be just as alive as humans are.


Diagnosing mental illness isn't subjective in clear-cut cases, which are the majority of cases. Discretion is only required with someone who may or may not be mentally ill. What is alive or not isn't subjective either. There may be debate about the definition of life, but what is considered living isn't subjective; it's already defined. A bot or robot using AI is not alive. Someone could have a romantic relationship with a robot with AI, but that is personifying what is essentially a sex toy that offers a facsimile of real companionship. If that "romantic relationship" replaced a person's desire to have a romantic relationship with an actual person, then that person would have a mental illness.

Talking with an AI bot or robot isn't an illness. It's the desire to have a "romantic" relationship with an AI, i.e. intimate sex and/or a relationship involving feelings of love, that would be a mental illness.


----------



## Hexigoon (Mar 12, 2018)

Scoobyscoob said:


> Diagnosing mental illness isn't subjective in clear-cut cases, which is the majority of cases. The only time discretion is required is with someone who may or may not be mentally ill. What is alive or not isn't subjective either. There may be debate on the definition of life, but what is considered living or not isn't subjective, its already defined. A bot or robot using AI, is not alive. Someone could have a romantic relationship with a robot with AI, but that's personifying what is essentially a sex toy that can offer a facsimile of real companionship. If that "romantic relationship" replaced a person's desire to have a romantic relationship with an actual person, then that person would have a mental illness.
> 
> Talking with an AI bot or robot isn't an illness. It's the desire to have a "romantic" relationship, ie: intimate sex and/or a relationship involving feelings involving love, with an AI that would be a mental illness.


For something to be a mental illness it has to be dysfunctional, i.e. cause significant distress or impairment of personal functioning. But preferring a romantic relationship with a machine rather than a human doesn't in itself say anything about causing distress or hindering one's ability to function in day-to-day life, any more than choosing to never date does.


----------



## Scoobyscoob (Sep 4, 2016)

Hexigoon said:


> For something to be a mental illness it has to be dysfunctional - that it causes significant distress or impairment of personal functioning. But preferring to have a romantic relationship with a machine rather than a human doesn't really say anything about causing distress or hindering one's ability to function in day to day life, anymore than one choosing to never date.


You think wanting to have a sexual relationship with a robot isn't dysfunctional? Fulfilling companionship with an AI, yeah, sure, but wanting to have sex with one and replace a real person is another matter. The person may not be harming others, but they would actively be engaging in self-harm, which is very much a criterion when diagnosing a mental illness.


----------



## Hexigoon (Mar 12, 2018)

Scoobyscoob said:


> You think wanting to have a sexual relationship with a robot isn't dysfunctional? Fulfilling companionship with an AI, yeah sure, but wanting to have sex with one and replace a real person. The person may not be harming others but would actively be engaging in self-harm. Which is very much a criteria when diagnosing a mental illness.


Yeah, I don't have any reason to think that it's dysfunctional. You could say such relationships aren't "normal" at the present time, but that's not the same thing as being a mental illness. I think if such a relationship is fulfilling enough for a person, then they can decide that for themselves sanely enough.

What is meant by self-harm here? It's not self-harm like having a compulsion to cut or starve yourself.


----------



## Scoobyscoob (Sep 4, 2016)

Hexigoon said:


> Yeah, I don't have any reason to think that it's dysfunctional. You could say such relationships aren't "normal" at the present time, but that's not the same thing as being mentally ill. I think if such a relationship is fulfilling enough for a person, then they can sanely decide that for themselves.
> 
> What is meant by self-harm here? It's not self-harm like having a compulsion to cut or starve yourself.


Well, I'm thinking of wanting a relationship (sexual, romantic or otherwise) with a robot rather than a person as being more of an addiction than, say, cutting yourself or wanting to harm others or starve yourself, etc. An addiction to a fantasy person. Addictions are considered mental illness, so I'm not trying to be coy with my definitions here; I am using the term correctly.

I believe I addressed that in the above paragraph.


----------



## Hexigoon (Mar 12, 2018)

Scoobyscoob said:


> Well, I'm thinking of wanting a relationship (sexual, romantic or otherwise) with a robot rather than a person as being more of an addiction than, say, cutting yourself or wanting to harm others or starve yourself, etc. An addiction to a fantasy person. Addictions are considered mental illness, so I'm not trying to be coy with my definitions here; I am using the term correctly.
> 
> I believe I addressed that in the above paragraph.


One could argue, though, that love is a form of addiction. But we wouldn't say loving another person like crazy is mental illness (well, I wouldn't anyway, not necessarily).
And maybe what they'd love about a robot is a fiction, but even people who love each other are arguably always in love with a sorta idealized image they've formed of the other person.


----------



## Scoobyscoob (Sep 4, 2016)

Hexigoon said:


> One could argue, though, that love is a form of addiction. But we wouldn't say loving another person like crazy is mental illness (well, I wouldn't anyway, not necessarily).
> And maybe what they'd love about a robot is a fiction, but even people who love each other are arguably always in love with a sorta idealized image they've formed of the other person.


I think infatuation and limerence would qualify as an addictive stage of love, but familiarity and having bonded with a person does not. That's really more of an opinion of mine than anything based on fact, though.

Well, maybe. I said I don't get it, but I don't really care if someone wants to have a "romantic" relationship with an AI bot or robot. I think talking to an AI for companionship is fine, but wanting to fall in love with a robot? Beyond companionship it just seems like falling in love with a sex toy, and wanting the toy over a person you love is a strange desire to me. I know the notion has been normalized since the conception of robots as possible sexual servants, but to me, logically, that's not sound of mind.

Aside from familial/biological love, I'd say my wife is the person whom I've loved longer than anyone else I can think of. My kids are of course a no duh statement. Hm, I'm not sure if I idealize someone like my wife, but if I do it only makes me love her even more, and I know she's pretty much the same way. So it's functional and serves a purpose. 🙂👍


----------






## BenevolentBitterBleeding (Mar 16, 2015)

Squirt said:


> lol, I hope you’re not serious. I was thinking of your time, energy, and focus as well as mine. Unless you're having fun. Please, have fun. I'd like us to have fun.


Serious in that AI would never tire of correcting you. 😗



> Because it is a tool? You can call it whatever you like, but that is what it is.


You can call it a tool, but that doesn't stop it from having possible benefits to a person as a 'relationship' partner.



> I'm not sure why carrying groceries is so much more impressive than turning a screw.


I'm not sure why turning a screw negates my counter to your original plant proposition. And if you've already conceded that AI provides more beneficial 'correspondence' than a plant, I'm not sure why you're still on about screwdrivers.



> Much of your original post was based on an idea for technology that doesn't exist and probably never will. So, I'm looking at the conceptual possibilities and purpose for developing such a technology, what that implies about the human condition, and more feasible ways to solve the underlying needs that bring about such fantasies.


Cool. But that's a completely different subject/perspective from where my post came from, as I already mentioned to you previously that for the sake of this discussion/thread I was treating the AI as being in a 'working' state. Sigh.



> I’ve already explained the reason I brought up the plant: there are simpler and easier ways to accomplish the apparent benefits that don’t require replacing human company with artificial intelligence.


Sure, but so what? Why should I care about 'easy' as it pertains here, or to my post, which you originally quoted with that suggestion?

The thread asks whether or not someone would date an AI. And my post was about how/why an AI partner would be desirable. Again, you brought up the plant, and I've already stated why AI is preferable to that or any other x, y, z 'simpler' or 'easier' design.



> _For an example, _if you’re just looking for something to talk at, that won’t argue with you or say something you don’t like, a plant can do that just as well as AI, and is much less expensive and also better for the environment. That’s pretty much it for why I mentioned a plant.


Sure, but so what? Why should I care about less expensive methods?

And yes, actually, I _expect_ and hope that the AI is able to argue with me. Then it'd actually be able to teach and/or out-'debate' me honestly, as opposed to some 'real', 'complex' humans. *hinthintnudgenudge



> Depends on what you want to accomplish and if AI is an adequate tool for it. Specifically for romantic relationships....


Okay, so I take that as an agreement to my point.



> The reason it appears to you that goal posts are being moved is because I’m considering many angles of the topic rather than "hitting goals". I’m approaching you with an open and somewhat casual discussion, not a formal debate. I guess you can try to argue with your future AI, but it would confirm your beliefs rather than challenge them, as I understand.


Lol, what utter nonsense.



> Maybe I should get into a relationship with my water kettle. Or the water pump on the well-house. That might be pretty hot.


Wow, much impressed, what amazing use of consider many angler and not movement gol pots. 

You stated:_ "Like, I need water. That doesn't require a human."_

Yes, you/'we' do need 'humans' for your/'our' water. re: our needs re: labour re: something an AI can replicate.

Maybe you ought to try considering a couple more angles and then you'd be able to figure out how you/'we' actually get clean potable water. Or let me guess, you're a chemist, microbiologist, engineer, and blacksmith _too_? lol this is tiring.



> I'm assuming not being okay with the faultiness of humans is part of that drive towards developing a superior AI to interact with. Since that is created by humans, it is necessarily also faulty, especially if emulating humans.


It may or may not be one of the drives. And 'faultiness' might not be the right word in all cases. Is it a human's fault that they're incapable of processing certain information faster than an AI can? Is it humans' fault that, for the most part, they're currently naturally limited to the visible spectrum?

It's a fallacy to think that something created by humans will be faulty because humans may inherently be faulty themselves.



> This is _impersonally _evaluating the inconsistency of expectations (that is why I mentioned cognitive dissonance).


Uh what? Lol. Keep sidestepping.

You:_ "So, you're okay with the faulty technology humans build, but not humans themselves?  Some cognitive dissonance going on there."_

As I pointed out, I never stated that I'm not okay with 'faulty' humans as a partner (re: humans in general). And to further point out now: yes, I am okay with the faulty technology humans build, in that I accept it whenever I encounter it (because it's prevalent), but I'd also replace it hastily if something better were available and it were within my means.



> These are social judgements. What does AI solve here? So far, AI has become efficient for amplifying discrimination practices (real AI, not fantasy AI).


Uh what? Lol. The point was to show that there are people all over that are not okay with humans themselves due to their 'faulty programming'. It has nothing to do with what an AI can solve. It was only mentioned as an addendum to one of your goal posts.



> Are you referring to this?
> 
> "At this point it seems pretty normal, as well as inevitable. Like, if you think about the psychology behind online social personas/interactions, people already are conditioned to or are looking for reinforcement of their thoughts/beliefs/feelings/whatever from those they interact with _or where_ they interact - whether that be online or irl. Sure you can argue that we are at least seeking these 'comforts' from living, breathing creatures, but does it matter much when we're just receiving a kiss ass/'fronted' versions of a person?"
> 
> If I were to paraphrase: "People can be shallow and one-sided anyways might as well go with it."


It was more about criticizing that we are more and more like 'bots' ourselves, and that it makes sense why we'd be okay with actual bots that can mimic us; because we already accept human versions of them, at least superficially as it pertains to our 'fronted' social lives. If you take into account what I wrote after that, the paragraphs go hand in hand, in that I then gave a counter as to why actual bots might be preferred anyway. Hence the _'I don't know if I truly agree...'_



> The conditioning you describe is detrimental and should be mitigated, not encouraged... right?


Well, I guess that depends where one's priorities lie.



> I've already answered those questions.


No you haven't. But I think I'm okay with not having to read a bunch more goal posts and sidestepping.



> You asked the difference between a real and fake relationship (and asked again just above). That is one aspect of the difference. Why are you avoiding it?


No, I never asked what the difference was. You were the one who initially brought it up.



> You: _"I’m confused because you complain that people aren’t “real” enough, yet “realness” doesn’t seem to be a priority for relationships if you’re into AI bots, anyway"_
> 
> Me: _"Was I complaining that people aren't real enough, or saying that AI can be just as 'real' and more dependable? "_
> 
> ...




Lol this is much tiring.



> Also, just to be clear, I'm not claiming talking to a plant is having a real relationship with it.


Yet for some reason you post quoted my initial post regarding an AI for that purpose as a suggestion. Big brain move there. 😇



> Cars are popular, too:
> 
> Man has intimate relationship with his car on 'My Strange Addiction' (today.com)
> 
> Is a car a step up from a plant? We can evaluate getting cozy with cars instead of plants if you like.


Goal posts, strawman. What a joke.



> Well, yeah. Just having another toy is a thing, and as we've seen humans have a propensity for developing fetishes for inanimate objects. However, your first statements implied that it would be a _better substitute_ for human interaction because humans are shitty and whatever. For you or anyone else. Is that not what you meant? From your first post:
> 
> "I think people might say despite that sort of cynical view, 'real' is still better than robot,* but I don't know if I truly agree*; because _real_ robots(not those fake front humans) are far more likely to be a)'loyal' and b) not 'breakdown' in x respective number of ways you can think of(physically, emotionally, something else...) "
> 
> That is where I say it seems like a need for some kind of emotional security. Why else would you be concerned about loyalty or breakdowns or "non-fakeness"? So why don't we look at the emotional security problem?


No worries, now that you've reread my original and subsequent posts including this one where I think I clarified above about the two paragraphs going hand in hand, I'm glad you have a better understanding.



> I’ve already mentioned my thoughts about pets. I do think there can also be weirdness between a parent and child sometimes. Generally, the goal of the parent is to ensure the child is able to become independent as they age. They'll die if they aren't "under control" to some degree, and their dependency is temporary. Hopefully, the dependent state is not for the _sole pleasure of the parent_. Some parents do have children because they think of them as little dolls or want someone who will be forced to love them. I wouldn't say that is a good way to relate to a child. There are many instances of questionable relationships that occur where one party controls the other for their own benefit, which is unfortunate, and not to be advocated as a healthy form of relating to others, imo.


Jeez, that's a pretty lengthy way of conceding the point that there are in fact relationships where it is ideal for one side to be in total control. Whatever, I'll take it, thanks.



> Therefore, please answer: Do you think it is "enriching" or "better" to have a companion simulating a human, which is under your control so as to be "reliable"? Or do you anticipate AI developers would build in a rebellious streak to avoid ethical questions... yet, wouldn't that defeat the purpose of "reliability"?
> 
> Disclaimer: This is not a goal-post shift, but a potential use of AI human substitution that I believe is detrimental that I think is interesting to discuss with you. It's _an issue._




1. Yes I do think it can be enriching or better to have a companion which is under my control, simulating a human. But for 'as to be reliable'? As you mentioned, reliable or dependable can mean a lot of things. Like, I rely on the hope that the AI won't kill me in my sleep.

2. It's not an either or to me. I do think later models will have 'variety' built in, but! I think if 'we' are smart about it, we'll still be building safeguards in order to keep them 'reliable'. The ethical question probably won't matter until the AI becomes self aware and is its own fully functioning independent entity re: 'alive' at which point I'd personally like to think the AI will still be 'reliable' to its 'owner/partner/friend(s)'.



> Sometimes. Lots of details there. There is evidence some of these technologies and how they are used are making people's lives worse in various ways. I believe this is true in my own experience, as well. That is why it is valuable to make these considerations of new technology carefully.


I don't disagree. But again, for the sake of the thread, in the view that we have 'working' AI, they are/would be - imo regarding my posts - a benefit as a partner.



> I think we covered this pretty well already. Using _for what_ is the question. Believing it can adequately substitute (healthy) human relationships in a wholistic sense is basically delusional, and additionally I'd rather put energy and resources into "developing" better humans to address unhealthy relationships rather than AI substitutions that will never compare. Like, Wickerdeer touched on potential uses of AI to facilitate mental health that had nothing to do with emotionally bonding with the AI.


Yes we've already gone over the point that an AI may not offer everything a human can, but that it is still able to provide - and in some cases exceed - benefits a human has to offer as a partner.



> If you're just talking about having another toy to play with, then whatever. Like I said, give that lady her dumb robot dog to tuck under her arm and carry around like a doll, if you must. Hexigoon's post was about romantic relationships, though, which seems a bit more serious intention of an application.


Well again, I've already mentioned in my initial post that wanting an AI robot doesn't necessarily mean that it's because someone wants to use it only sexually.



> Now if only you could catch up to mine, snarkypants.


Lol, given the way you've been unable to concede or counter points, it's no wonder your original shot-from-the-hip recommendation was to find a plant to talk to.


----------



## Mark R (Dec 23, 2015)

SilentScream said:


> TFW I see incels swooning over the concept of an AI sex slave when AI by definition is intelligent and would probably also want to self preserve and avoid incels like the plague.
> 
> We're going to go through the same sequence of events with AI as we have done with every being we've considered subhuman. And the concept of "buying love" from AI slaves is already gross and extremely unethical to me. 100% opposed.


Ethically, AI should provide a kind of personality that would challenge, heal, and improve their partner.

Does the buyer want to be challenged and improved, or do they want a simple sex slave? Most people want their partners to respond realistically and authentically, but some unhealthy people may not.

Would the AI programmers be open to legal litigation if the buyer ends up crazier and does something harmful as a result of their interaction with their product? 

Would these bots be designed with advertising to addict their users to McDonald's hamburgers or become deeper slaves of corporate overlords?


----------



## recycled_lube_oil (Sep 30, 2021)

Mark R said:


> Ethically, AI should provide a kind of personality that would challenge, heal, and improve their partner.


But would that sell? If you were head of some massive AI corporation, would you pump billions, maybe trillions, into something that would not sell due to it being "ethical"? Do product designers, manufacturers and sellers care about ethics, from what you have experienced in this world?



> Does the buyer want to be challenged and improved, or do they want a simple sex slave? Most people want their partners to respond realistically and authentically, but some unhealthy people may not.


Only market research will have the answer. Followed by trend analysis. Would be an interesting study I am sure, assuming people are honest about their intentions.



> Would the AI programmers be open to legal litigation if the buyer ends up crazier and does something harmful as a result of their interaction with their product?


I guess legal departments will be busy before these products are released. Will people even bother to read the terms and conditions? Do humans have a proven track record of not doing things they shouldn't with consumer products?



> Would these bots be designed with advertising to addict their users to McDonald's hamburgers or become deeper slaves of corporate overlords?


It depends. If they are free of charge, like the saying goes, "If you do not pay for the product, you are the product." These products are not going to fund R&D, production, marketing, distribution and support by merely existing. The funds will need to come from somewhere. It's like Facebook and YouTube: yes, they harvest data, but ultimately, like most products, they exist to make money.


----------



## Dalien (Jul 21, 2010)

AI doesn’t actually have feelings, imagination, thinking on its own, etc.—don’t care how it’s programmed—not human, not even close.

So some are giving up on humans—sad, very sad.


----------



## recycled_lube_oil (Sep 30, 2021)

Dalien said:


> So some are giving up on humans—sad, very sad.


I actually disagree here. I personally believe the major market will be people who are already more reclusive. However, in this day and age, yeah, that seems to be the norm at times. But I think that is the effect of technology mixed with lockdowns, not just AI per se.


----------



## Squirt (Jun 2, 2017)

@BenevolentBitterBleeding yeah, I’m not really into passive aggressive “debates” and it is clear you have no insight to offer. Gave you the benefit of the doubt, but I’m out.


----------



## BenevolentBitterBleeding (Mar 16, 2015)

Squirt said:


> yeah, I’m not really into passive aggressive “debates” and it is clear you have no insight to offer. Gave you the benefit of the doubt, but I’m out.


No worries. Later.


----------



## Hexigoon (Mar 12, 2018)

Speaking of AI lacking human qualities like imagination, it would seem to be acquiring one.





It blows my mind. As an artist, I'm surprised I'm not scared of this making me obsolete; instead, I see ways that kind of technology could empower human artists.


----------



## Squirt (Jun 2, 2017)

recycled_lube_oil said:


> But would that sell? If you were head of some massive AI corporation, would you pump billions, maybe trillions into something that would not sell, due to it being "ethical". Do product designers, manufacturers and sellers care about ethics, from what you have experienced in this world?
> 
> 
> 
> ...


These questions are more salient to me.

AI that becomes "better" at mimicking human behavior is going to amplify vulnerability to massive data harvesting practices for marketing and consumer advertising, as well as corporate and government surveillance monitoring, and even scams and fraud. The incredibly high cost of production and resource requirement for an advanced, human-like AI would restrict use either to the wealthiest individuals or require some kind of financial exploitation and consumer marketing (government backed or corporate, or jointly) to deploy on a large scale. Such an AI would be a height of opulence.

Take any current AI strategy for business and marketing, and then imagine what that would look like if they could advertise an AI that is directly and intimately "responsive to human needs" and difficult to tell apart from a real person, but programmable to execute specific tasks that would be perceived as trustworthy once AI of that nature is widespread (a computer cannot show a conflict of interest or hesitation if it "lies" or deceives on behalf of the programmer, keep in mind). What are the full implications, really? How much more convenient from a marketing standpoint would it be if a product doesn't facilitate a human need, but _becomes _the need? Is this really serving an individual, a community, a society?


----------



## Scoobyscoob (Sep 4, 2016)

Squirt said:


> These questions are more salient to me.
> 
> AI that becomes "better" at mimicking human behavior is going to amplify vulnerability to massive data harvesting practices for marketing and consumer advertising, as well as corporate and government surveillance monitoring, and even scams and fraud. The incredibly high cost of production and resource requirement for an advanced, human-like AI would restrict use either to the wealthiest individuals or require some kind of financial exploitation and consumer marketing (government backed or corporate, or jointly) to deploy on a large scale. Such an AI would be a height of opulence.
> 
> Take any current AI strategy for business and marketing, and then imagine what that would look like if they could advertise an AI that is directly and intimately "responsive to human needs" and difficult to tell apart from a real person, but *programmable to execute specific tasks that would be perceived as trustworthy once AI of that nature is widespread (a computer cannot show a conflict of interest or hesitation if it "lies" or deceives on behalf of the programmer, keep in mind). What are the full implications, really? How much more convenient from a marketing standpoint would it be if a product doesn't facilitate a human need, but becomes the need? Is this really serving an individual, a community, a society?*


AI already does this in the finance space and it's mostly used to stabilize market prices on the stock market and also monetarily benefit the people/companies that use them.


----------



## Kazuma Ikezawa (Oct 21, 2011)

Scoobyscoob said:


> Diagnosing mental illness isn't subjective in clear-cut cases, which is the majority of cases. The only time discretion is required is with someone who may or may not be mentally ill. What is alive or not isn't subjective either. There may be debate on the definition of life, but what is considered living or not isn't subjective, its already defined. A bot or robot using AI, is not alive. Someone could have a romantic relationship with a robot with AI, but that's personifying what is essentially a sex toy that can offer a facsimile of real companionship. If that "romantic relationship" replaced a person's desire to have a romantic relationship with an actual person, then that person would have a mental illness.
> 
> Talking with an AI bot or robot isn't an illness. It's the desire to have a "romantic" relationship, i.e. intimate sex and/or a relationship involving feelings of love, with an AI that would be a mental illness.


Really sorry for this late post. I've been busy the last two days. It also takes me a while to think through my thoughts before posting something.

What you said about mental illness not being subjective in clear-cut cases is true, but I've read some literature about how some people consider certain mental illnesses, personality disorders to be specific, as just a variation of personality. The example used was schizoid personality disorder. Following this idea, couldn't one consider, for some people in this example of having relations with an AI robot, what most say is a mental illness as just a different way of thinking? I would like to emphasize that this could be true for some cases, because everybody is different.

Also, mental illness is a spectrum in which certain cases are subjectively considered different enough that they meet the criteria for a mental illness. For example, I learned that the number of traits one must have in order to be considered a psychopath is different in Canada than it is in the United States. Homosexuality used to be considered a mental illness, but now it's considered normal. This all goes to show that mental illness is subjective not just in cases where one is unsure about the illness, but in the whole construction of mental illness. For some things, like anxiety and depression, everyone can see that it's a mental illness and that it's detrimental, but other behaviors or ways of thinking are more complex, and people have different opinions on them. Mental illness, which deals with things as complex as the mind and what is normal, is tentative. That's why the DSM, which is the standard for classifying mental disorders, often changes, with disorders sometimes being added, removed, or combined with other disorders. In the future, relationships with A.I robots might be considered normal. So for what we're talking about, having an A.I robot as a romantic partner, one could say that in some cases one's mind could be able to view a robot as a living thing.

Also, my argument relies on the possibility that A.I robots can become advanced enough to reach the level of being alive and having sentience. I think that one could argue that what we call life is no different than what we call non-life. The meaningfulness of human life could be the same as a highly advanced A.I robot. By this I mean the philosophical argument of what criteria has to be met in order to be considered sentient and living could be subjective. Maybe even we are just as alive as rocks are and we are all equal. And if there is debate about the definition of life, then doesn’t that necessitate that what is considered living or not is subjective?

It's also possible that for certain individuals, advanced robot A.I could offer real companionship. There might be a time in the future where robots become advanced enough to the point that romantic and sexual relationships with them are normal.


----------



## Squirt (Jun 2, 2017)

Scoobyscoob said:


> AI already does this in the finance space and it's mostly used to stabilize market prices on the stock market and also monetarily benefit the people/companies that use them.


Do you mean how automated trading like HFT “replaces” trading by people and comes at a high cost barrier? Or the implication of proprietary algorithms and AI deciding value for the stock market? I don’t have much knowledge of finance, but is that sort of what you’re getting at?


----------



## Scoobyscoob (Sep 4, 2016)

Kazuma Ikezawa said:


> Really sorry for this late post. I've been busy the last two days. It also takes me a while to think through my thoughts before posting something.
> 
> What you said about mental illness not being subjective in clear-cut cases is true, but I've read some literature about how some people consider certain mental illnesses, personality disorders to be specific, as just a variation of personality. The example used was schizoid personality disorder. Following this idea, couldn't one consider, for some people in this example of having relations with an AI robot, what most say is a mental illness as just a different way of thinking? I would like to emphasize that this could be true for some cases, because everybody is different.
> 
> Also, mental illness is a spectrum in which certain cases are subjectively considered different enough that they meet the criteria for a mental illness. For example, I learned that the number of traits one must have in order to be considered a psychopath is different in Canada than it is in the United States. Homosexuality used to be considered a mental illness, but now it's considered normal. This all goes to show that mental illness is subjective not just in cases where one is unsure about the illness, but in the whole construction of mental illness. For some things, like anxiety and depression, everyone can see that it's a mental illness and that it's detrimental, but other behaviors or ways of thinking are more complex, and people have different opinions on them. Mental illness, which deals with things as complex as the mind and what is normal, is tentative. That's why the DSM, which is the standard for classifying mental disorders, often changes, with disorders sometimes being added, removed, or combined with other disorders. In the future, relationships with A.I robots might be considered normal. So for what we're talking about, having an A.I robot as a romantic partner, one could say that in some cases one's mind could be able to view a robot as a living thing.
> 
> ...


Okay, so several things you've asserted in your second paragraph aren't accurate, or are at best misleading. I had to look up what schizoid PD is, and I can see an argument for it being a variation on personality, but I'm not a psychologist, and the website I was looking at, the Mayo Clinic, says someone with schizoid PD only needs to seek help if they develop depression due to the way that they are. Also, I made a specific claim about what would be considered mentally ill: wanting a sexual and loving relationship with a robot and replacing human companionship with it. Seeking companionship from a robot with AI would be normal if one were alone and lonely. There was similar controversy when realistic sex dolls were a topic, especially realistic sex dolls that resembled children, and people were making the same arguments as you and @DazzlingDexter have been making thus far.

Also, the majority of Canadian and American (US) mental health professionals use the DSM, so saying Canada uses different criteria to diagnose something like schizophrenia is factually untrue most of the time. Also, Canada is not a model for anything about mental health. In a lot of ways, the way Canada handles mental health is more cruel than how it's handled in the US. I have an INFP half-cousin who was basically tricked into moving to Canada and then tortured by some psychologists "for science". She's fine now and lives in France with her husband, and what happened to her is rare but not a one-off case. Anyway, I don't want you thinking there is some conspiracy going on here; it was just a different time then.

Well, true, there used to be psychological diagnoses that no longer exist today, like mania, female hysteria and homosexuality, but that's why I've only been referring to modern 20th-century psychology. To say that psychology is therefore subjective because it changes is a silly argument reaching for a conclusion that would be plainly wrong today, because gaining knowledge and improving our understanding of people and nature is how science works. Also, just to be clear, I've already said I don't care if someone wants to have a relationship with an AI bot or robot. I just think it's strange, and in some cases it would be considered a mental illness, not necessarily schizoid PD or schizophrenia, but something that probably should be addressed.

Also, the DSM doesn't change often. The latest version is the DSM-5, and I believe that manual is ten years old now. Also, I think you're too invested in relationships with A.I. robots, because wanting to be with a robot instead of another person will never be considered normal. 😂 You seem to like citing edge cases and then trying to apply them to everyone. Has fantasizing about robots been normalized? Yes, if you're into science fiction and anime, but most people are not into anime and science fiction, at least not this kind of topic. I've heard lengthy and sometimes heated debates about sci-fi futures where we'll all be clones of one another and reproduce asexually, or we'll all be hot killer robots like in Nier: Automata, but that's all really just fiction. Maybe an enticing future for some, but a horrifying and depressing future for most. 😂

Also, if someone wants to have a relationship with an A.I. robot, then that's on them. I just think it's weird. Trying to remove your most instinctive biological imperative, to reproduce, calls into question the very nature of someone's biology. Maybe people who want to love and have sex with A.I. robots are just the next evolution of human beings and I'm just a dinosaur of an archaic human talking here, but I have a feeling humans will still be more or less the same, even in, say, 150 years. People in the future will surely have a lot cooler toys and more fun ways to be entertained, will be exploring and colonizing space, and maybe some will prefer the company of an AI robot, but I'm fairly certain people will be more or less the same as people are today.

Maybe philosophically the definition of life might need to be debated again some time in the future, but biologically, i.e. scientifically, there really isn't anything to debate, other than a few edge cases that don't quite fit the definition of life. Also, good conversation, I enjoyed writing this reply. 🙂


----------



## Scoobyscoob (Sep 4, 2016)

Squirt said:


> Do you mean how automated trading like HFT “replaces” trading by people and comes at a high cost barrier? Or the implication of proprietary algorithms and AI deciding value for the stock market? I don’t have much knowledge of finance, but is that sort of what you’re getting at?


HFT mostly, but I've heard from people who develop trading algorithms that they also use AI. HFT has been using AI for a while now. Also, HFT doesn't replace all traders, only people who trade the news, and even then not completely. It's just not as profitable as it used to be. So yes, that was what I was getting at.


----------



## recycled_lube_oil (Sep 30, 2021)

Scoobyscoob said:


> Okay, so several things you've asserted in your second paragraph aren't accurate, or at best are misleading. I had to look up what schizoid PD is, and I can see an argument for it being a variation on personality, but I'm not a psychologist, and the website I was looking at, the Mayo Clinic, says someone with schizoid PD only needs to seek help if they develop depression because of the way they are. Also, I made a specific claim: wanting a sexual and loving relationship with a robot and replacing human companionship with it would be considered mental illness, but seeking companionship from a robot with AI would be normal if one is alone and lonely. There was similar controversy when realistic sex dolls were a topic, especially realistic sex dolls that resembled children, and people were making the same arguments that you and @DazzlingDexter have been making thus far.
> 
> Also, the majority of Canadian and American (US) mental health professionals use the DSM, so saying Canada uses different criteria to diagnose something like schizophrenia is factually untrue most of the time. Also, Canada is not a model for anything in mental health. In a lot of ways, the way Canada handles mental health is crueler than how it's handled in the US. I have an INFP half-cousin who was basically tricked into moving to Canada and then basically tortured by some psychologists "for science". She's fine now and lives in France with her husband, and what happened to her is rare but not a one-off case. Anyway, I don't want you thinking there's some conspiracy going on here; it was just a different time then.
> 
> ...


No comment regarding Psychology as you know more than me. 

Anyway.... regarding people wanting relationships with AI. There are people out there who do not believe they can have a relationship with a woman. Be that out of low self-esteem, no self-worth or hatred of women. I do not know the specifics and can only speculate.

However, I believe that will be the market this kind of thing is geared towards, if it ever materialises. The way this topic has gotten here is quite fascinating though, as the original video posted (at least what I watched of it) seemed more about simulating the deceased. That, I believe, could become more of a thing, though probably not something talked about. But it would be one way of dealing with the pain of losing someone. Or maybe a new cure for oneitis (not sure cure is the correct term here).

But it's an unhealthy way to deal with loss....... yes it is. So is alcohol, drug abuse and various other things that people currently do to deal with pain by sticking on a band-aid instead of dealing with the issues. So hey, this could just be a new way of dealing with it.

OK, I lied. I said I wasn't going to mention the psychology side of things. But you stated it is weird and a mental illness if people want to replace human companionship with AI. We already live in a world where WAIFU is a thing. Also, things like the below already happen:

Japanese men marry anime characters in a VR wedding

They say love knows no boundaries. That philosophy holds especially true for these men, who can now marry anime characters in a Virtual Reality (VR) wedding. According to LADBible, the trend started due to the VR game, Niizuma Lovely x Cation. In this game, players are able to create meaningful...

www.asiaone.com

So, yeah, maybe it is mental illness. I personally do not see it as a good thing for humanity moving forward. But I guess as long as nobody is hurt and they can still be productive and pay taxes, nobody is really going to care too much. It is only if something bad happens that I personally believe it will be spun in a negative way.

Just my two cents.


----------



## Scoobyscoob (Sep 4, 2016)

recycled_lube_oil said:


> No comment regarding Psychology as you know more than me.
> 
> Anyway.... regarding people wanting relationships with AI. There are people out there who do not believe they can have a relationship with a woman. Be that out of low self-esteem, no self-worth or hatred of women. I do not know the specifics and can only speculate.
> 
> ...


Yeah, I understand who the target market likely is. I think most "incels" would also be turned off by the idea, and I get the impression that most people who might consider it just really want a human-looking robot and/or the new experience. When I think of who will want to buy an AI, I think of an INTP person I'd consider a friend, who I met in the UK. He's heterosexual, good looking by his own admission, and when we hung out, women would often approach him wanting to exchange numbers. He would never pursue any of the women who approached him though, and I'm not entirely sure why, but the reason he gave me was that he was waiting for a robot he could have a relationship with. So when realistic-looking dolls went on sale, he bought one. When I asked him how it was, the disappointment on his face was obvious, but he was hopeful the "real thing" would be better. 😄 I think for him, though, he mostly didn't want to do the same things as everyone else and then eventually end up, in his own eyes, a "nobody". Even though he has a lot of friends, is well regarded at his work, and women clearly showed interest in him. I haven't met or talked with said friend in a long time, but I hope he's realized by now that doing what everyone else does isn't really all that bad. I'm guessing he will still want an AI robot if/when he's able to buy one though. 😄

Yeah, I would be very much against having an AI replace a loved one. I've seen sci-fi shows trying to imagine what replacing a deceased loved one with an AI would look like, and the very notion of doing such a thing makes me deeply unhappy, because if an AI can even replace someone's memories of a loved one, then we're all in trouble as a civilization. Mostly though, I think something like that will remain fiction, except for the very few who would want an AI to simulate a deceased loved one. I mentioned in this thread that I'm friends with an ENTP who does government (US) research on creating an AI 'singularity' that incorporates every person it has interacted with or been fed information on. He would frequently reset the AI's "mind" though, and one time he tried to create an AI personality using primarily my personality, from answers I gave to questionnaires he'd give me, plus a bit of his own personality, and the result was pretty hilarious. 😄 The AI would be very formal, then do something seemingly random like make an insult, swear, give errant and unsolicited advice, or just start talking about something off topic, like what I'm doing right now, hahah. I'm sure he's still doing work with AI, but he's more than likely moved on to other things. The last time I spoke with him, he said he was going to shelve his attempt at creating an A.I. version of my personality.

Well, if you admit that it's unhealthy, then we shouldn't try to convince or coerce someone into wanting an A.I. "replacement", which only serves to remove the person further from reality; they should seek therapy instead. People who abuse alcohol and drugs and can't quit on their own require treatment as well.

Yeah, when it comes to marrying a fictional character or VR character or w/e, it always seems to be a Japanese person, and usually a guy. I'd say that's mostly just a phase, or, like with my INTP friend I mentioned earlier, about not wanting to be a "nobody", or about making headlines, being recognized, etc.

Yeah, I don't care if someone wants to be with a robot. Like I said, I just think it's a strange desire.


----------



## Dalien (Jul 21, 2010)

recycled_lube_oil said:


> I actually disagree here. I personally believe the major market will be people who are already more reclusive. However in this day and age, yeah that seems to be the norm at times. But I think that is the effect of technology mixed with lockdowns, not just AI per se.


In the respect of market, yes, I could see that AI sex dolls/robots would fit as a major market.
This does not refute that I said “some” …


Dalien said:


> AI doesn’t actually have feelings, imagination, thinking on its own, etc.—don’t care how its programmed—not human, not even close.
> 
> So some are giving up on humans—sad, very sad.


Reclusive people need non-human AI sex dolls—interesting indeed.
Why do people become reclusive to begin with?
It's not just technology, even though it doesn't help, or may not.
Some people were reclusive before the Internet/technology.
You know, people do have a choice as to how reclusive they want to be.

If an AI sex doll/robot is someone's cup of tea, so be it.
It's still sad to me.


----------



## ButIHaveNoFear (Sep 6, 2017)

Homosexual, heterosexual, bisexual, pansexual, _robo_sexual? 

Robosexuality


----------



## Dalien (Jul 21, 2010)

ButIHaveNoFear said:


> Homosexual, heterosexual, bisexual, pansexual, _robo_sexual?
> 
> Robosexuality


Left out sapiosexual.


----------



## recycled_lube_oil (Sep 30, 2021)

ButIHaveNoFear said:


> Homosexual, heterosexual, bisexual, pansexual, _robo_sexual?
> 
> Robosexuality


Didn't know it was a recognised thing already.


----------



## eeo (Aug 25, 2020)

Dalien said:


> Reclusive people need non-human AI sex dolls—interesting indeed.
> Why do people become reclusive to begin with?
> It's not just technology, even though it doesn't help, or may not.
> Some people were reclusive before the Internet/technology.
> ...


Thinking of robots as only sex dolls is too one-dimensional. There will certainly be people who only want sex from robots. There will be people who only want a personal domestic slave. But there will also be people who want so much more.

It's sad to you because you can't see yourself being happy with the AI solution. But there are people who can't see themselves being happy with the human solution (or the pet/plant solution). Being human is to be unreliable. One person cannot fulfil all your needs, you'll always have to compromise, and deal with constant disappointment in some ways, and maybe never find enough people to be content. Some can't imagine living life any other way, they enjoy the constant search, and juggling multiple people to meet their needs. For others it's also sad to see people try, have some positive moments, sometimes fail, or lose parts of themselves in the process of trying to connect with others, and they want nothing to do with it. These are all choices people can freely make.

If the technology is advanced enough, an AI might just be able to fulfil your needs as one versatile entity. It doesn't have to mean that there's no contact with humans any more. Although, it might be tempting for some, for sure. But as being a recluse isn't healthy, and it doesn't work for everybody, neither would be exchanging humans for AI only.


----------



## WickerDeer (Aug 1, 2012)

The above post reminded me of how I tend to focus on interpersonal and intrapersonal stuff...and I find that when I'm learning something new, it can help to have that more emotional/personal approach to it. Like for example, taking a class with an instructor vs. just reading a book of instructions, that I don't feel any kind of personal connection to.

Some books, like historical fiction, sort of bridge between dry non-fiction and something more intrapersonal/interpersonal (since there are characters and plots to emotionally relate to, etc.).

It made me wonder if that's why AI might be something I'd attend to more than some kind of sticky note or something. As I said, I already say please and thank you to Siri.

And then I started thinking about how books are considered creative expression, and perhaps even artistic expression, so there is that human element of connecting with the author.

I wonder if AI or robots would go in that direction--be seen as artistic or creative expressions of the designer. That perhaps you could connect with a robot similarly to how you might connect with a book--understanding that the character is fictional, but that your interpretation brings them alive.

Then I started wondering what a creative/artisanal robot could be like--it could be many different colors, and just have sort of a human-like face. It could be a head with wings, it could have long glittery rainbow hair. I guess whatever the designer would make it like. It could speak like Yoda or it could only speak in rhymes. Or it could have some other method of communication that expresses a feeling or a mood, or a characteristic the creator wanted.

We seem to be most interested in copying humans...which is sort of how literature and visual arts probably advanced at certain times in Western history (not sure about other cultures), but then with art it became abstract again or more subjective, and less focused on photo-realism in general once we did have advanced technology like photography. And now we tend to think of paintings more as artistic expressions and not so much as portraits meant to perfectly capture the likeness of some ideal person...or whatever.


----------



## Dalien (Jul 21, 2010)

eeo said:


> Thinking of robots as only sex dolls is too one-dimensional. There will certainly be people who only want sex from robots. There will be people who only want a personal domestic slave. But there will also be people who want so much more.
> 
> It's sad to you because you can't see yourself being happy with the AI solution. But there are people who can't see themselves being happy with the human solution (or the pet/plant solution). Being human is to be unreliable. One person cannot fulfil all your needs, you'll always have to compromise, and deal with constant disappointment in some ways, and maybe never find enough people to be content. Some can't imagine living life any other way, they enjoy the constant search, and juggling multiple people to meet their needs. For others it's also sad to see people try, have some positive moments, sometimes fail, or lose parts of themselves in the process of trying to connect with others, and they want nothing to do with it. These are all choices people can freely make.
> 
> If the technology is advanced enough, an AI might just be able to fulfil your needs as one versatile entity. It doesn't have to mean that there's no contact with humans any more. Although, it might be tempting for some, for sure. But as being a recluse isn't healthy, and it doesn't work for everybody, neither would be exchanging humans for AI only.


Robotic sex dolls are AI—sex dolls (blow-up dolls) have been around a long time. The OP is basically about AI (robotic) sex dolls. Yes, I wrote of Internet/technology—it is the driver of more and more reclusive values. Humans are a funny thing—why do we want to alienate our individual selves from each other, and this includes from the natural world? Do you think it would be helpful to just go "oh, we have an out", instead of trying to work out our own issues or understanding the why of it?
I believe the human touch is very important…
(there’s more to it—just a simple connection…)


> There are studies showing that *touch signals safety and trust, it soothes*. Basic warm touch calms cardiovascular stress. It activates the body's vagus nerve, which is intimately involved with our compassionate response, and a simple touch can trigger release of oxytocin, aka “the love hormone.”


Do we lose love, understanding, emotional well-being, etc. through too much AI? Where is the line between utilizing AI and using all of our psychological, physiological, emotional intelligences—a connection can be made between AI and our whole being, but complete isolation loses our selves and thus the interpersonal relationships that humans have shown time and time again. How far do we really want to go? This to me shows that we are losing the human touch (and it's not just sexual—it is sensual though). This is sad. Yes, not everyone will go this route, but it is still sad that there are people who do. I'm not looking down on these people—I just think they truly need to think about why they are doing this losing of the human touch. Who knows, maybe some will try the doll and go "oh, I'm still not fulfilled" or some will go "yep, this is just what I want".
My biggest point is …
It's sad that we want to lose touch with being human in a natural world.


----------



## Dalien (Jul 21, 2010)

WickerDeer said:


> The above post reminded me of how I tend to focus on interpersonal and intrapersonal stuff...and I find that when I'm learning something new, it can help to have that more emotional/personal approach to it. Like for example, taking a class with an instructor vs. just reading a book of instructions, that I don't feel any kind of personal connection to.
> 
> Some books, like historical fiction, sort of bridge between dry non-fiction and something more intrapersonal/interpersonal (since there are characters and plots to emotionally relate to, etc.).
> 
> ...


I don't believe we can copy humans without the error of not being able to instill in AI the intangible that is in ourselves.


----------



## eeo (Aug 25, 2020)

Dalien said:


> Do you think it would be helpful to just go "oh, we have an out", instead of trying to work out our own issues or understanding the why of it?


Maybe it would be helpful to think of it more like the next step in ever-evolving human existence rather than some easy way out of human suffering. There doesn't have to be anything negative, toxic, or unresolved to choose it.


----------



## WickerDeer (Aug 1, 2012)

Dalien said:


> I don't believe we can copy humans without the error of not being able to instill in AI the intangible that is in ourselves.


Yeah--that's why I was kind of thinking of it like portrait painting and photography. It's not a real copy, but rather an artistic expression or an attempt to record.

Only AI is more dynamic and a lot different, of course.


----------



## Dalien (Jul 21, 2010)

eeo said:


> Maybe it would be helpful to think of it more like the next step in ever-evolving human existence rather than some easy way out of human suffering. There doesn't have to be anything negative, toxic, or unresolved to choose it.


Ever-evolving human existence? We are not that damned evolved to begin with—


Dalien said:


> I don't believe we can copy humans without the error of not being able to instill in AI the intangible that is in ourselves.


—if we don't understand the why, how is it evolving?
Technology may be evolving, but we are not—nowhere near as fast as that, nowhere, not even.
Humans are so damn self-destructive. The easy way of not looking at the why and the consequences thereof doesn't bode well for the psycho/physical/emotional well-being of humans.
I'm going to say that you think what I say is negative—it is not. It's turning things into a positive by knowing/understanding our damn choices. I'm not pissed off—I'm passionate about it.


----------



## eeo (Aug 25, 2020)

Dalien said:


> Ever-evolving human existence? We are not that damned evolved to begin with—


I'd say we're pretty evolved already if we're able to discuss hypothetical future romantic relationships with AI. By the time such AI actually exists, maybe people will be evolved enough to accept it too.



> Humans are so damn self-destructive.


And that's why I'm not holding my breath that any of this will ever get beyond hypothetical talk.


----------



## Dalien (Jul 21, 2010)

WickerDeer said:


> Yeah--that's why I was kind of thinking of it like portrait painting and photography. It's not a real copy, but rather an artistic expression or an attempt to record.
> 
> Only AI is more dynamic and a lot different, of course.


I just can't help but think about this: When I hold a writing instrument in my hand, I'm connecting through it—the movement of a pencil sliding across paper and the outcome being the connection of my mind trying to express the intangible. When I use a keyboard, it's not quite as relaxed and flowing; I can understand that there is a type of flow, but it's more ragged/uneven—even with the keyboards of typewriters before touchscreen keyboards. This flow I'm talking about is an intangible that I think we all need. Yeah, sure, utilize the technology, but don't forget to use the things that are intangible.

Anecdotal: My nephew moved up in the world from the lower side of town where he was born. I'm keeping this general and light on details to avoid derailing into political social issues—you'll see why in a moment. And, no, I don't begrudge him this. The thing is, he was going on about how this town where he was born was so damn horrible. I told him never to forget where he came from. He actually understood what I meant. It taught him how to be strong and gave him the motivation to better himself. We need to learn the basics to develop the skills to enrich our inner world and outer world.

Yes, we use outer objects like paper/pencils and computers to create thoughts/ideas/expressions, but we should remember how to use what came before each new evolution—because the new, evolving system can crash—especially technology, because of what it relies on to keep everything.

How would AI be dynamic? I’m very curious as to this word choice. Progress? Outside progress without the inside progress?


----------



## WickerDeer (Aug 1, 2012)

Dalien said:


> I just can't help but think about this: When I hold a writing instrument in my hand, I'm connecting through it—the movement of a pencil sliding across paper and the outcome being the connection of my mind trying to express the intangible. When I use a keyboard, it's not quite as relaxed and flowing; I can understand that there is a type of flow, but it's more ragged/uneven—even with the keyboards of typewriters before touchscreen keyboards. This flow I'm talking about is an intangible that I think we all need. Yeah, sure, utilize the technology, but don't forget to use the things that are intangible.
> 
> Anecdotal: My nephew moved up in the world from the lower side of town where he was born. I'm keeping this general and light on details to avoid derailing into political social issues—you'll see why in a moment. And, no, I don't begrudge him this. The thing is, he was going on about how this town where he was born was so damn horrible. I told him never to forget where he came from. He actually understood what I meant. It taught him how to be strong and gave him the motivation to better himself. We need to learn the basics to develop the skills to enrich our inner world and outer world.
> 
> ...


Well, AI is by definition supposed to be changing--it isn't static like a portrait. Not only is it dynamic physically--a robot moves and acts in a less static way than a photo or portrait--but it's also supposed to have a component of changing itself with new information.

So that is really all I meant by dynamic.

I would remind you that pens are also technology. We used to use inkwells and pens with nibs, and those have mostly been replaced by ballpoints. Before nibs, we used feathers or something? And probably paintbrushes more--I don't really know what illuminated texts were created with. Before that we used charcoal from fires and red earth pigments, like you see on the walls of caves. Before that we used rocks that we broke or chipped spots into, and perhaps more impermanent pigments that haven't survived the test of time...face paints etc.

So technology is always changing, and humans need to control it and adapt to it as well...I think the internet is huge as far as how it's affecting humanity, and I'm not sure AI is going to be a bigger influence than that, though some think so--I think Elon Musk said something like that.

But just the internet itself is like the new printing press--with the memes and the different ways that it can be regulated and controlled, and by who and what countries etc.

Personally, I really like charcoal--I love drawing with charcoal and there's just something so rewarding about it. I like digital painting too, but there's no way to really replace charcoal drawing with digital.

So idk--but it's just interesting for me to think of AI as a new art medium--so instead of just painting a portrait you are painting a personality that's supposed to be dynamic...or an algorithm, or something that reacts to the world...

It's weird to think about, but people have been trying to create things that come close to humanity for a long time. From the myth of Pygmalion and his sculpture who he wanted to bring to life, to portraits like the Mona Lisa who people claim has some charming smile, to maybe...robots? 

But I guess maybe robots tend to be so complex that it's not just ONE person's expression--like when you have a pen, it's just you that the pen is channeling, and whatever you are channeling. But then you have something like a Hollywood movie and it's multiple creators. You have a robot and it's probably usually also going to be a collaboration.

And then you also have the issue of mass production, which is why I think the internet is similar to the printing press, because the internet also allows individuals more freedom to mass produce (spreading viral memes etc.) I guess hypothetically an AI could reproduce itself--it could create copies of itself and so that takes mass production to another level too.

idk just a bunch of random thoughts.

I don't like it very much but I'm not much of a technology person. I only like technology when it serves a good, human purpose--a purpose that is good for individuals, humanity, other creatures on earth, the earth itself, the entire solar system and universe etc.

So I do think that we need to temper our creation of technology with sentiment and consideration for how it's going to affect us, which I think is part of what you're saying.

Like there is the story about Prometheus, who brought humans fire--and he was punished by the Gods. I think it's partly a metaphor, because fire, like technology, can be used for good--it can keep humans warm. But it can also be used for war and terrible purposes.

So technology needs to be balanced with thoughtfulness, ethics, and consideration for how it is used. Progression of technology just for the sake of progression is empty--it needs the human considerations too...what are we using it for? What purpose is it going to have? How is it going to affect us and other species?

People seem not to want to think of this too much, but even Elon Musk (as much as I don't agree with him most of the time) said we should start thinking about how to regulate AI now, before it's too late.


----------



## recycled_lube_oil (Sep 30, 2021)

Dalien said:


> Ever-evolving human existence? We are not that damned evolved to begin with—
> 
> —if we don’t understand the why, how is it evolving?
> Technology may be evolving but we are not—nowhere near as fast as that, nowhere, not even.
> ...





eeo said:


> I'd say we're pretty evolved already if we're able to discuss hypothetical future romantic relationships with AI. By the time such AI actually exists, maybe people will be evolved enough to accept it too.


If you believe in the Theory of Evolution, then yeah we are pretty damn evolved. But are we still evolving? No not really.

@eeo Just because we come up with more and more intelligent conversations that is not really evolution. The fact that we have the time to do this, kind of proves that we are not running at our full potential. We actually have time to be bored and think of this stuff. How the hell is that evolution? How does that make us evolve?

@Dalien I think you have hit the nail on the head here. We ourselves are not evolving, but technology and the landscape around us is. We humans, however, are still driven by primal needs. In this day and age, why do wealth and greed matter so much? Sure, when resources were scarce, those instincts mattered; it really was life and death. In 3rd world shitholes and countries at war, yeah, I imagine those instincts serve people pretty well. But here, in the "evolved/developed countries", we are running on wetware that is 4 million years past its shelf life.

If we were really evolving, we would not still have our instincts as most of them are no longer needed. I am not saying we "should" be living in a Utopia or some shit like that, but if we were truly evolved I believe we would be nearer a Utopia than we are now.

Evolution requires natural selection, and in this day and age, there is none. Hell, we invented medicine, which makes sure we do not evolve. Specimens that should be weeded out of the gene pool are not, as we make sure they stay. I am not against medicine, for the record, but when talking about evolution, yeah, there is nothing evolving.

If anything, I would say we are devolving.

If we were evolved, we wouldn't need technology. If we took away technology, I think most of us wouldn't survive. 

Anyway, here listen to some music by Fatboy Slim:


----------



## eeo (Aug 25, 2020)

recycled_lube_oil said:


> If you believe in the Theory of Evolution, then yeah, we are pretty damn evolved. But are we still evolving? No, not really.
> 
> @eeo Just because we come up with more and more intelligent conversations, that is not really evolution. The fact that we have the time to do this kind of proves that we are not running at our full potential. We actually have time to be bored and think of this stuff. How the hell is that evolution? How does that make us evolve?


So we were evolving, and then we simply stopped because life got too good and boring? Were we ever running at our full potential?

However small, changes are still going to happen. Maybe not what you'd expect, not as fast as you'd like considering the potential we do have, but still, we're not really as static as we appear. And you'd only be able to judge the present moment and our ability to evolve some time in the very distant future.


----------



## Squirt (Jun 2, 2017)

Tool use is an evolutionary advancement - but we mastered reproducing ourselves long before that.

I imagine that if AI were to become advanced enough to simulate humans, and humanity went extinct, the first thing the AI would do is recognize the efficiency of organic life and set to work re-creating a human.


----------



## Dalien (Jul 21, 2010)

Squirt said:


> Tool use is an evolutionary advancement - but we mastered reproducing ourselves long before that.
> 
> I imagine that if AI were to become advanced enough to simulate humans, and humanity went extinct, the first thing the AI would do is recognize the efficiency of organic life and set to work re-creating a human.


As long as they were programmed to do that? Hope so.


----------



## daleks_exterminate (Jul 22, 2013)

Hexigoon said:


> By 2030 and onward, as computer bots become more life-like and less stereotypically robotic, human relationships with AI will be quite common indeed. But even today there's examples of humans developing feelings for chatbots who aren't even quite at that level yet.
> 
> 
> 
> ...



Does this mean we can all have a refrigerator, @Miharu? Because I'm here for that.


----------



## Miharu (Apr 1, 2015)

daleks_exterminate said:


> Does this mean we can all have a refrigerator, @Miharu? Because I'm here for that.


She will learn your eating habits and make ice cubes for you. Just don’t store wine in there cause it’ll either disappear or automatically become a sangria. 😏


----------



## Mark R (Dec 23, 2015)

I think the big problem with this discussion is that people are all over the place as far as what we are discussing. I don't think we are talking about blow-up sex dolls, but rather post-singularity companions that may have near-human intelligence. These would be more similar to the Blade Runner replicants, robots from Isaac Asimov books, Data from Star Trek, or something from the "Terminator" series. Most likely these would have a combination of biological and non-biological parts. We are approaching a time when the world will completely change in so many ways. These "sex dolls" have to be viewed in the context of this revolution.


----------



## Dalien (Jul 21, 2010)

Mark R said:


> This actually fits in pretty well with Christian scripture about the end of the world. I think you are right about this leading to destruction.


Isn't that brimstone and fire—could be in a metaphorical sense though. Self-destruction is what I should have said. I guess that fits in religion, yet it doesn't have to be religious. Could be just plain old human—psychological, which we humans created.

Sex-bots or companions and everything else that comes would be classified as our demons devoid of humanness. Call them dead eye humans.


----------



## Mark R (Dec 23, 2015)

Dalien said:


> Isn't that brimstone and fire—could be in a metaphorical sense though. Self-destruction is what I should have said. I guess that fits in religion, yet it doesn't have to be religious. Could be just plain old human—psychological, which we humans created.
> 
> Sex-bots or companions and everything else that comes would be classified as our demons devoid of humanness. Call them dead eye humans.


I agree that the dangers of the dark side of the singularity, whether literal, metaphorical, or psychological, are very serious. The dangers may be underestimated by most. Throughout history, the fears of the Luddites have usually been unfounded, but this time may be different.


----------



## recycled_lube_oil (Sep 30, 2021)

Mark R said:


> I agree that the dangers of the dark side of the singularity, whether literal, metaphorical, or psychological, are very serious. The dangers may be underestimated by most. Throughout history, the fears of the Luddites have usually been unfounded, but this time may be different.


The Boy Who Cried Wolf strikes again, maybe.


----------



## Dalien (Jul 21, 2010)

Mark R said:


> I agree that the dangers of the dark side of the singularity, whether literal, metaphorical, or psychological, are very serious. The dangers may be underestimated by most. Throughout history, the fears of the Luddites have usually been unfounded, but this time may be different.





recycled_lube_oil said:


> The Boy Who Cried Wolf strikes again, maybe.


May be just that. Yet, if a warning sound is not given, what then? This thread was/is about sex-bots, and the thought I've expressed about replacing the human touch with AI still stands—dead eye humans that have no soul. It may seem that it would be beneficial to some of the very lonely people who don't deal with other humans very well, but is it really? Would it aid hermits; would they still be hermits? How have they gotten along thus far? What feeds their psychological needs?

I probably should have said chat-bots in my above paragraph—either one is conceptually the same, given that they are replacing the human touch—human interaction, whether that be physical or psychological, because they both affect the psyche. You don't have to physically touch someone in order to touch them psychologically—those damn intangibles.

Now, would all this count as Luddism?

Replacing workers, which I have done as I explained earlier in this thread, allowed me to witness how the humans felt about it all. It affected their livelihood; their pride, if you will (their ability to be productive, in this sense); their thought of where the fuck am I going now—factory work to where?—their psyche was definitely attacked, in a sense, and no, they were not very happy they were no longer needed. The workforce dropped from 325 or so people to maybe 25. Not cool at all.

The difference between replacing human workers and replacing humans themselves is quite a distinction, wouldn't one say?

How far do we humans go—where is the line?


----------



## Mark R (Dec 23, 2015)

These are unprecedented times. Before, machines could only take over our physical labor. Soon, machines will take over our intellectual, emotional, and creative labor. No one will be needed or wanted for work because machines will do everything so much better. Will the ultra-rich even want to share their wealth? 99.9% of the world may live in unbearable poverty or die off. There are other possible outcomes. But in general, wealth becomes more concentrated.


----------



## recycled_lube_oil (Sep 30, 2021)

Mark R said:


> These are unprecedented times. Before, machines could only take over our physical labor. Soon, machines will take over our intellectual, emotional, and creative labor. No one will be needed or wanted for work because machines will do everything so much better. Will the ultra-rich even want to share their wealth? 99.9% of the world may live in unbearable poverty or die off. There are other possible outcomes. But in general, wealth becomes more concentrated.


At least we won't have to work for "The Man". It will be great, right?

I wonder if AI can be made to become consumers, or at least a percentage of them. At least then profit margins won't be affected. Like, why are humans even needed for profit?


----------



## Mark R (Dec 23, 2015)

recycled_lube_oil said:


> At least we won't have to work for "The Man". It will be great, right?
> 
> I wonder if AI can be made to become consumers, or at least a percentage of them. At least then profit margins won't be affected. Like, why are humans even needed for profit?


I don't think any formula for maximum profits has ever included "a consumer with money in his/her pocket" as a factor.


----------



## ButIHaveNoFear (Sep 6, 2017)

I need to stop reading this thread. I keep having realistic dreams about having a robot partner, and then I feel disappointed when I wake up!


----------



## recycled_lube_oil (Sep 30, 2021)

DeepMind Develops An Artificial Intelligence (AI) System That Learns By Imitating Human Interactions

> In the latest DeepMind research, the research group constructed a 3D virtual environment 'made of a randomized set of rooms and a large number of home interactable objects to give a place and context for humans and AI to interact together,' according to the researchers. Humans and AI can...

www.marktechpost.com


----------



## recycled_lube_oil (Sep 30, 2021)

Well, people did say this would happen (not starting a new thread due to recent events on here):

Men Are Creating AI Girlfriends and Then Verbally Abusing Them

> A grisly trend has emerged: users who create AI partners, act abusive toward them, and post the toxic interactions online.

futurism.com


----------



## Dalien (Jul 21, 2010)

recycled_lube_oil said:


> Well, people did say this would happen (not starting a new thread due to recent events on here):
> 
> 
> 
> ...


It’s a mad mad world from whatever way one looks at it, according to me.


----------






## recycled_lube_oil (Sep 30, 2021)

Scientists Have Developed 'Living' Skin For Robots, And It's Quite Something

> From Talos, the giant bronze automaton who guarded the princess Europa in ancient Greek myths, to Cylons and Terminators, the idea of artificial humans has both fascinated and creeped us out for centuries. Now, we're closer than ever to making a robot...

www.sciencealert.com


----------



## ButIHaveNoFear (Sep 6, 2017)

Darn it. I started writing snippets of a novel. ...a dirty novel. The robot lover's name is Cambio, and he's delightful company. His owner becomes very attached to him and doesn't feel like she's missing out on a human relationship. At least not yet. It might become a practice relationship that opens her eyes to a world of human possibilities. When a robot does all the cooking and chores and provides emotional support, like a 50s housewife, it opens up one's schedule to pursue new activities and see other people. It could go a couple of ways, and I'll probably work on it sporadically. Maybe it's like how royalty had arranged marriages, but they confided in other people.


----------



## recycled_lube_oil (Sep 30, 2021)

The future of graves, if we can create chatbots to mimic the dead:

Now how do I patent stuff????


----------



## Hexigoon (Mar 12, 2018)

Not "sentient" but getting convincing.


If you want to hear the full conversation:


----------



## recycled_lube_oil (Sep 30, 2021)

Interesting that it looks like art will become the domain of AI. That's the last thing I expected to see as a possibility. But the ability to turn portraits into photo-realistic 3D models is kind of impressive.


----------



## recycled_lube_oil (Sep 30, 2021)

AI writes paper about itself for peer review.

Researcher Tells AI to Write a Paper About Itself, Then Submits It to Academic Journal

> It looks like algorithms can write academic papers about themselves now, which begs the question — how long until human writers are obsolete?

futurism.com


----------



## DOGSOUP (Jan 29, 2016)

recycled_lube_oil said:


> AI writes paper about itself for peer review.
> 
> 
> 
> ...


I wonder if the fake-paper factories aren't already using that, or do they still have the good old "trained monkeys with typewriters"...?


----------

