If only life were always so easy. We can hope, however, that this time it will be. For everyone's sake.
( Perhaps in his when, it would be handled with far more discretion. If androids were already in some place of power instead of being "just machines", they would have an easier time obtaining rights to function, to own property, to hold jobs. Being treated as property instead of citizens would always be a struggle, she feels, and an unfair one at that.
Why give something the ability to become someone, an ever-evolving sense of right and wrong and the ability to judge situations at hand, and then deny them the right to be someone greater? )
I only hope it hasn't been too large a bite to swallow; thank you for your patience with me. :)

As for the question of sentience, as I said, the A.I. was originally developed to help process safety situations at hand, so it could cease functions if someone was in danger or something went wrong with the machinery. This same delegation was given to all Omniums, I imagine. The household ones in particular— if they functioned as nannies or housekeepers, they needed to know when the family they were working for was in danger. If they were escorting children home from school and the children leapt into the road to play, and other such scenarios.
I was quite young at the time, however. So this is only what I have heard over a decade later. I don't believe "fairness" was intentional— it was most likely a branch-off from situational judgment. The topic of "judgment" can mean both one's assessment of physical conditions to make a decision, as well as one's assessment in deciding another's actions or fate. If a program were given the ability to do its own research, it would be difficult to separate the two in initial searches.
Does it sound familiar, to you?
( She's wondering why that aspect in particular caught his attention, is all. )
the day I stop losing tags is the day I just perish I guess, I'm sorry!