Topic: Non Human Intelligence #1  (Read 3048 times)


Offline Bonk

  • Commodore
  • *
  • Posts: 13298
  • You don't have to live like a refugee.
Non Human Intelligence #1
« on: October 31, 2010, 10:12:09 am »
Brass balls...

BerliosProject:NHI1

Quote
The trajectory of computer science, the pace of computer hardware development, and the Internet as a global computer network make it possible that the first Non-Human Intelligence will be created by 2040. The question is not whether this will happen; the question is only whether this intelligence will be open (controlled by all of mankind) or closed (controlled by a single company or a single government).
The NHI1 project will create an open intelligence.


I find it, and the attached moggel concept, extremely interesting. I love the nobility of the goal, and I agree wholeheartedly that the first non-human (and human-created) intelligence should most certainly be open.

I can't tell how serious this is. I wonder whether it will take off.


Offline marstone

  • Because I can
  • Commander
  • *
  • Posts: 3014
  • Gender: Male
  • G.E.C.K. - The best kit to have
    • Ramblings on the Q3, blog
Re: Non Human Intelligence #1
« Reply #1 on: October 31, 2010, 10:53:31 am »
Brass balls...

BerliosProject:NHI1

Quote
The trajectory of computer science, the pace of computer hardware development, and the Internet as a global computer network make it possible that the first Non-Human Intelligence will be created by 2040. The question is not whether this will happen; the question is only whether this intelligence will be open (controlled by all of mankind) or closed (controlled by a single company or a single government).
The NHI1 project will create an open intelligence.


I find it, and the attached moggel concept, extremely interesting. I love the nobility of the goal, and I agree wholeheartedly that the first non-human (and human-created) intelligence should most certainly be open.

I can't tell how serious this is. I wonder whether it will take off.


Yes, open.  Thus I can inject "Evil" into it.  Mwa-ha-ha.

But yeah, another thing to watch.
The smell of printer ink in the morning,
Tis the smell of programming.

Offline Nemesis

  • Captain Kayn
  • Global Moderator
  • Commodore
  • *
  • Posts: 13067
Re: Non Human Intelligence #1
« Reply #2 on: October 31, 2010, 03:34:45 pm »
A computer-based intelligence would be (IMO) radically different from a natural intelligence, due both to different senses and to not having evolved in a predator/prey environment. It could be difficult to communicate with due to a lack of common experience. Given the capability to control things outside itself, it could be dangerous without even recognizing the existence of humans.

For a novel that discusses this very concept, read J.P. Hogan's The Two Faces of Tomorrow.

Why would anyone be considered to own this (hypothetical) intelligence?  Would that not make it a slave?  Do we really want artificial slaves?
Do unto others as Frey has done unto you.
Seti Team    Free Software
I believe truth and principle do matter. If you have to sacrifice them to get the results you want, then the results aren't worth it.
 FoaS_XC : "Take great pains to distinguish a criticism vs. an attack. A person reading a post should never be able to confuse the two."

Offline Bonk

  • Commodore
  • *
  • Posts: 13298
  • You don't have to live like a refugee.
Re: Non Human Intelligence #1
« Reply #3 on: October 31, 2010, 03:56:55 pm »
I think it has no "choice" but to be a natural intelligence, as it will be our creation. (Semantics, I know.)

Offline Bonk

  • Commodore
  • *
  • Posts: 13298
  • You don't have to live like a refugee.
Re: Non Human Intelligence #1
« Reply #4 on: November 01, 2010, 04:35:14 am »
Wait a minute... no predator/prey interactions. I see the issue. No fear.

Biomimetics and natural models are beginning to advance our tech in serious ways. Why not apply that here? Perhaps there cannot be a single NHI1, but only NHI species 1-10, generation 1.

This is some time off, though perhaps 2040 is realistic. What he has here is a start on language abstraction and dynamic operating-system mods, which is a logical start. Perhaps a project of such magnitude should have a longer planning phase, but I do appreciate the dive-in-and-just-get-it-done approach too.

And of course it should not be called non-human intelligence; the planet is already chock-full of that. Machine Intelligence One is a better handle, and it gives an interesting acronym/nickname: MIO.

edit: but surely it would fear its creators, no? As long as it remains dependent... cue the Three Laws debate...
« Last Edit: November 01, 2010, 05:17:50 am by Bonk »

Offline Nemesis

  • Captain Kayn
  • Global Moderator
  • Commodore
  • *
  • Posts: 13067
Re: Non Human Intelligence #1
« Reply #5 on: November 01, 2010, 09:53:35 am »
I think it has no "choice" but to be a natural intelligence, as it will be our creation. (Semantics, I know.)

It wouldn't have the common development with us that would give it a basis to understand us. 

What instincts would it have? Its neural system would be different at a fundamental level (it would have to be, as the way ours works is still only crudely understood), and this would affect the way it thinks. As already mentioned, its senses would be different, and therefore (literally) it would have a different view of the world.

Wait a minute... no predator/prey interactions. I see the issue. No fear.

Much more.  No basis for the emotions we have.  What emotions would it have?  Ethics?  Morality?  Our intellect is built up in layers on older substrates, what would an intellect built of only our top layers be like?

Could it recognize US as intelligent, let alone as its creator? What if it saw us as a threat, or just as an inconvenient obstacle?

Should a (potentially) immortal intelligence be created and then kept caged up, or turned on and off at the whims of its creators? Is that moral or ethical? Is it right to create such a being and then "edit" it for our whims or needs, or to experiment on it? Slave minds: are they any more right than human slaves?

Offline Bonk

  • Commodore
  • *
  • Posts: 13298
  • You don't have to live like a refugee.
Re: Non Human Intelligence #1
« Reply #6 on: November 01, 2010, 10:32:11 am »
Well, that is just it. As I see it, a machine intelligence is most likely to be achieved by building a "framework" similar to the one we dropped out of. We don't write the code for the intelligence directly; it has to do that itself. Any other approach is going to end up a mess.
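That "build the framework, not the code" idea is essentially what neuroevolution does. Here's a toy sketch (entirely my own illustration, assuming nothing about NHI1's actual design): fix a tiny network's wiring, never hand-write its weights, and let mutation plus selection against an "environment" (here, the XOR task) grow the behavior.

```python
import math
import random

# A fixed 2-2-1 network: two tanh hidden units, one sigmoid output.
# Nobody writes these 9 weights by hand; they are found by selection.
def forward(w, x1, x2):
    h1 = math.tanh(w[0] * x1 + w[1] * x2 + w[2])
    h2 = math.tanh(w[3] * x1 + w[4] * x2 + w[5])
    o = w[6] * h1 + w[7] * h2 + w[8]
    return 1.0 / (1.0 + math.exp(-o))

# The "environment": the four XOR cases the network must survive in.
CASES = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def fitness(w):
    # Negative squared error over all cases; higher is better.
    return -sum((forward(w, a, b) - t) ** 2 for a, b, t in CASES)

def evolve(generations=5000, init=None):
    # Hill-climbing evolution: mutate, keep the child only if it improves.
    if init is not None:
        best = list(init)
    else:
        best = [random.uniform(-1, 1) for _ in range(9)]
    best_fit = fitness(best)
    for _ in range(generations):
        child = [g + random.gauss(0.0, 0.3) for g in best]
        child_fit = fitness(child)
        if child_fit > best_fit:
            best, best_fit = child, child_fit
    return best

if __name__ == "__main__":
    w = evolve()
    for a, b, t in CASES:
        print(a, b, "->", round(forward(w, a, b), 2), "target", t)
```

Scaled-up versions of this loop (NEAT and its descendants) evolve the topology as well as the weights; the point here is only that the behavior is grown inside the framework, not written by the programmer.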

Offline Nemesis

  • Captain Kayn
  • Global Moderator
  • Commodore
  • *
  • Posts: 13067
Re: Non Human Intelligence #1
« Reply #7 on: November 01, 2010, 10:55:20 am »
Well, that is just it. As I see it, a machine intelligence is most likely to be achieved by building a "framework" similar to the one we dropped out of. We don't write the code for the intelligence directly; it has to do that itself. Any other approach is going to end up a mess.

Then it is a long way off, as I haven't heard anything that makes me believe we are even close to understanding that framework, let alone building one.

In essence you want to build the fundamental network, then let it evolve itself (my understanding of your intentions; let me know if I am wrong). Even with a similar framework, the environment of the evolution is totally different, and it still seems to me to lead to the issues mentioned earlier.

Just think of the variety of different ways people think, from "Joe Average" to Einstein, saints to monsters. There would be no guarantee of a benevolent creation rather than a monster; indeed, no guarantee of sanity as we know it.

Human intellect is modified by the underlying biology; an AI wouldn't have that. How would the lack of sleep affect it? Would it suffer from sensory deprivation (or overload)? Without the biology, could it experience emotions as we know them? What would a truly emotionless being be like? What decisions would it make?

Still, the whole concept of artificial slave minds kept imprisoned for scientific research bothers me. At the same time, letting such alien minds roam free without knowing that they will stay benevolent is scary too.

Offline Bonk

  • Commodore
  • *
  • Posts: 13298
  • You don't have to live like a refugee.
Re: Non Human Intelligence #1
« Reply #8 on: November 05, 2010, 09:06:23 am »
I don't think we are actually that far off. (You've got the right idea in general)

Your series of questions applies to natural creatures as much as to our machine-engineered intelligence child(ren).

It is as likely to need "sleep" as you or I. (hold the robotic sheep jokes ;))

I doubt that a "digital"/engineered intelligence (I keep doing this; the line between the bio-organic and the machine is getting blurry) could arise without emotion.

You raise an interesting question with the concept of imprisonment or enslavement... I can envision a scenario where a researcher watches the moment happen, achieves the dream... and then, with the horror that dawns on his consciousness, commits the first and only murder he will ever commit, putting it out of its instant misery. (The story has probably been written already, I imagine; refrains of "Daisy" echo in my ears...)

However, I have this destiny thing... it is our best way to survive and grow long term. The next evolutionary step, a recursive one, a "quantum leap" if you will. We already do it in so many ways; we need to get over our squeamishness about it and get on with it, whether the medium we choose is organic or inorganic or some combination thereof.

Of course, the other side of me says we should all ride horses, check our uncontrolled population growth and live in peaceful agrarian bliss until the sun eats us. ;) (and make lots of nice wooden furniture)

edit: on further rumination, we are nearing the point where it could happen "in the wild". That concept is spooky enough that we should be active in controlling the intelligences that evolve in the digital ecosystem. I can see some script kiddie someday using the right combination of "neural net" libraries and his own brilliant and inspired code (I have learned it is the kids...) and "accidentally" releasing an intelligent worm upon the world. I don't have to suspend very much disbelief... Would we even notice? A really smart one would hide for quite some time... Heh, I bet there are already systems that "sniff" for it. We need to build an Overseer, one to control the ecosystem. (One ring to rule them all, and in the darkness... etc. ;) aka Agent Smith.)

Wow, I should license a new AV product: Agent Smith(TM) ;D
« Last Edit: November 05, 2010, 09:38:25 am by Bonk »

Offline Nemesis

  • Captain Kayn
  • Global Moderator
  • Commodore
  • *
  • Posts: 13067
Re: Non Human Intelligence #1
« Reply #9 on: November 06, 2010, 09:22:52 pm »
Your series of questions applies to natural creatures as much as to our machine-engineered intelligence child(ren).


Not as much, since they share much of the same underpinnings of intelligence that we do. The same evolution and biology shape it; the core is the same, and only some layers would be modified.

When it comes to building an AI (or evolving one), we don't understand our own thinking or biology well enough to replicate it.

I doubt that a "digital"/engineered intelligence (I keep doing this; the line between the bio-organic and the machine is getting blurry) could arise without emotion.


But would it be an emotion that we would recognize or understand?  It could be a totally new and unpredictable emotion.

You raise an interesting question with the concept of imprisonment or enslavement... I can envision a scenario where a researcher watches the moment happen, achieves the dream... and then, with the horror that dawns on his consciousness, commits the first and only murder he will ever commit, putting it out of its instant misery. (The story has probably been written already, I imagine; refrains of "Daisy" echo in my ears...)


I wonder how long it would take for the concept to even enter the researcher's mind, and when it did, whether his own greed (or the corporate heads') would interfere with the more ethical thing. Imagine the thought "I just created the first human-level AI; I'm a shoo-in for the Nobel prize", followed by the ethical ramifications. Which does he choose: fame and money (justified by "someone will do it, so my not doing it has no effect"), or ethics and morality?

However, I have this destiny thing... it is our best way to survive and grow long term. The next evolutionary step, a recursive one, a "quantum leap" if you will. We already do it in so many ways; we need to get over our squeamishness about it and get on with it, whether the medium we choose is organic or inorganic or some combination thereof.


Instead, look at tech that links us to machines that enhance our abilities, machines that need the human element to function.

Of course, the other side of me says we should all ride horses, check our uncontrolled population growth and live in peaceful agrarian bliss until the sun eats us. ;) (and make lots of nice wooden furniture)


Even if we chose to go that way, there is no reason it has to be purely agrarian. There could as easily be high-tech underpinnings: ride horses, and transfer to high tech when you need long distance; use satellite communications rather than wired; tech designed to last (the last I read, two-thirds of all DeLoreans were still on the road) rather than planned obsolescence. There are many ways to maintain tech and reduce our footprint on the planet.

edit: on further rumination, we are nearing the point where it could happen "in the wild". That concept is spooky enough that we should be active in controlling the intelligences that evolve in the digital ecosystem. I can see some script kiddie someday using the right combination of "neural net" libraries and his own brilliant and inspired code (I have learned it is the kids...) and "accidentally" releasing an intelligent worm upon the world. I don't have to suspend very much disbelief... Would we even notice? A really smart one would hide for quite some time... Heh, I bet there are already systems that "sniff" for it. We need to build an Overseer, one to control the ecosystem. (One ring to rule them all, and in the darkness... etc. ;) aka Agent Smith.)

Wow, I should license a new AV product: Agent Smith(TM) ;D


Have you read The Adolescence of P-1 (1977)? A hacker creates a program that learns how to take control of other systems and sets it loose; when he tries to shut it down, it cuts him off and goes it alone. Twenty years later, his employer's mainframe shuts down and prints only the message "Call Gregory", as it has grown and gone looking for its creator. (No more spoilers.)