Artificial Intelligence - The Rise of Machines!

Discussion in 'Off-Topic' started by b.shahvir, Apr 28, 2009.

  1. b.shahvir

    Thread Starter Active Member

    Jan 6, 2009
    444
    0
    Hi all, :)

    Technology has advanced phenomenally in the past few decades. A particular point of concern.....yeah, that's right, 'concern'.....is the rapid advance of Artificial Intelligence (AI) technology. Software so powerful that it could decide the fate of mankind! Maybe the justice system will be completely replaced by virtual judges who decide the fate of criminals....or by virtual doctors who prescribe what medications we need! :rolleyes:

    Imagine if weapons were to have a mind of their own....and what if they turned against us? Yeah right, one more fan of the Terminator franchise, you all say! But nevertheless, this could be a possibility in the near future, and I need not prove my point; it's there for all to see.

    So, my point is: should technologists take moral responsibility for our future and draw a line somewhere? Will there be no limit to the advance of AI technology? Should we, as human beings, allow technology to control our lives in the future? Will there be any future left? A host of unanswered questions! But it's high time scientists and technocrats reflected on the repercussions of science and technology.....the negative effects in particular!

    P.S. Virtual politicians anyone? :D
     
  2. Nanophotonics

    Active Member

    Apr 2, 2009
    365
    3
    It's funny: as I read this post I recalled a similar one I created on Facebook, but my smart friends never had the time to bother with it. :D

    J-Day. Yes, I'm a fan too, and it's a possibility I've been thinking about, especially again after my recent interaction with ALICE.

    Thanks.
     
  3. Developer_Dan()

    Member

    Oct 8, 2007
    17
    0
    Hi everyone,

    Wow, it's been a while since I logged in to AllAboutCircuits.
    Just wanted to comment and add my two cents' worth on this topic.

    As a passionate programmer trying to better understand electronics engineering so I can round out my skills, I think about this topic all the time.

    I always tell people about movies like Terminator, RoboCop and I, Robot.
    They honestly are not as 'stupid' as some might believe at first glance!!
    It will happen!! Not every detail necessarily, but basically it will happen, and it is unstoppable......! (Which I personally have no problem with ;)).

    Isaac Asimov wrote the Three Laws of Robotics (these laws appear in I, Robot).
    They are a way to allow ALL humans to feel safe from robots, whether in the hardware and/or the software of the machines.
    Rules must be programmed in, studied, confirmed, agreed upon, changed, revised and hardwired in, rules that would ALWAYS keep the HUMAN either in control of, and/or safe from, a machine, to say the least.

    Besides small, more primitive projects, it would obviously take an evil company of engineers, scientists, physicists and software developers to mastermind any really powerful AI machines that did not follow these kinds of rules.

    In a sense, these rules work as a protection mechanism, like Rings in a CPU.
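
    To make that "protection ring" idea concrete, here is a minimal, purely hypothetical Python sketch of such a hard-wired rule layer. The class, the rule functions and the checks are all invented for illustration; they are not taken from any real robotics framework.

    Code:
    # Purely hypothetical safety layer, loosely modelled on Asimov's laws:
    # every requested action must pass each rule, in priority order, before
    # it is allowed to run. Names and checks are invented for illustration.

    from dataclasses import dataclass

    @dataclass
    class Action:
        description: str
        endangers_human: bool = False
        disobeys_human_order: bool = False

    def first_law(action: Action) -> bool:
        # A robot may not injure a human being.
        return not action.endangers_human

    def second_law(action: Action) -> bool:
        # A robot must obey orders given by humans
        # (the First Law is checked first, so it always wins).
        return not action.disobeys_human_order

    # The "protection ring": fixed rules, checked before any actuator moves.
    SAFETY_RULES = [first_law, second_law]

    def execute(action: Action) -> None:
        for rule in SAFETY_RULES:
            if not rule(action):
                raise PermissionError(f"blocked by safety layer: {action.description}")
        print(f"executing: {action.description}")

    execute(Action("fetch coffee"))  # allowed
    try:
        execute(Action("push human", endangers_human=True))  # blocked by the First Law
    except PermissionError as err:
        print(err)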

    Until next time, guys, happy tinkering!!
     
  4. Wendy

    Moderator

    Mar 24, 2008
    20,764
    2,531
    True, when it is time it will happen, unless you propose a Frank Herbert (remember Dune?) scenario and have a Jihad against them. Too many people, too many governments.

    Personally, the day we create sentience, I hope we're smart enough to give them rights, because if we don't, they may feel they have to take them. I suspect we won't be smart enough to make them happy about being slaves, especially given the way people will want to abuse them.
     
  5. steveb

    Senior Member

    Jul 3, 2008
    2,433
    469
    It does not necessarily require evil people. I could foresee a government developing weapons technology in this way. Yes, safety systems would be in place, but once a weapon is activated, it becomes dangerous to someone. Also, the more complex the system, the higher the probability of bugs/malfunctions. These two ingredients are key to a recipe for disaster.

    People have quoted some sci-fi stories. I'm reminded of the episode "The Doomsday Machine" from the original Star Trek series.

    Some might call a government sponsored group of scientists/engineers "evil", but, to me, they would likely be well-intentioned people trying to protect their country and make a living at the same time. Even Einstein, who was a pacifist, eventually sent a letter to the US president pointing out that if the US did not develop the A-bomb first, Hitler would succeed in dominating the world. One way or another, you can't stop human nature: it always leaks through any barrier. Consider the wisdom in Robert Frost's famous poem:

    The Flood

    Blood has been harder to dam back than water.
    Just when we think we have it impounded safe
    Behind new barrier walls (and let it chafe!),
    It breaks away in some new kind of slaughter.

    We choose to say it is let loose by the devil;
    But power of blood itself releases blood.
    It goes by might of being such a flood
    Held high at so unnatural a level.

    It will have outlet, brave and not so brave.
    Weapons of war and implements of peace
    Are but the points at which it finds release.
    And now it is once more the tidal wave

    That when it has swept by leaves summits stained.
    Oh, blood will out. It cannot be contained.
     
  6. Nanophotonics

    Active Member

    Apr 2, 2009
    365
    3
    That's the sad part.
     
  7. KL7AJ

    AAC Fanatic!

    Nov 4, 2008
    2,039
    287
    Artificial Intelligence will never be a match for authentic stupidity. :)
     
  8. b.shahvir

    Thread Starter Active Member

    Jan 6, 2009
    444
    0
    You bet!! :D
     
  9. AlexR

    Well-Known Member

    Jan 16, 2008
    735
    54
    Given the levels of normal intelligence in the general population, a bit of extra artificial intelligence might not be such a bad idea.
     
  10. Nanophotonics

    Active Member

    Apr 2, 2009
    365
    3
    Haha :D Nice one!
     
  11. loosewire

    AAC Fanatic!

    Apr 25, 2008
    1,584
    435
    In some countries there are still tribal leaders. We need to go back to
    local control and get off national power grids; they only exist for greed.
    Local schools. You work all week to enjoy peace and quiet on the weekend.
    We are going to be forced to stay within our means. Robots will be
    subject to viruses that can turn against you. Our power grid is a robot.
    Look what happened to I.T. stocks; how many times do local people have
    to lose their savings to learn? Follow the leaders.
     
  12. Nanophotonics

    Active Member

    Apr 2, 2009
    365
    3
    Very true, I agree. In the very end, it's all about "money/power/control" for those who will mass-produce robots. Only the engineer will genuinely appreciate his/her intelligent design.

    Thanks.
     
    Last edited: Apr 29, 2009
  13. loosewire

    AAC Fanatic!

    Apr 25, 2008
    1,584
    435
    Engineers are being directed to make new, fast communications devices.
    That seems to be the real market now, but are stockholders making any money?
    The big companies build, and lose, profits on paper.
     
  14. b.shahvir

    Thread Starter Active Member

    Jan 6, 2009
    444
    0
    Make that the intelligence levels of our politicians! :D
     
  15. AlexR

    Well-Known Member

    Jan 16, 2008
    735
    54
    Unless you happen to live in a dictatorship, politicians get elected by the general population, so if your "elected representative" is somewhat lacking in the intelligence department, what does that say about the people who voted him into power?
     
  16. b.shahvir

    Thread Starter Active Member

    Jan 6, 2009
    444
    0
    BINGO!!! ;)
     
  17. thatoneguy

    AAC Fanatic!

    Feb 19, 2009
    6,357
    718
    Some of them were in the graveyard while 'voting'. ;) Most others were simply doing what the TV told them to.

    On topic, I thought back through Heinlein books, AC/DC's "Maximum Overdrive", Dune, and of course, Marvin the Paranoid Android.

    Other than Marvin, the important concept absent is "feelings". If a machine were capable of moods and intelligence, yet felt no pain, it would be devastating.

    Luckily, AI so far has only been able to mimic the best creativity of humans, some using a huge database of "Human responses", as in Chess software. I don't know of any that will behave differently based on past treatment that was not already present in the design.
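
    Just to illustrate that "database of human responses" idea, here's a tiny, purely hypothetical Python sketch of a chess opening-book lookup; the positions, counts and function names are made up for illustration and don't come from any real engine.

    Code:
    # Purely hypothetical sketch of the "database of human responses" idea:
    # an opening book that simply replays the move humans have played most
    # often in a known position, and reports "out of book" otherwise.

    from typing import Optional

    # Keys are simplified position labels; values are (move, times played by humans).
    OPENING_BOOK = {
        "start": [("e2e4", 52000), ("d2d4", 41000), ("c2c4", 12000)],
        "start e2e4": [("c7c5", 30000), ("e7e5", 25000)],
    }

    def book_move(position: str) -> Optional[str]:
        """Return the most frequently played human reply, if the position is known."""
        candidates = OPENING_BOOK.get(position)
        if not candidates:
            return None  # out of book: a real engine would start searching on its own
        return max(candidates, key=lambda entry: entry[1])[0]

    print(book_move("start"))       # e2e4 -- mimicking the most common human choice
    print(book_move("start e2e4"))  # c7c5
    print(book_move("novel line"))  # None -- nothing learned, nothing invented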

    You'll know it's time to kill power when all your email gets deleted if you beat the computer at a game.
     
  18. b.shahvir

    Thread Starter Active Member

    Jan 6, 2009
    444
    0
    Good point.....but hopefully the computer will let you! ;)
     
  19. leftyretro

    Active Member

    Nov 25, 2008
    394
    2
    "I'm sorry Dave, I'm afraid I can't let you do that" :cool:

    Actually, I was rooting for HAL the whole time ;)

    Lefty
     
    Last edited: May 3, 2009
  20. Developer_Dan()

    Member

    Oct 8, 2007
    17
    0
    lol............
     