Give a man a gun and he can rob a bank:
give a man a bank and he can rob the world.
@Bonoboism
Monday, June 4, 2012
RoboEthics
We, as a species, are always telling ourselves 'stories' about how we relate to the rest of life on Earth. Most of these stories get rehashed and repackaged through the ages, so one is familiar with them even if one disagrees with them. One of the main stories humans are telling themselves in the current epoch is this: we are the bringers forth of technology, the agents of the Earth's technological advance. To play that role, we must impart a certain amount of creative destruction - destroying in order to save. The Economist has this to say:
In the classic science-fiction film “2001”, the ship’s computer, HAL, faces a dilemma. His instructions require him both to fulfill the ship’s mission (investigating an artifact near Jupiter) and to keep the mission’s true purpose secret from the ship’s crew. To resolve the contradiction, he tries to kill the crew. As robots become more autonomous, the notion of computer-controlled machines facing ethical decisions is moving out of the realm of science fiction and into the real world. Society needs to find ways to ensure that they are better equipped to make moral judgments than HAL was.
Military technology, unsurprisingly, is at the forefront of the march towards self-determining machines. Its evolution is producing an extraordinary variety of species. There is a flying surveillance drone the weight of a wedding ring, and one that carries 2.7 tons of bombs. Robots are spreading in the civilian world, too, from the flight deck to the operating theatre. Passenger aircraft have long been able to land themselves. Driverless trains are commonplace. Fully self-driving vehicles are being tested around the world; Google’s driverless cars digitally map neighborhoods.
As they become smarter and more widespread, autonomous machines are bound to end up making life-or-death decisions in unpredictable situations, thus assuming - or at least appearing to assume - moral agency. Weapons systems currently have human operators “in the loop”; but as they grow more sophisticated, it will be possible to shift to “on the loop” operation with machines carrying out orders autonomously.
As that happens, they will be presented with ethical dilemmas. Should a drone fire on a house where a target is known to be hiding, which may also be sheltering civilians? Should a driverless car swerve to avoid pedestrians if that means hitting other vehicles or endangering its occupants? Should a robot involved in disaster recovery tell people the truth about what is happening if that risks causing a panic? Such questions have led to the emergence of the field of “machine ethics”, which aims to give machines the ability to make such choices appropriately - in other words, to tell right from wrong.
One way of dealing with these difficult questions is to avoid them altogether, by banning autonomous robots. But autonomous robots could do much more good than harm. Instead, society needs to develop ways of dealing with the ethics of robotics - and get going fast. The best-known set of guidelines for robo-ethics is the “three laws of robotics” coined by Isaac Asimov, a science-fiction writer. The laws require robots to protect humans, obey orders and preserve themselves - in that order. Unfortunately, the laws are of little use in the real world.
Regulating the development and use of autonomous robots will require a rather more elaborate framework. Progress is needed in three areas in particular: (i) in order to allocate responsibility, autonomous systems must keep detailed logs so that they can explain the reasoning behind their decisions when necessary; (ii) where ethical systems are embedded into robots, the judgments they make need to be ones that seem right to most people; and (iii) more collaboration is needed to draw up new rules for robots that are left to their own devices.
Both ethicists and engineers stand to benefit from working together: ethicists may gain a greater understanding of their field by trying to teach ethics to machines, and engineers need to reassure society that they are not taking any ethical short-cuts. Technology has driven mankind’s progress, but each new advance has posed troubling new questions. Autonomous machines are no different. The sooner the questions of moral agency they raise are answered, the easier it will be for mankind to enjoy the benefits that they will undoubtedly bring.
Wednesday, May 30, 2012
Speaking Body Language
Before a critical meeting with your boss, an important customer, or your teenage son or daughter, do you spend time mentally roughing out and revising what you are going to say? If so, you are misdirecting your energy. Spend that time instead practicing how to walk, stand and sit, and learning to quickly grasp how other people are moving their bodies.
When it comes to research on power, there is plenty of evidence of the importance of body posture and tone of voice. That research indicates that it isn’t the quality of an argument that persuades people; it is how the protagonist conveys it. Our status is determined by physical attributes and nonverbal cues. People decide whether we are competent in less than 100 milliseconds.
Nor is intelligence a strong predictor of leadership. The pitch, volume and pace of your voice affect what people think you said about five times as much as the actual words you use. We are, however, impressed with our own arguments. So it sometimes pays to repeat back to someone you are trying to impress what he or she has said.
To prepare for questions of authority in the job market, MBA students now team up with drama teachers, who help them practice. Actors, it turns out, are exceptionally good at paying attention to other actors. The idea is to keep MBA graduates from getting into professional or personal trouble later in their careers because they don’t know how to be deferential to other people when it is appropriate - or how to take charge when that is called for.
There are times when you want to play 'low status', which means you are making the relationship work without necessarily giving anything important away. 'Playing low' can lift others up and make them feel good about themselves. For 'high-status' people in an organization, telling a self-deprecating joke can make you more approachable.
Many of us know the boss who says: “My door is always open,” but their body language adds “but really, don’t come in here.” In facing a subordinate who doesn’t know he or she is a subordinate, a few extra moments of silence can send the right signal.
Your posture affects you as well as other people. Try this: sit down, press your knees together, hold your elbows close to your sides and lean forward. While in that position, say: "I am totally in charge." You will probably find it incongruous and won't believe your own words. But if you were to lower your shoulders, drape an outstretched arm over the back of a chair and spread your legs wide, taking up more space - now that would feel different.
High-status people generally let their bodies take up more space than low-status people do. That alone makes them both appear and feel relaxed. Like others, you probably know this at a gut level. There is a body language of power, and we know it. But we don't know that we know it.
Thursday, May 10, 2012
Playing With Perception
All of us want to affect the world for the better. We look around us and see problems - crime, pollution, child abuse - and because we're a society of rules, we believe that laws and religious commandments will help us make changes. For example, we elect legislators who pass more and more laws every year in the hopes that these rules will make citizens' lives better. By contrast, the ancient Greeks were a people of the concept - they knew that there was nothing as powerful as an idea whose time has come. Because they manipulated ideas so elegantly, they were able to invent democracy, develop philosophy and systematize mathematics.
Their Roman neighbors, on the other hand, were great lawmakers, and Roman codes have influenced many modern Western laws. When faced with problems, the Greek philosophers conceptualized new systems, while the Romans called on their armies to enforce the precepts. The wisdom practitioners of the indigenous world, however, don't live by rules or ideas. If they want to change their world, they don't pass new laws or come up with new theories. Instead, they choose to change the way they perceive a problem. By changing their perception, they transform a challenge into an opportunity.
This practice enables them to change their perceptions and 'dream' their world into being, experiencing events in such a way that they no longer take life personally. At this level of perception, things no longer happen to you; they simply happen. It doesn't rain on you to make you wet; it simply rains. When you change your perception of the events you experience, you also alter the way those situations live within you. You are no longer the cause or the effect of anything, and you sense a tremendous relief, because the world is exactly as it should be - and it doesn't need you to fix it.
In the West, we tend to associate our perception with the dozens of states of awareness we're familiar with. For example, we're in one mode of awareness when we're just waking up or drifting off to sleep, another when we're in reverie, another when we're enraged, and so on. In each one, a different part of the brain is active - so we refer to them as 'states of consciousness', which are products of the mind. Perceptual levels, on the other hand, exist independently of the mind.
There are four perceptual levels in the indigenous world. These levels correspond to the four domains of manifestation: the physical world (our body), the realm of thoughts and ideas (mind), the realm of myth (soul) and the world of spirit (energy). These perceptual levels are associated with the four energetic bodies that make up the human energy field. They're stacked inside each other like Russian nesting dolls, with the physical body innermost, the mental body enveloping and informing the physical shell, the soul enveloping the mental and physical, and the spiritual body outermost, informing and organizing them all like a blueprint.
When we shift from one level of perception up to the next, we retain our ability to function at the lower realm, but we have a much wider view of what we're experiencing. It's like the old story about a traveler who comes across two stonecutters. He asks the first, "What are you doing?" and receives the reply, "Squaring the stone." He then walks over to the second stonecutter and asks, "What are you doing?" and receives the reply, "I am building a cathedral." In other words, both men are performing the same task, but one of them is aware that he has the choice to be part of a greater dream.
Albert Einstein once said that the problems we face in life cannot be solved at the level where they were created. To that end, being able to shift to a higher realm of perception can help us find solutions to problems where before we experienced only distress and separation. In this way, we learn that there is a solution to every problem we encounter - in the physical world, in the mind and in the soul. We learn that we can't eliminate scarcity in our lives simply by getting another job, or heal feelings of abandonment or anger merely by understanding our childhood wounds. We can only fix these problems at the level above the one in which they were created.