Over the last two decades, the field of genomics has advanced tremendously thanks to improvements in DNA sequencing technology. The first human genome sequence took fifteen years and three billion dollars to produce, while today a sequence can be obtained in four days for less than $1,000. Traditionally, medical research has aimed to provide "one size fits all" drugs able to treat any person. However, complex diseases such as cancer have shown that this approach can have significant downsides and that personalized treatments are needed. Decreasing sequencing costs bring the vision of personalized medicine closer than ever before, as doctors will be able to use their patients' DNA sequences to provide more specific treatments. However, as these technologies become more widely used in research and medicine, many ethical questions will arise about their use. The following is a discussion of two of these questions: should research participants receive their sequencing results, and should their relatives be involved?

 

Should research participants receive sequencing results?

When the three-billion-base-pair human genome is sequenced, the results are produced as large text files that are incomprehensible without analysis. These results contain sensitive information about the subject's normal genomic variation as well as disease risk factors. For sequencing results to be useful, proper analysis in a laboratory setting is required; even then, the process is expensive and prone to misinterpretation, even for those with scientific training. In the case of a participant with a deterministic mutation, one that will definitely cause a disease, such as Huntington's disease, the answer may be clear: disclose the results if requested. However, DNA mutations are rarely deterministic. More often than not, multiple gene mutations along with certain environmental factors are required to cause a disease. This nuance is rarely conveyed by the media when genomic discoveries are reported; it is often stated instead that scientists have discovered a gene "for" intelligence, obesity, diabetes, and so on. Thus, presenting these results to research participants may not be appropriate, given the lack of certainty for the majority of the data.

Such results could also have serious, long-lasting effects on a participant's or patient's quality of life. If a patient's DNA contains risk factors associated with cancer, his or her doctor may recommend MRI, CT, and other scans every six months. To the patient, this may seem like a definite truth: that he or she will certainly get cancer at some point, and that the only possible choice is more screening. However, a person's DNA can contain multiple cancer risk factors without ever causing a tumor. Even for deterministic mutations, a person's quality of life can be affected simply by knowledge of their existence. If a person carried the mutation for Huntington's disease, decisions about every aspect of his or her life might be dramatically influenced by that knowledge. When his genome was sequenced, James Watson, widely considered the father of molecular biology and one of the discoverers of the structure of DNA, chose not to learn whether he was at risk of Alzheimer's or other late-onset neurodegenerative disorders.
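
To make the point about raw data concrete, here is a minimal, hypothetical sketch in Python of how a handful of variant records might be triaged into deterministic findings versus mere risk factors. The record format and annotation labels are invented for illustration; real sequencing output uses standardized formats such as VCF and requires far more elaborate, expert annotation before any interpretation is possible.

# Hypothetical, simplified variant records; real sequencing output is vastly
# larger and must be professionally annotated before interpretation.
variants = [
    {"gene": "HTT",   "effect": "deterministic", "condition": "Huntington's disease"},
    {"gene": "BRCA1", "effect": "risk_factor",   "condition": "breast/ovarian cancer"},
    {"gene": "APOE",  "effect": "risk_factor",   "condition": "late-onset Alzheimer's"},
]

def triage(records):
    """Separate near-certain findings from probabilistic risk factors."""
    deterministic = [r for r in records if r["effect"] == "deterministic"]
    risk_factors = [r for r in records if r["effect"] == "risk_factor"]
    return deterministic, risk_factors

certain, uncertain = triage(variants)
print(len(certain), "deterministic finding(s);", len(uncertain), "probabilistic risk factor(s)")

Even this toy triage hides the real difficulty: deciding which findings are reportable and how to communicate their uncertainty to a participant.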

As the cost of sequencing decreases, its commercial use will also spread. One of the earliest companies to offer genetic testing to the public was 23andMe. Just by providing a saliva sample, consumers could obtain an analysis of their DNA, annotated with explanations only for functionally characterized areas of the genome. The FDA halted this service in 2013, claiming that 23andMe's annotations had not been properly validated. This is an extension of the same problem discussed above: even with 23andMe's analysis, the results were not appropriately conveyed to consumers. This is drastically different from standard medical tests, where errors of interpretation are much rarer. Regardless of the accuracy of its results, 23andMe required consumers to sign a consent form allowing the anonymous disclosure of their DNA data to research databases.

 

Should relatives of participants be involved?

Due to the inherited nature of DNA, relatives of research participants can be affected by sequencing. Should they, then, be involved in the research process? Currently, relatives are not considered research subjects and their consent is not requested. However, while relatives cannot yet be personally identified from a participant's DNA, many of their physical characteristics and inherited diseases can be predicted. The researchers who sequence that DNA may well use the results only for their intended purposes. However, sequences could be used in the future to measure characteristics to which neither the participant nor the relatives agreed, such as intelligence or personality traits. Currently, any DNA sequence used for research purposes is incorporated anonymously into online databases. But as the field of genetics advances, predicting relatives' DNA sequences and identifying them through the participant's DNA may become possible, stripping the anonymity from those sequences. This could be exploited by employers as well as insurance companies, raising ethical and legal issues for the relatives as well as the participant. For example, insurance companies might force those with deterministic mutations to pay excessively high premiums, knowing that they will be afflicted with a disease at some point in their lives.

DNA sequencing is the most revolutionary medical tool since the invention of antibiotics, and its effects will reach much further. Questions about privacy, informed consent, and the sharing of DNA sequencing results will continue to arise, as these aspects directly affect participants or patients and indirectly affect their relatives. Moreover, this technology will heavily influence preventative medical care by promoting more testing for those with disease risk factors and less testing for those without. Current sequencing techniques produce massive amounts of data, of which only a fraction is understood. Regardless of the context, clinical or research, the solutions to the ethical problems that arise will be of great importance in setting standards for the future use of this technology.


There are famous architects whose names are probably well known even to people far from the architectural profession: Zaha Hadid, Norman Foster, Frank Gehry, Santiago Calatrava, Renzo Piano, Jean Nouvel, and quite a few others. They have received Pritzker Prizes, and their creations are distributed around the world on postcards and tourist brochures. After all, the buildings that we construct now and that will stand for years to come will represent our times to future generations, right? Maybe that is true, and nothing else matters except a masterpiece left for centuries to admire, argue about, or even hate, but never to remain neutral about.

It is true that it is much easier to criticize someone else's work than to create your own. It is also true that great minds usually face negativity and jealousy, so we should be grateful for those who, in spite of criticism and misunderstanding, create architecture that is, if not beautiful, then definitely unusual: what we now call signature architecture.

I do not want to discuss the uniqueness, ugliness, or beauty of starchitects' creations, but rather to look at what is behind them and how a design affects lives before, during, and after it becomes a structure. Several publications in Architectural Record caught my attention recently. There is a certain trend among architects that can be summarized in one phrase: 'I gave you a beautiful design of a structure or building – deal with it. It is not my responsibility how you build, maintain, and use it.' I ask myself: should an architect consider, or at least take into account, the complexity of the building process and the costs, both financial and human, that it will demand? Should he or she care how the building was erected and what the process required?

"It's not my duty as an architect to look at it," Zaha Hadid told The Guardian in February 2014 (http://www.vanityfair.com/online/daily/2014/08/zaha‐hadid‐worker‐conditions‐lawsuit). She said this after being asked about the fact that more than 500 Indian and 382 Nepalese workers had died over the previous two years in preparations for the 2022 FIFA World Cup in Qatar. Zaha Hadid designed a stadium for the event on which some of those people may have worked, but according to her, she has "… nothing to do with the workers." Is that right? It is true that the client chooses a construction company, which is responsible for hiring workers and managing the whole construction process, but an architect's responsibility is not over once a building is designed. Architects have to maintain proper observation of the site and the construction schedule to ensure that the work is done properly and to approve necessary adjustments. This is especially true when a project is led by a famous architect who carries enormous weight in the decision-making process. A client who has paid a very high fee to a starchitect still wants to save money and have the work done within a timeframe and budget, which leads to hiring workers from poor social classes, very often immigrants from neighboring, less developed countries. Construction companies prefer to hire a larger number of workers, which is cheaper than improving their construction technology and processes. An architect is involved in selecting the construction company and can make a difference if he or she wants to, which means an architect does "have something to do with the workers", especially a prominent and accomplished one. As Mabel O. Wilson, a co-founder of the Who Builds Your Architecture? group, says: "Zaha does have some leverage, precisely because she is a highly visible person." (http://archrecord.construction.com/features/2014/1406‐architecture‐and‐labor.asp)

Another characteristic of "signature architecture" is the complexity of the building structure, the schedule requirements, and the challenging materials. The difficult and demanding assembly of complex structures poses a high risk of mistakes during construction and requires professional, trained personnel not only at the managerial level but at all levels of the workforce involved. Labor conditions and housing for construction workers are directly connected to workers' health and should be commensurate with the complexity of the work they have to accomplish within a tight timeframe.

There are promising efforts in the architectural community to raise awareness of ethical concerns in the profession. One example is a group in New York called Who Builds Your Architecture?; another is the situation in Qatar, which is now being monitored by the United Nations Human Rights Council. Frank Gehry's efforts on the Abu Dhabi construction site of his Guggenheim Museum extension are yet another example of how change is possible when there is a willingness to make it. Gehry's firm is working with local officials to improve the situation there (http://archrecord.construction.com/news/2014/09/140922‐Frank‐Gehry‐Works‐to‐Improve‐Worker‐Conditions‐on‐Abu‐Dhabi‐Site.asp). Those actions demonstrate that voices can be heard and can make a difference, not only in the construction industry but in society, creating a ripple effect that enables social and cultural sustainability through personally responsible behavior.


Dr. Paula Stephan of Georgia State University spoke on September 10, 2014, regarding economic influences on scientific research in America.  She said that the economics of research is a balance of incentives versus costs, simple factors that affect the pursuit of scientific knowledge.

For instance, ninety percent of all research in animals involves mice.  But what do mice cost?  The answer is anywhere from $60 to $3500, depending upon the type of mouse and disease to be studied.  In fact, the need for certainty and for integrity in the selection of those animals is so great that a single breeder has emerged as the best source for those pursuing research.

And acquiring the mice is not the only expense.  The maintenance cost is 10 to 18 cents per mouse per day.  Dr. Stephan estimated that $1 billion is spent annually on keeping laboratory mice in America.

A company called Cyagen breeds mice in China, charging as much as $28,000 for a breeding pair.
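
As a quick back-of-envelope check using only the figures quoted above (the implied colony size is an extrapolation from those numbers, not a figure Dr. Stephan reported):

# Back-of-envelope arithmetic from the figures quoted above.
daily_low, daily_high = 0.10, 0.18        # maintenance cost per mouse per day, in dollars
annual_spend = 1_000_000_000              # ~$1 billion per year on upkeep (estimate cited above)

annual_low = daily_low * 365              # ~$36.50 per mouse per year
annual_high = daily_high * 365            # ~$65.70 per mouse per year

# If the $1 billion estimate holds, it implies roughly this many mice nationwide.
implied_max = annual_spend / annual_low   # ~27 million mice
implied_min = annual_spend / annual_high  # ~15 million mice

print(f"Upkeep per mouse per year: ${annual_low:.2f} to ${annual_high:.2f}")
print(f"Implied colony size: about {implied_min/1e6:.0f} to {implied_max/1e6:.0f} million mice")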

The National Institutes of Health (NIH) provides 60 percent of all research funding in America.  Its budget doubled between 1998 and 2002.

About $60 billion is spent on scientific research in America each year.  But, Dr. Stephan said, this has led American universities to become so focused on getting funding that competing research (research which either does not qualify for funding or does not pass the scrutiny of government agencies) goes wanting.  The result is that universities here have become like shopping malls, where students wanting to do research are limited to the projects being funded at their schools.

This has resulted in a relatively narrow focus of research and a relative glut of doctoral students who study, more and more, the projects they are told to research, and less and less the projects that truly interest them.

In an ironic piece of scheduling, the University of Houston presented a seminar on September 12, 2014, on how graduate students could qualify for National Science Foundation grants during their education.  A panel of professors, some with a history of funded projects and some who have served on screening committees, told students how to write funding requests, and how far a project must be tailored before a screening committee will even consider it, much less actually fund it.

With Dr. Stephan's observations fresh in mind, it seems that projects must pass through many filters before being funded.  As those filters sift through an idea, it becomes more an image of the filters and less the idea originally seen by the student.

View the webcast of Dr. Paula Stephan’s seminar

Link to NSF Research Fellowship Seminar


One field of science that has been booming in recent decades is nanotechnology. It is becoming increasingly prevalent across all areas of study. In the medical field, innovations like targeted therapeutics will help with the treatment of diseases like cancer: instead of destroying healthy and diseased cells alike, as chemotherapy does, only the targeted malignant cells would be destroyed. Other innovations include brain replication. The human brain is a very complicated machine, and replicating it is a challenging and risky endeavor. However, some interesting technological advancements are being made in this arena.

“On the hardware side, Gizmag reported advancements with neuromorphic chips which aim to reverse engineer the brain. This goal is trying to be reached through ‘an interdisciplinary amalgam of neuroscience, biology, computer science and a number of other fields that attempts first to understand how the brain manipulates information, and then to replicate the same processes on a computer chip.’” [1]. Since all hardware is useless without software, “researchers at IBM have created a new software program called Corelet which they say operates like the human brain when neuromorphic chips are used for processing” [1].
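
Corelet and the chips it targets are IBM's own work, but the basic unit such hardware tries to emulate can be illustrated in a few lines. The sketch below is a generic leaky integrate-and-fire neuron, a standard textbook model rather than a description of IBM's chip or software, and its parameters are arbitrary.

# A generic leaky integrate-and-fire neuron: the kind of spiking unit
# neuromorphic chips emulate. A textbook toy model, not IBM's design.
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Leak and integrate the input over time; fire (1) when the membrane
    potential crosses the threshold, then reset to zero."""
    potential, spikes = 0.0, []
    for current in inputs:
        potential = potential * leak + current   # leaky integration
        if potential >= threshold:
            spikes.append(1)                     # spike
            potential = 0.0                      # reset after firing
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([0.3, 0.4, 0.5, 0.1, 0.9, 0.2]))  # -> [0, 0, 1, 0, 0, 1]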

These innovations are extremely exciting and look like something out of today's sci-fi movies. As a computer scientist, I can only dream of writing code for the human brain. It should be noted that these innovations are at a very early stage of development, but they hold the potential for great benefits and, at the same time, for enormous misuse. Imagine a point of failure through which your brain is 'hacked' by someone else. Today, if just our email account gets hacked, we run around in a panic; imagine how helpless we would be if our brains were hacked. Apart from the security aspect, we need to consider the ethical and moral aspects as well. Our natural brains are remarkably capable, and who is to say that human-designed brains will be moral and ethical?

Scientific and technological advancement is only valuable if done for the right reasons and with the right care. If nanotechnology is used to help with the early diagnosis and treatment of Alzheimer's disease or the treatment of cancer, then it is a worthy research endeavor. Today, technological and scientific advancement is happening at a very fast pace; gene manipulation, tissue engineering, and similar areas are extremely high-profile. Therefore, now more than ever, it is extremely important to conduct research with proper scientific methodology, and it is of the utmost importance to be transparent about the data and analysis of every study conducted.


Researchers from North Carolina State University have recently developed a wireless biological remote-sensing interface that lets you control the movement of your very own cockroach, or, let's say, a "Robo-roach"! Dr. Alper Bozkurt, an Assistant Professor in the Department of Electrical and Computer Engineering, and his project team came up with a device that puts cockroaches into autopilot mode by mounting a small computer chip on the roach's back and implanting electrodes into its antennae. Using a remote control, Bozkurt and his colleagues can direct the Madagascar hissing cockroaches where to crawl [2].

According to Dr. Bozkurt, the aim of the project is to create a robust wireless biological interface with the cockroach, which could ultimately infiltrate small, difficult, remote places. These roaches could carry tiny sensors, cameras, and microphones to collect and transmit information in challenging situations, for example, finding survivors in a building destroyed by an earthquake. Building robots at this small scale for such uncertain, dynamic situations is very challenging and expensive, whereas biobot cockroaches could easily be a better alternative: they are experts at performing in such hostile environments, they come with their own natural power supply, they live a long time (almost two years), and the slow movement of the Madagascar cockroach lends itself to steady control.
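
As a rough illustration of the control idea described above: steering works by stimulating one antenna so that the insect, sensing a phantom obstacle, turns the other way. The class and method names below are invented for illustration only; they are not the NCSU team's actual interface.

# Hypothetical sketch of the steering logic: a pulse to one antenna mimics
# contact with an obstacle, so the roach turns away from it. Names are
# invented for illustration; this is not the actual NCSU interface.
class BiobotController:
    def __init__(self, radio):
        self.radio = radio

    def turn(self, direction):
        # To turn LEFT, stimulate the RIGHT antenna (and vice versa).
        antenna = "right" if direction == "left" else "left"
        self.radio.send_pulse(target=antenna, duration_ms=50)  # brief, low-level pulse

class FakeRadio:
    """Stand-in for the wireless backpack link, for demonstration only."""
    def send_pulse(self, target, duration_ms):
        print(f"pulse -> {target} antenna for {duration_ms} ms")

BiobotController(FakeRadio()).turn("left")  # prints: pulse -> right antenna for 50 ms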

The project is definitely driven by a noble and brilliant idea, but at the same time it creates some controversy. First of all, like any other animal model used in scientific and medical research, especially one involving surgical intervention, it revives the old debate over how ethical it is to inflict pain on animals for the benefit of scientific research. The cockroaches are properly anesthetized during the chip implantation, and the electric impulses sent through their antennae to stimulate movement are tiny, no more than what is needed to drive the neural circuitry, just like the insect's own natural impulses. So there is no fear of electrocuting the insect or forcing it to respond to signals out of pain, claims Dr. Coby Schal, a professor of entomology at NCSU who collaborates with Dr. Bozkurt on the project and looks after the ethical side of this biological research. Bozkurt hopes his work will one day mark a time when insects can be used as large mammals were in earlier stages of human civilization. "Information is the new payload," he said. "In the past, people lived and worked with large mammals like horses and oxen to build entire civilizations. Now, with the technology that we have, insects can carry information that can advance civilization." [3]

Michelle Rafacz, an assistant professor of biology in the Columbia Science and Math Department, also supports this research and believes creating biobots has important social implications. "Socially, this is extremely practical," she said. "People do not respect insects in terms of their complexity. They have an incredible sense of smell and sensory capabilities that can play a huge role when it comes to helping people." [3]

Recently, based on this research, Backyard Brains of Ann Arbor, Michigan, has developed a robotic "backpack" called the RoboRoach, selling for $99, which can be attached to a cockroach so that, with a handy iPhone app, you can steer your own cyborg cockroach left or right. The company hopes the app will inspire a new generation of youngsters, prompting them to become curious about the wonders of neuroscience and perhaps someday to cure neurological ailments like Alzheimer's or Parkinson's disease. The kit comes with guidelines on how to perform the "surgery" and handle the roaches with proper hygiene, and it looks promising for laboratory demonstrations to kids under expert supervision. But the whole idea of selling this kit at a relatively cheap price on Amazon raises fears of heavy misuse and makes it look like a marketing strategy to profit from selling fancy new "toys" to kids.

Bioethicist Gregory Kaebnick of the Hastings Center in New York told Science magazine that the RoboRoach "gives you a way of playing with living things" and that he finds the product "unpleasant" [4]. Many claim that performing amateur surgery on insects is cruel and exactly the opposite of progressive science. The company also admits it has received emails saying the RoboRoach teaches "kids to be psychopaths" [4]. While some pointed out that almost all of us would scream for pest control as soon as we saw roaches in our house, or would happily flush them down the toilet, a few argued that "Killing a cockroach is one thing. Torturing it like this FOR FUN is another thing altogether." [4] And we should not forget that with such easy access to a cheap personal biobot, it would not be difficult to use this technology for serious misconduct like spying, robbery, or even terrorist attacks. How much good the RoboRoach will bring to our society is still a matter of hot debate.

Video link : http://media.brisbanetimes.com.au/technology/tech-talk/iphone-controls-cyborg-cockroach-4822085.html

The cyborg cockroaches point to a larger issue: the bioethics of animals and cyborgs. In an interview with the NewsHour, Emily Anthes, the author of "Frankenstein's Cat," saw developments like the cyborg cockroach as inevitable. She said, "I think this meshing of the biotic and the abiotic of living and machine is really the future of biotechnology. And we're going to see a lot more animals and, frankly, humans that have electronic components integrated with their bodies." [3] So it looks like the RoboRoaches are here to stay, and they are going to bring a lot more controversies with them in the future.

References:
1. http://ibionics.ece.ncsu.edu/assets/EMBC_12.pdf
2. http://www.ece.ncsu.edu/news/21621
3. http://columbiachronicle.com/cyborg-insects-may-be-tomorrow%E2%80%99s-heroes/
4. http://www.dailymail.co.uk/sciencetech/article-2449562/The-app-lets-control-COCKROACH.html
5. http://www.pbs.org/newshour/rundown/2013/10/cyborg-cockroaches-theyre-alive.html


"Publish or Perish" has become the norm in academia these days. Not only that: to enter academia at all, one needs to show a list of publications before even being considered for a post. Whether the person with the higher quantity or the higher quality of publications gets an academic post depends entirely on the mindset of the committee deciding who gets the job. After obtaining an academic post, say an assistant professorship, that person has to prove himself again to climb the higher rungs of the academic ladder. All of this, from getting a job in the first place to reaching and surviving at a peak level such as tenure, requires "publications".

In order to get as many publications as possible, most people in academia would like to be in big labs with lots of research going on. There will be many graduate students, post-doctoral researchers (post-docs), research professors, and others in such a lab. With many experiments being conducted by various people comes the ethical dilemma of who gets to be an author, and who does not, on the research papers published from that lab. As an example, take the "boss", the professor who runs the lab. There are instances where generating the ideas, performing the required experiments, and writing and revising the paper are mostly done by the graduate students and post-docs. The so-called boss, the Principal Investigator (PI), just attracts funding for the lab or does some editing at the end when the manuscript is ready. So, does that entitle him to authorship? Some may argue that without his money nothing could have been done. That may be true in one sense, but if there are no regular meetings, no regular input from the PI, and funding is the only deciding factor, then I think it would be better to list him as a contributor rather than an author of the manuscript.

In medical science, the International Committee of Medical Journal Editors (ICMJE) is a respected group whose recommendations are followed by many of the world's leading medical journals. It publishes the Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals, which define an "author" as a person who fulfills all four of the following criteria:

• Substantial contributions to the conception or design of the work; or the acquisition, analysis, or interpretation of data for the work;
• Drafting the work or revising it critically for important intellectual content;
• Final approval of the version to be published;
• Agreement to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.
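
A minimal sketch of this "all four criteria" rule follows; it only encodes the checklist, since the genuinely hard part, judging what counts as a "substantial contribution", cannot be decided by a script.

# Sketch of the ICMJE rule: a person qualifies as an author only if
# all four criteria listed above are met.
ICMJE_CRITERIA = (
    "substantial_contribution",      # conception/design or data acquisition/analysis/interpretation
    "drafting_or_critical_revision",
    "final_approval",
    "accountability",
)

def qualifies_as_author(contributions):
    """Return True only if every one of the four criteria is satisfied."""
    return all(contributions.get(c, False) for c in ICMJE_CRITERIA)

# A PI who only secured funding fails criterion 1, so authorship is not justified.
pi = {"substantial_contribution": False, "drafting_or_critical_revision": True,
      "final_approval": True, "accountability": True}
print(qualifies_as_author(pi))  # False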

Now comes the problem of whether all the authors of a manuscript have actually fulfilled these criteria. There might be scientists who looked at only certain aspects of the work, such as the statistics, or someone, such as a lab technician or a graduate student, who only conducted the experiments. So the question arises: is it ethical to give authorship to all the personnel involved, from those who did only technical work and provided no scientific input to those who worked on every aspect, from the start of the experiments to the writing of the paper? There is no clear definition of what actually counts as a "substantial contribution" under the ICMJE criteria, and no paper is going to be published without the name of the professor who runs the lab; whether he always provides that substantial contribution is a big question. Similarly, to recruit volunteers for a study, a lab might get help from other clinicians, who may also want authorship in return for that help. There might be a person who merely edited the paper and did not contribute much to the experiments or other parts of the work, but who is an influential name in that area of research; some researchers may want to put his name on the manuscript so that it gets published.

There are various forms of unethical practice like these, termed gift authorship (authorship given as a favor), honorary authorship (given out of respect), prestige authorship (given to borrow prestige), and, worst of all, ghost authorship, in which the people who actually did the research and writing are hidden and a named "author" who did none of the work appears on the paper instead. Ghost authorship is most prevalent in the pharmaceutical industry, where it lets a company claim that some big-shot doctor has found its drug to be very good when in fact that person did nothing and all the work was done by the company's employees. These are some of the unethical practices that plague the medical literature.

So how can this be stopped? When pride, money, and the desire to succeed by any means fill the mind of a scientist, there are many ways he can be unethical. Some journals have started asking for authors' contributions to be itemized according to the part of the work each one did. Some may have designed the experiments, others conducted them, and yet others analyzed the results. Listing contributions this way at least helps readers know which author worked on which part. Although not fully error-free, this is one step toward lessening unethical authorship practices.

Finding out how much an author actually contributed to a manuscript would be a major research project in itself for journal editors; with the volume of articles being submitted, it is virtually impossible. Thus, it finally comes down to the scientist himself whether to be an ethical or an unethical author. The culture of "publish or perish" has raised big questions about the ethics of authorship. More methods that can effectively check and balance unethical authorship need to be researched and implemented.


The brain is the most complex organ in our bodies. It consists of billions of interconnected neurons that work together to give us our personalities, motivations, memories, and more. The brain is capable of things beyond our imagination; we have only started to see what it might be capable of doing. However, the complexity of the brain also makes it a very difficult organ to study, because we are still unaware of all of its functions. As scientists, we have to be extremely careful about how we deal with the brain and what advances we make public, because any wrong decision could become detrimental and costly for the human race.

In recent years there has been huge advancement in the field of neuroscience, and this advancement has led to areas of brain research that may be ethically questionable. Neuroscience has now gone beyond clinical applications into a variety of new areas that were once well beyond our imagination. From the measurement of mental processes using functional imaging to the manipulation of the brain using selective drugs, the new capabilities of neuroscience raise many ethical and social issues and require us to question how far we will go before putting individuals' lives in danger. Technological advances in neuroscience have led to innovations in medicine with therapeutic as well as non-therapeutic implications that extend beyond the areas scientists have explored. We are coming to a point in history where the technology we have invented can go beyond helping just the medical community. The questions scientists need to ask now are: what ethical issues arise from these innovations, and what ethical standards should be applied to brain research?

Research done on the brain thus far has greatly improved our ability to understand and treat people with neurological disorders. We now understand many neurological disorders and have come up with various treatments for them. There are various drugs on the market currently used to improve the mood, cognition, or behavior of people with problems in these areas. However, these drugs are now starting to gain the interest of the general public. People are starting to experiment with them to see how they might enhance their normal brain functions. From a scientific perspective, this growing interest is very dangerous and can lead to many problems in the future. These drugs, which are meant for people with mental disorders, can enhance healthy people's brain activity and therefore have the capacity to interfere with normal brain function. Using these drugs, healthy people can focus more sharply, be cheerful all the time, and even have enhanced memory. What might happen if the industry decided to sell these drugs, which are supposed to help individuals with neurological problems, to ordinary people? Will society come to the point where we are medicalizing normal behavior?

Drugs like Ritalin and Adderall are currently used to improve the attention of people with ADHD, but they are also known to enhance attention in healthy individuals. Surveys have shown that many Americans are now buying these drugs from people they know or from dealers in order to enhance brain function in their daily lives. Everyone from college students to office employees is using these drugs to gain some kind of advantage and get ahead of their peers. So far, only a few people know about the effects of these drugs and can purchase them. However, it will not be long before these drugs are offered to the public for brain enhancement. Even though the effects of these drugs sound great, and we would all love to use them in our daily lives, there are many ethical issues that scientists need to think about and address before making them commercially available to the general public. These drugs seem to help people with neurological disorders, but we have little research on their long-term effects even for those patients. We are far from knowing how these drugs will affect healthy people and therefore cannot allow open access to them for brain enhancement.

As we have seen with other scientific advancements in the past, it takes a very long time for the scientific community to become fully aware of the negative consequences associated with an advancement. With tobacco, for instance, thousands of lives were lost to lung cancer before it became well known that tobacco causes cancer. Tobacco was made available to the public before its effects were fully understood by the scientific community, and this lack of knowledge led to many people dying unknowingly. With brain-enhancement drugs, are we ready to take that same risk again? These drugs work on the organ that is the seat of our knowledge, the organ that controls our every move. Can we really afford to make the same mistakes again? Side effects and unintended consequences are always a concern with any type of drug, but neuroscience-based enhancement involves intervening in a far more complex system, and therefore we face an even greater risk when we interfere with brain function. With the little knowledge we have about the long-term side effects of these drugs, it is unsafe to make them publicly available. If we have learned anything as scientists from our past, we know that those who profit from these drugs will always claim that they are safe and will find ways to prove themselves right. We, as scientists, cannot put the lives of millions of people at risk before there is enough proof that these drugs will cause no harm and will only help those who take them.

Neuroscience has also made great advancements in the area of brain imaging. Even though we are not at a point where we can read minds using imaging, the advances taking place suggest that we are not far from reaching this goal. Neuroimaging techniques are becoming so advanced that it is now possible to infer not only people's mental states but also their unconscious attitudes and predispositions. Recent fMRI studies show that measurements can now be obtained for complex human processes such as decision making, moral and non-moral social judgments, and even personality.

Scientists are now working toward a society in which everyone is transparent and no one can hide bad intentions or cause harm. However, these advancements raise many critical ethical issues that need to be addressed. Brain processes and thinking are a very private matter, and these advances could jeopardize people's personal identities and privacy in the future. Imagine a legal system in which you are forced to have your brain scanned to see whether you are lying or telling the truth. Imagine your employer routinely performing brain scans to gauge your intelligence, mood, and even criminal intentions. Is the public ready to live in a world where everything they think about can be easily accessed by those with the authority to do so? Should authorities even be given the power and the means to carry out such actions? As scientists, it is our job to look at the long-term consequences and implications of such advancements. We have to realize that these advancements in neuroscience can have many negative consequences that could put the public in harm's way and take away their privacy.

Techniques for manipulating the brain and its functions are advancing at a very fast pace, and few people are looking into the consequences associated with these advancements. We do not know how the different systems of the brain interact with each other, or the consequences of intervening in normal brain function. We are not aware of how these interventions affect human beliefs, desires, intentions, emotions, memory, and so on. Should we even be allowed to make such interventions in the brain? Evolution has made us who we are today, and maybe there is a reason why we cannot read each other's minds. Maybe there is a reason why we do not have super memory or the capability to always be happy. The knowledge that we have gained so far is very powerful, but we have to remember that there is a lot more to learn before we take huge leaps to advance our society. What little we do know about the brain and its functions may be enough to lead to great advances, but we have to think about how those advances can affect our future. With this knowledge comes the responsibility to prevent its misuse and abuse.
