Helix was founded in August 2015 by co-founders James Lu, Justin Kao, and Scott Burke. Helix is a digital application store created to help people read their genomes, with the stated mission "to empower every person to improve their life through DNA." Helix also represents growing concerns about the security and privacy of personal genetic information: companies offering DNA testing services collect vast amounts of sensitive data, raising questions about potential misuse for discrimination, profiling, or unauthorized access.
BlessU-2 and Pepper are two AI-based robotic creations designed to interact with people. BlessU-2 is a robotic priest developed by the Evangelical Church in Germany, while Pepper is a humanoid robot designed for assistance and companionship.
These robots use advanced algorithms and machine learning to respond to
people's emotions, engage in conversations, and provide support in
different areas of life.
Emotion-sensing facial recognition technology raises privacy and
ethical concerns. It works by analyzing facial expressions in images and
videos to determine a person's emotional state. However, this allows
sensitive personal data to be collected without users' knowledge of how it
will be used or protected. As the technology becomes more advanced and
integrated into society, it poses risks like unwanted surveillance of
emotions and behavior.
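To make that mechanism concrete, here is a minimal sketch of such a pipeline in Python. The Haar-cascade face detector bundled with OpenCV is real; the emotion label set and the `classify` callable are assumptions standing in for a trained model.

```python
# Minimal sketch of an emotion-sensing pipeline (assumed model, real detector).
import cv2

# Haar cascade face detector shipped with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

EMOTIONS = ["angry", "happy", "neutral", "sad", "surprised"]  # assumed label set

def detect_emotions(frame, classify):
    """Detect faces in a BGR frame and classify each face crop.

    `classify` is any callable mapping a face crop to an index into
    EMOTIONS; a real system would load a trained neural network here.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        crop = gray[y:y + h, x:x + w]
        results.append(((x, y, w, h), EMOTIONS[classify(crop)]))
    return results
```

Note that nothing in this loop asks the person for consent, which is precisely the privacy concern: the inference happens silently on any image or video the operator can obtain.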
Ransomware is a type of malware that prevents users from accessing their system or personal files and demands a ransom payment to regain access. Studying this malicious software reveals the factors that give rise to dilemmas and policy issues involving science and technology, and in doing so shows why studying Science, Technology, and Society is essential.
The Textalyzer, a device designed to catch drivers distracted by cellphones, is one step closer to being approved in New York State, but it faces backlash from privacy advocates who fear it is too intrusive. With the Textalyzer, police could get a roadside glimpse into whether drivers were texting, emailing, browsing social networks, or taking a selfie while behind the wheel.
China has implemented a social credit system that uses big data to
monitor citizens and assign social credit scores based on their behavior.
Those with low scores face punishments like travel bans, while those with high scores receive rewards. The system aims to create a high-trust society but raises ethical issues around conformity, coercion, and lack of transparency.
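The scoring mechanism can be illustrated with a toy model. Nothing here reflects the actual system; the behaviors, weights, and base score are invented purely to show how behavior records could be reduced to a single number that gates rewards and punishments.

```python
# Toy illustration of behavior-based scoring; events and weights are invented.
WEIGHTS = {
    "paid_bill_on_time": +5,
    "traffic_violation": -10,
    "volunteer_work": +8,
    "defaulted_on_loan": -25,
}

def social_credit_score(events, base=1000):
    """Start from a base score and apply a weight per recorded behavior."""
    return base + sum(WEIGHTS.get(e, 0) for e in events)

print(social_credit_score(["paid_bill_on_time", "traffic_violation"]))  # 995
```

Even this toy version makes the transparency problem visible: whoever sets the weights decides which behaviors count, and citizens may never see the table.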
Google Clips is a hands-free camera that automatically captures short
video clips when it detects people or pets. It learns to recognize faces over
time to better capture moments with important people. Clips streams
content to a smartphone app instantly without wires. While it aims to
effortlessly capture spontaneous moments, Clips has limitations as it
cannot move or adjust on its own, is only compatible with select phones,
and may raise privacy issues if not used appropriately.
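The description above suggests a simple trigger loop: buffer recent frames continuously and persist a short clip whenever an on-device detector fires. The sketch below is not Google's implementation; `camera`, `detect_person_or_pet`, and `save_clip` are hypothetical stand-ins for the device's hardware and vision model.

```python
# Rough sketch of a Clips-style capture loop (all callables are stand-ins).
from collections import deque

def capture_loop(camera, detect_person_or_pet, save_clip, buffer_len=210):
    buffer = deque(maxlen=buffer_len)  # ~7 s of frames at 30 fps
    for frame in camera:               # camera yields frames indefinitely
        buffer.append(frame)
        if detect_person_or_pet(frame) and len(buffer) == buffer_len:
            save_clip(list(buffer))    # persist the short clip
            buffer.clear()             # avoid overlapping duplicate clips
```

Because the camera decides on its own when a moment is worth recording, bystanders may be captured without ever knowing the device was active.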
Criminal sentencing software uses algorithms to predict criminal behavior
and risk levels. However, this raises ethical issues around biases in the
algorithms and a lack of transparency. One case involved a man
challenging his high-risk determination, as the proprietary nature of the
algorithm made it impossible to understand the reasons for the result or
dispute its accuracy. This dilemma highlights the societal implications of
algorithms structuring important decisions without accountability.
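A schematic model shows both how such a risk score can be computed and where the opacity lies. The features, weights, and bias below are invented for illustration; in a proprietary tool, these are exactly the values a defendant cannot inspect or contest.

```python
# Schematic logistic risk score; all features and weights are invented.
import math

WEIGHTS = {"prior_arrests": 0.6, "age_at_first_offense": -0.05, "employed": -0.4}
BIAS = -1.0

def risk_score(features):
    """Map defendant features to a 0-1 'risk' value via a logistic model."""
    z = BIAS + sum(WEIGHTS[k] * features[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

print(round(risk_score(
    {"prior_arrests": 3, "age_at_first_offense": 19, "employed": 0}), 2))  # 0.46
```

If the training data encodes biased policing patterns, those biases flow straight into the weights, and the closed-source nature of the product hides that from the court.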
This emerging ethical dilemma involves the development of "friendbots" - artificial
intelligences programmed to mimic deceased loved ones. As grieving processes move
into the digital realm, researchers predict mainstream viability of such convincing digital
surrogates within a decade. The factors leading to this include people wishing to leave
versions of themselves for others and using friendbots as a form of therapy or closure.
Citizen is an app that allows users to report crimes and emergencies in
their area through livestreamed video. While aiming to keep communities
safe and informed, it raises ethical issues around vigilantism, racial
profiling, and users putting themselves in danger. Studying science,
technology, and society is important for understanding how innovations like
this app impact communities and determining if the benefits outweigh the
disadvantages.
According to research, the United States Transportation Security Administration (TSA) started developing a novel surveillance system, dubbed Screening of Passengers by Observation Techniques (SPOT), in 2003 to detect suspected terrorists by reading their facial expressions and behavior.
Paul Ekman, a psychology professor at the University of California, San Francisco, developed a technique for recognizing and mapping minute facial expressions to emotions. This method was used to train "behavior detection officers" to check for signs of deception in people's faces.
However, when it was first introduced in 2007, the program was beset with problems. Officers randomly referred passengers for interrogation, and the few arrests made were for reasons unrelated to terrorism. Even more worrisome, the program was used to justify racial profiling.
Ekman tried to distance himself from SPOT by claiming that his method was being misapplied. Others, however, believed that the program's failure stemmed from an outdated scientific idea underlying Ekman's method: that emotions can be objectively inferred from facial analysis.
Using Ekman's method, technology companies have recently started to teach computers to recognize emotion from facial expressions. Some developers predict that artificial emotion-detection technologies will not only outperform humans at recognizing true emotions through facial-expression analysis, but will also become sensitive to our innermost emotions, substantially deepening our engagement with our technology.