Friday 29th of November 2024

uncertainties, probabilities and certainties, with time limits...

weather

Humanity relies on two main streams of existence in a specific environment. This has been defined in the past as nature and nurture. In more explicit terms, our DNA dictates who we physically are and our social constructs influence the stylistic understanding of our relationships. 

 

 

This also depends on the quality of the environment in which we live: pollution and the destruction of nature are becoming issues of urgent concern.

 

 

Over millennia, we have created various understandings of the human condition, some erroneous but advantageous for a few (Churches and Kingdoms) by fostering the masses' ignorance of the reality of their condition. Our latest venture is “universal human rights” in mostly democratic frameworks. At the present stage, these UHR and DF are works in progress, and much of them is subject to corruption, resistance and uncertain advancement. Yet, some of our progress into the domain of probabilities has led to very complex certainties of understanding, especially the human genome...

 

 

 

Over a few frenzied weeks in the middle of 2000, icing his wrists between coding sessions, Jim Kent, a graduate student at the University of California, Santa Cruz, created a key software tool used in the international effort to sequence the human genome. 

 

 

Algorithmic biology unleashed

 


By Hallam Stevens

 

 

GigAssembler pieced together the millions of fragments of DNA sequence generated at labs around the globe, literally making the human genome. At almost the same time, Celera Genomics acquired Paracel, a company that primarily designed software for intelligence gathering. Paracel owned specially designed text-matching hardware and software (the TRW Fast Data Finder) that was rapidly adapted for sniffing out genes within the vast spaces of the genome.

Untangling the jumble of genomic letters required rapidly and accurately searching for a specified sequence within a very large space. This demanded new forms of training and disciplinary expertise. Physicists, mathematicians, and computer scientists brought methods such as linear programming, hashing, and hidden Markov models into biology. Since 2005, the Moore's Law–like growth of next-generation sequencing has generated ever-increasing troves of data and required even faster algorithms for indexing and searching. Biology has borrowed “big data” methods from industry (e.g., Hadoop) but has also contributed to pushing the frontiers of computer science research (e.g., the Burrows-Wheeler transform) (12).
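The Burrows-Wheeler transform mentioned above can be sketched in a few lines of Python. This is a toy version (the classic sorted-rotations construction, with "$" as an assumed end-of-string sentinel), but it shows the property sequence-search tools exploit: the transform is fully reversible, yet clusters identical characters together, which makes the output easy to compress and index.

```python
def bwt(s: str) -> str:
    """Burrows-Wheeler transform via sorted rotations.
    '$' is a sentinel assumed lexicographically smallest."""
    s += "$"
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def ibwt(r: str) -> str:
    """Invert the transform by repeatedly prepending and re-sorting."""
    table = [""] * len(r)
    for _ in range(len(r)):
        table = sorted(r[i] + table[i] for i in range(len(r)))
    return next(row for row in table if row.endswith("$"))[:-1]

print(bwt("ACAACG"))        # GC$AAAC
print(ibwt("GC$AAAC"))      # ACAACG — nothing is lost
```

Real read-aligners such as BWA and Bowtie build the transform from suffix arrays rather than this quadratic sketch, but the underlying idea is the same.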


The coalescence of bioinformatics and computational biology around algorithms has also given rise to new institutional forms and new markets for biomedicine. Statistically powered “data-driven biology” has configured an emerging medical-industrial complex that promises personalized and “precision” forms of diagnosis and treatment. Algorithmic pipelines that compare an individual's genotype to reference data generate a range of predictions about future health and risk. Direct-to-consumer genomics companies such as 23andMe now promise us healthier, happier, and longer ways of living via algorithms.

This presents substantial challenges for privacy, data ownership, and algorithmic bias (13–15) that must be addressed if genomics is to avoid becoming a handmaiden of “surveillance capitalism” (16). Many tech companies have begun to look toward using machine learning to combine more and more biological data with other forms of personal data—where we go, what we buy, whom we associate with, what we like. The hopes for genomics have long been tempered by fears that the genome could reveal too much about ourselves, exposing us to new forms of discrimination, social division, or control. Algorithmic biology is depicting and predicting our bodies with growing accuracy, but it is also drawing biomedicine more closely into the orbits of corporate tech giants that are aggregating and attempting to monetize data.

 

 

Science  05 Feb 2021:


Vol. 371, Issue 6529, pp. 564-569

 

 

 

[Jim] Kent began his programming career in 1983 with Island Graphics Inc. where he wrote the Aegis Animator program for the Amiga home computer. This program combined polygon tweening in 3D with simple 2D cel-based animation. In 1985 he founded and ran a software company, Dancing Flame, which adapted the Aegis Animator to the Atari ST,[2] and created Cyber Paint[3] for that machine. Cyber Paint was a 2D animation program that brought together a wide variety of animation and paint functionality and the delta-compressed animation format developed for CAD-3D. The user could move freely between animation frames and paint arbitrarily, or utilize various animation tools for automatic tweening movement across frames. Cyber Paint was one of the first, if not the first, consumer program that enabled the user to paint across time in a compressed digital video format. Later Jim developed a similar program, the Autodesk Animator for PC compatibles, where the image compression improved to the point it could play off of hard disk, and one could paint using "inks" that performed algorithmic transformations such as smoothing, transparency, and tiled patterns. The Autodesk Animator was used to create artwork for a wide variety of video games.[4]

 

 

https://en.wikipedia.org/wiki/Jim_Kent

 

 

 

In probability theory, a Markov model is a stochastic model used to model pseudo-randomly changing systems. It is assumed that future states depend only on the current state, not on the events that occurred before it (that is, it assumes the Markov property).
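Fittingly for a post that opens with the weather, the Markov property can be demonstrated with a tiny two-state sketch in Python. The transition probabilities below are purely illustrative, not measured data:

```python
# Tomorrow's weather depends only on today's (the Markov property).
# Transition probabilities here are invented for illustration.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(dist):
    """One time step: next-state distribution from the current one."""
    out = {s: 0.0 for s in P}
    for state, prob in dist.items():
        for nxt, p in P[state].items():
            out[nxt] += prob * p
    return out

dist = {"sunny": 1.0, "rainy": 0.0}  # start from a known sunny day
for _ in range(20):
    dist = step(dist)
print(dist)  # settles toward the stationary distribution (2/3 sunny, 1/3 rainy)
```

However certain the starting day is, iterating the chain washes that certainty out into a stable long-run probability: uncertainty tamed into probability.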

 

 

https://en.wikipedia.org/wiki/Markov_model

 

 

A hidden Markov model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process (call it X) with unobservable ("hidden") states. HMM assumes that there is another process Y whose behavior "depends" on X. The goal is to learn about X by observing Y. HMM stipulates that, for each time instance n, the conditional probability distribution of Y_n given the history {X_t : t ≤ n} must depend only on X_n.

Hidden Markov models are known for their applications to thermodynamics, statistical mechanics, physics, chemistry, economics, finance, signal processing, information theory, pattern recognition — such as speech, handwriting, gesture recognition,[1] part-of-speech tagging, musical score following,[2] partial discharges[3] and bioinformatics.[4]
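A minimal forward-algorithm sketch shows how an HMM assigns a probability to a sequence of observations by summing over every possible hidden-state path. The two hidden states and the numbers below are illustrative teaching values, not drawn from any real dataset:

```python
# Hidden states and (illustrative) model parameters.
states = ["hot", "cold"]
start = {"hot": 0.6, "cold": 0.4}
trans = {"hot": {"hot": 0.7, "cold": 0.3},
         "cold": {"hot": 0.4, "cold": 0.6}}
# The hidden temperature emits an observable count (1, 2 or 3).
emit = {"hot": {1: 0.2, 2: 0.4, 3: 0.4},
        "cold": {1: 0.5, 2: 0.4, 3: 0.1}}

def forward(obs):
    """Total probability of the observation sequence, summed over
    all hidden-state paths (the classic forward recursion)."""
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[p] * trans[p][s] for p in states) * emit[s][o]
                 for s in states}
    return sum(alpha.values())

print(forward([3, 1, 3]))  # probability of seeing 3, then 1, then 3
```

The same recursion, with nucleotide emissions instead of counts, is how gene-finding HMMs score stretches of genome sequence.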

 

 

Add Bitcoin and blockchains to the list (Gus).

 

 

 

 

 

Information theory is the scientific study of the quantification, storage, and communication of information. The field was fundamentally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.

A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.
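The coin-versus-die comparison above is easy to check numerically with Shannon's formula: a fair coin carries exactly 1 bit of entropy, a fair six-sided die log2(6), about 2.585 bits.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

coin = entropy([0.5, 0.5])   # fair coin: 1.0 bit
die = entropy([1 / 6] * 6)   # fair die: log2(6) ≈ 2.585 bits
print(coin, die)
```

More equally likely outcomes means more uncertainty before the event, and therefore more information gained when the outcome is revealed.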

Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL). Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones and the development of the Internet. The theory has also found applications in other areas, including statistical inference,[1] cryptography, neurobiology,[2] perception,[3] linguistics, the evolution[4] and function[5] of molecular codes (bioinformatics), thermal physics,[6] quantum computing, black holes, information retrieval, intelligence gathering, plagiarism detection,[7] pattern recognition, anomaly detection[8] and even art creation.

 

 

https://en.wikipedia.org/wiki/Information_theory

 

 

 

Here, most of us with a computer have used JPEGs. Some of us still use the Zip compression program. But how much do we think about the value of the information in a picture? A RAW picture at high resolution can be far too large to be transmitted or even used in day-to-day life. Most editing programs cannot alter a RAW file directly, but a RAW picture can be translated into a TIFF format to retain as much of the original data as possible, which can then be modified by programs such as Photoshop. The JPEG algorithm compresses the data of TIFFs to various chosen degrees of RESOLUTION. Most cameras use a direct JPEG conversion to create pictures at various pixel resolutions. High-definition TV uses a high number of pixels… until we reach "quantumical" numbers...
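The lossless half of that story (the Zip side, as opposed to the lossy JPEG side) is easy to demonstrate: redundant data shrinks dramatically, structureless data barely shrinks at all, and decompression gives back the original exactly, bit for bit.

```python
import os
import zlib

# Lossless compression exploits redundancy in the data.
repetitive = b"blue sky " * 10_000   # 90,000 highly repetitive bytes
random_ish = os.urandom(90_000)      # same size, but no structure at all

small = zlib.compress(repetitive)
big = zlib.compress(random_ish)
print(len(repetitive), "->", len(small))   # shrinks to a tiny fraction
print(len(random_ish), "->", len(big))     # barely shrinks, if at all

# Lossless means the round trip is exact.
assert zlib.decompress(small) == repetitive
```

JPEG goes further by throwing away detail the eye is unlikely to miss, which is why it compresses photographs far harder than Zip ever could, and why the discarded detail can never be recovered.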

 

 

 

All this leads to the strength of BITCOINS and the development of ARTIFICIAL INTELLIGENCE. 

 

 

--------------

 

 

Introduction 

 

Background and motivation Blockchain is one of the most popular issues discussed extensively in recent years, and it has already changed people’s lifestyle in some real areas due to its great impact on finance, business, industry, transportation, healthcare and so forth. Since the introduction of Bitcoin by Nakamoto [1], blockchain technologies have obtained many important advances in both basic theory and real applications up to now. Readers may refer to, for example, excellent books by Wattenhofer [2], Prusty [3], Drescher [4], Bashir [5] and Parker [6]; and survey papers by Zheng et al. [7], Constantinides et al. [8], Yli-Huumo et al. [9], Plansky et al. [10], Lindman et al. [11] and Risius and Spohrer [12].

 

 

https://computationalsocialnetworks.springeropen.com/track/pdf/10.1186/s40649-019-0066-1.pdf

 

 

 

 

Here we have systems of computation that “delete their own past” or “freeze the past in concrete” to prevent corruption of their origination. The next “block(s)” is the only item relevant to the value added. While perceptions can be altered, the blocks' definitions cannot. Here there is a certain spooky parallel with DNA. We have the DNA of our ancestors, but “they have been deleted”. We cannot be corrupted by their DNA, UNLESS WE FIDDLE with genetic revival. On the social democratic side, our history is flimsy and subject to wrong assumptions, lies and profitable delusions.

 

 

 

 

Perception (from the Latin perceptio, meaning gathering or receiving) is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information or environment.[2]

All perception involves signals that go through the nervous system, which in turn result from physical or chemical stimulation of the sensory system.[3] For example, vision involves light striking the retina of the eye; smell is mediated by odor molecules; and hearing involves pressure waves.

Perception is not only the passive receipt of these signals, but it's also shaped by the recipient's learning, memory, expectation, and attention.[4][5] Sensory input is a process that transforms this low-level information to higher-level information (e.g., extracts shapes for object recognition).[5] The process that follows connects a person's concepts and expectations (or knowledge), restorative and selective mechanisms (such as attention) that influence perception.

Perception depends on complex functions of the nervous system, but subjectively seems mostly effortless because this processing happens outside conscious awareness.[3]

 

 

https://en.wikipedia.org/wiki/Perception

 

 

——————————

 

 

 

Gusnote: In depression, perceptions may not be able to mesh with memory. We lose track of what we see and of the memory of identification. WE LOSE COGNITION (as explained a few times on this site). In democratic societies this can happen from a variety of causes, including conflicting opinions about events to the point of unsustainability.

 

 

OUR SOCIAL BLOCKCHAINS are thus not fully secured. OUR BIOLOGICAL BLOCKCHAINS (DNA) are relatively secure, but can be altered by interference from other blockchains (viruses, bacteria, etc.) and are individually TIME LIMITED.

 

Our perceptions of such have been modified by rigid illusions (religious beliefs) instead of flowing imagination (including scientific research and stylistic art). But humanity as a whole isn’t a uniform unit of a species, and there are notable social variations between "cultures".

 

 

 

—————————————

 

 

 

Our next step is to turn uncertainties into probabilities, when possible

 

 

 

BITCOINS: Buying or mining?

 

 

What is Bitcoin Mining?

 


Cryptocurrency mining is painstaking, costly, and only sporadically rewarding. Nonetheless, mining has a magnetic appeal for many investors interested in cryptocurrency because of the fact that miners are rewarded for their work with crypto tokens. This may be because entrepreneurial types see mining as pennies from heaven, like California gold prospectors in 1849. And if you are technologically inclined, why not do it?


KEY TAKEAWAYS

• By mining, you can earn cryptocurrency without having to put down money for it.

• Bitcoin miners receive Bitcoin as a reward for completing "blocks" of verified transactions which are added to the blockchain.

• Mining rewards are paid to the miner who discovers a solution to a complex hashing puzzle first, and the probability that a participant will be the one to discover the solution is proportional to their share of the total mining power on the network.

• You need either a GPU (graphics processing unit) or an application-specific integrated circuit (ASIC) in order to set up a mining rig.

However, before you invest the time and equipment, read this explainer to see whether mining is really for you. We will focus primarily on Bitcoin (throughout, we'll use "Bitcoin" when referring to the network or the cryptocurrency as a concept, and "bitcoin" when we're referring to a quantity of individual tokens).


The primary draw for many miners is the prospect of being rewarded with Bitcoin. That said, you certainly don't have to be a miner to own cryptocurrency tokens. You can also buy cryptocurrencies using fiat currency; you can trade them on an exchange like Bitstamp using another crypto (for example, using Ethereum or NEO to buy Bitcoin); you can even earn them by shopping, publishing blog posts on platforms that pay users in cryptocurrency, or setting up interest-earning crypto accounts. An example of a crypto blog platform is Steemit, which is kind of like Medium except that users can reward bloggers by paying them in a proprietary cryptocurrency called STEEM. STEEM can then be traded elsewhere for Bitcoin.


The Bitcoin reward that miners receive is an incentive that motivates people to assist in the primary purpose of mining: to legitimize and monitor Bitcoin transactions, ensuring their validity. Because these responsibilities are spread among many users all over the world, Bitcoin is a "decentralized" cryptocurrency, or one that does not rely on any central authority like a central bank or government to oversee its regulation.

 

 

 

To earn bitcoins, you need to meet two conditions. One is a matter of effort; one is a matter of luck.

1) You have to verify ~1MB worth of transactions. This is the easy part.

2) You have to be the first miner to arrive at the right answer, or closest answer, to a numeric problem. This process is also known as proof of work. 

 

...

 

The good news: No advanced math or computation is involved. You may have heard that miners are solving difficult mathematical problems—that's not exactly true. What they're actually doing is trying to be the first miner to come up with a 64-digit hexadecimal number (a "hash") that is less than or equal to the target hash. It's basically guesswork.

The bad news: It's guesswork, but with the total number of possible guesses for each of these problems being on the order of trillions, it's incredibly arduous work. In order to solve a problem first, miners need a lot of computing power. To mine successfully, you need to have a high "hash rate," which is measured in terms of megahashes per second (MH/s), gigahashes per second (GH/s), and terahashes per second (TH/s).

That is a great many hashes.
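That guessing game can be sketched in a few lines of Python. This toy miner hunts for a hash with a few leading zeros; real Bitcoin instead double-SHA-256-hashes an 80-byte block header and compares the result against a 256-bit target, but the trial-and-error structure is the same.

```python
import hashlib

def mine(block_data: str, difficulty: int):
    """Toy proof-of-work: find a nonce whose SHA-256 hash (in hex)
    starts with `difficulty` zeros. Pure guesswork, as described above."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

# Each extra zero multiplies the expected number of guesses by 16.
nonce, digest = mine("demo block", 4)  # ~65,000 guesses on average
print(nonce, digest)
```

Anyone can verify the answer with a single hash, which is the whole point: the work is hard to do and trivial to check.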

 

 

 

The rewards for bitcoin mining are reduced by half every four years. When bitcoin was first mined in 2009, mining one block would earn you 50 BTC. In 2012, this was halved to 25 BTC. By 2016, this was halved again to 12.5 BTC. On May 11, 2020, the reward halved again to 6.25 BTC. In November of 2020, the price of Bitcoin was about $17,900 per Bitcoin, which means you'd earn $111,875 (6.25 x 17,900) for completing a block. Not a bad incentive to solve that complex hash problem detailed above, it might seem.
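The halving schedule quoted above follows directly from Bitcoin's rule that the block subsidy halves every 210,000 blocks (roughly every four years), and the dollar figure is simple multiplication:

```python
def reward(block_height: int) -> float:
    """Block subsidy in BTC: 50 at genesis, halved every 210,000 blocks."""
    halvings = block_height // 210_000
    return 50.0 / (2 ** halvings)

print(reward(0))         # 50.0  (2009)
print(reward(210_000))   # 25.0  (2012 halving)
print(reward(420_000))   # 12.5  (2016 halving)
print(reward(630_000))   # 6.25  (May 2020 halving)
print(6.25 * 17_900)     # 111875.0 — the article's dollar figure
```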

 

 

https://www.investopedia.com/tech/how-does-bitcoin-mining-work/

 

 

COST TO THE PLANET:

 

 

The increasing popularity of Bitcoin mining quickly sparked a fresh debate on the energy use—and the resulting carbon footprint—of the Bitcoin network. Bitcoin mining devices require electrical energy to function, and all devices in the Bitcoin network were already estimated to consume between 78 and 101 terawatt-hours (TWh) of electricity annually prior to the latest surge in the price of Bitcoin (Figure 1). With a growing number of active machines, the network as a whole also requires more power to operate.

 

 

Having an estimate of Bitcoin’s future energy consumption also permits a ballpark estimate for the network’s future carbon footprint. To this end, the work of Stoll et al. demonstrated that Bitcoin mining had an implied carbon intensity of 480–500 g of CO2 per kWh (gCO2/kWh) consumed. Assuming this number remains constant at 490 gCO2/kWh as the network’s energy demand increases, a total energy consumption of 184 TWh would result in a carbon footprint of 90.2 million metric tons of CO2 (Mt CO2), which is roughly comparable to the carbon emissions produced by the metropolitan area of London (98.9 Mt CO2, according to citycarbonfootprints.info). This number might be higher or lower depending on the locations chosen for Bitcoin mining. Although fossil-fuel-dependent countries like Iran have recently gained popularity as mining sites, miners might also try to leverage “greener” sources of power.
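The article's ballpark figure can be reproduced from its own stated assumptions (184 TWh per year at 490 gCO2 per kWh):

```python
# Reproducing the article's back-of-the-envelope carbon arithmetic.
energy_twh = 184          # projected annual consumption (TWh)
intensity = 490           # assumed carbon intensity (gCO2 per kWh)

kwh = energy_twh * 1e9    # 1 TWh = 1e9 kWh
grams = kwh * intensity
megatonnes = grams / 1e12 # 1 Mt = 1e12 g
print(round(megatonnes, 1))  # ≈ 90.2 Mt CO2, as quoted
```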

 

 

https://www.cell.com/joule/fulltext/S2542-4351(21)00083-0

 

 

Back to the beginning...

 

Humanity relies on two main streams of existence in a specific environment. This has been defined in the past as nature and nurture. In more explicit terms, our DNA dictates who we are and our social constructs define the stylistic understanding of our relationships.

 

 

Now, our ability to increase the complexity of some ventures has not eliminated our basic desires for a painless life and for happiness, while we should not destroy the ORIGINAL joint. This is our little planet, with the evolution of life into its various inhabitants over 4.5 billion years, including the air quality, global warming and the pollution of the seas.

 

 

Today (18/03/2021), we have been told that a team of scientists in Australia has created a human embryo from a bit of skin. Think about it.

 

 

See also: http://www.yourdemocracy.net.au/drupal/node/33702

 

 

END OF TRANSMISSION

 

 

PLEASE, FREE JULIAN ASSANGE !!!!!

a new life...

An Australian-led team of scientists has used human skin cells to create an embryo-like structure, in a discovery that could spark debate on what constitutes life.

The team, led by researchers at Melbourne’s Monash University, reprogrammed skin cells into a 3D cellular structure similar to human blastocysts.

The structures, known as iBlastoids, will be used to model the biology of early human embryos in laboratory settings and underpin research on early miscarriages and IVF.

Previously, studies of early human development and infertility were restricted by the need to source scarcely available blastocysts from IVF procedures.

“iBlastoids will allow scientists to study the very early steps in human development and some of the causes of infertility, congenital diseases and the impact of toxins and viruses on early embryos,” research team leader Professor Jose Polo said.

It will accelerate the understanding and development of new therapies, he said.

‘Human’ debate

However, the discovery could raise questions about what it means to be human and if iBlastoids can even be considered “human”.

The Royal Institution of Australia, a scientific not-for-profit that publishes Cosmos magazine, said it could also prompt a review of regulations governing stem cell and cloning applications.

“It needs to be understood that the Monash team has followed the existing rules concerning stem cell and embryonic research to the letter,” editor-in-chief Ian Connellan said in a statement.

“It’s just that they’ve found a new way to create what is effectively an embryonic structure, without the traditional sperm-egg model.

“That, in itself, is quite amazing and opens up significant avenues of research ... as well as forcing a review of how current rules are applied.”

 

Read more:

https://7news.com.au/technology/science/aust-led-team-create-embryo-like-models-c-2377019

 

FREE JULIAN ASSANGE NOW !!!!!!!!!!!!!!!!!!

of nurture and abuse...

In July 1934, psychologist Harold Skeels evaluated two toddlers at the Iowa Soldiers’ Orphans’ Home in Davenport, Iowa, a Dickensian Civil War–era facility that was both a residence for abandoned children and the state’s central adoption facility. Skeels had been tasked to use intelligence quotient (IQ) tests to identify the “imbeciles,” “morons,” and “idiots” among the hordes of children at the home and shunt them to institutions for the “feebleminded.” (This review, like the book at hand, will use the terminology of the era.)

As infants, the two toddlers under evaluation—13-month-old CD and 17-month-old BD—had been taken by the state from their mothers, a prostitute and an inmate in an insane asylum, respectively, and placed at the Davenport home, where they were “scarcely touched, never held, rarely spoken to,” as Marilyn Brookwood, the author of the excellent The Orphans of Davenport, writes. When Skeels performed an IQ test, the girls scored 46 and 35. (An individual score of 90 to 109 was considered average intelligence.) Skeels tried to send the girls away, but facilities for the feebleminded were overcrowded. In desperation, he accepted an unusual offer: The toddlers would move to an institution that housed feebleminded adults, where they would be cared for by adult women with mental ages of 5 to 9 years.

The women at the Woodward State Hospital for Epileptics and School for the Feebleminded lavished the children with affection. Nine months later, Skeels was astonished to find CD and BD “alert, attractive, playful, [behaving] like any normal toddlers,” Marie Skodak, a colleague who accompanied him that day, later wrote. After another 18 months living with the women, CD’s IQ score was 95 and BD’s was 93. The girls were returned to Davenport and adopted within months. When located again in their late 20s, both were married with children in apparently stable, loving households.

CD and BD are among the many children whom Skeels and a handful of colleagues at the University of Iowa’s Iowa Child Welfare Research Station studied in the 1930s. Benefiting from the station’s milieu of intellectual freedom, these scientific heroes of The Orphans of Davenport developed a body of work finding that neglected children placed in caring and stimulating environments could recover tens of IQ points; that institutional neglect eroded, but preschool improved, children’s IQ scores; and that institutionalized babies born to low-IQ parents and adopted in the first months of infancy scored in the good or superior range on later IQ tests.

The suggestion that children’s IQ scores changed with time and circumstance set off a frenzy of reaction in the young, insecure discipline of academic psychology. Its eugenicist establishment was convinced that intelligence was a fixed and heritable trait—a belief that generated between 60,000 and 70,000 forced sterilizations in the United States in the 20th century. Led by Stanford University psychologist Lewis Terman, the field’s leaders launched sustained, vicious, and humiliating attacks on the Iowa group.

It would be 30 years before the validity of the Iowa group’s clinical findings and their implications—that nurture as well as nature plays a pivotal role in the development of children’s intelligence—were at last recognized and celebrated by their profession. But Skeels and his Iowa colleagues—chief among them Skodak, the outspoken daughter of Hungarian immigrants; Beth Wellman, who conducted pioneering preschool studies before dying prematurely; and the research station’s director, George Stoddard, who would go on to become the chancellor of New York University—made a lasting impact all the same. Their work caught the attention of the Kennedy and Johnson administrations, helping to launch Head Start, the US government program for disadvantaged preschoolers.

The quiet courage of the Iowa researchers illuminates this story, not least when others admit that they failed to rise to the same challenge. “I feel guilty” for finding results similar to those of the Iowa group but not publicizing them in the 1930s, recalled leading children’s development scholar Lois Barclay Murphy decades later. “I knew that Skeels and Skodak were sound, but with the whole establishment against us I didn’t think there was any point in trying to convince people.”

In chronicling a major intellectual battle of the 20th century, The Orphans of Davenport offers scientists a cautionary, timeless tale about groupthink’s power to subvert the dispassionate analysis of new findings. It is also yet another sobering reminder of how specious science can be wielded to justify evil ends—with the attendant suffering of those least able to defend themselves.

 

Read more:

Science (magazine) 27/07/2021

 

Meanwhile:

 

Most of us have an idea – or think we have an idea – of what psychedelics do to us. At their trippy best, drugs such as LSD and magic mushrooms can lead you to feel at one with the universe and awash with creativity. It was under the influence of LSD and peyote derived from cactuses that author Ken Kesey, for instance, wrote One Flew Over the Cuckoo’s Nest, tipped paint into a stream and dipped T-shirts in it (creating tie-dye), and discovered the world was “a hole filled with jewellery”.

There are, of course, bad trips, the extremes of which are perhaps best described by the late Boston crime boss James “Whitey” Bulger who, while incarcerated in an Atlanta penitentiary in 1957, was forcibly injected with LSD as part of the United States’ Central Intelligence Agency’s now infamous behaviour control experiments. “We experienced horrible periods of living nightmares and even blood coming out of the walls,” Bulger wrote.

In 1963, psychology professor Timothy Leary, who had started listing his profession on academic forms as “ANGEL”, was booted out of Harvard University for his research into psychedelics, notably LSD and psilocybin, the active ingredient in magic mushrooms. He gave the psilocybin to undergraduate students when the university had agreed he could only give it to graduate students in his studies – as a tool in psychotherapy and “mind expansion”. While psychedelics were a hallmark of the counter-cultural hippie movement, by 1970 they had been criminalised, thanks to US president Richard Nixon’s war on drugs.

In Australia, psychedelics were used by doctors to treat various psychological conditions – none of it systematically documented – but they began to be criminalised from 1970 onwards and remain, mostly, illegal. So, how is it that we’re now in the middle of a psychedelic renaissance?

In March 2021, the Australian federal government announced it would provide $15 million for clinical trials to determine whether psilocybin and MDMA could help treat debilitating mental illness. In July, a new privately funded research centre was launched in Melbourne to develop psychedelic medicines. Meanwhile, leading research is underway in a Melbourne hospital into the use of psilocybin to treat end-of-life anxiety and depression in terminally ill patients.

These trials follow an enormous amount of psychedelics research over the past 20 years, in the US and Europe, which has led to promising findings about the role they might play in treating conditions ranging from severe depression and post-traumatic stress disorder (PTSD) to addiction, Alzheimer’s disease and anorexia nervosa. In fact, in 2021 there were about 100 psychedelics trials worldwide, at prestigious institutions such as Johns Hopkins University in the US, which launched a Centre for Psychedelic and Consciousness Research in 2020, and Imperial College London, which opened its Imperial Centre for Psychedelic Research in 2019.

What are these studies hoping to find? Could these mind-altering drugs be the long-sought answer to alleviating suffering caused by mental illnesses where other treatments have failed? Do they reveal the secrets of the universe? And what are the risks?

 

Read more:

https://www.smh.com.au/lifestyle/health-and-wellness/at-one-with-the-universe-how-can-psychedelic-drugs-help-treat-suffering-20210614-p580vq.html

 

Read from top. 

 

See also:

in tune with the bits of the universe...

 

FREE JULIAN ASSANGE NOW ∞∞∞∞∞∞∞∞∞∞∞∞∞∞∞∞∞∞