Monday 17th of May 2021

uncertainties, probabilities and certainties, with time limits...

Humanity relies on two main streams of existence in a specific environment. This has been defined in the past as nature and nurture. In more explicit terms, our DNA dictates who we physically are, and our social constructs influence the stylistic understanding of our relationships.

This is also dependent on the quality of the environment in which we live: pollution and the destruction of nature are becoming issues of urgent concern.

Over millennia, we have created various understandings of the human condition, some of them erroneous but advantageous for a few (Churches and Kingdoms), fostering ignorance among the masses about the reality of their condition. Our latest venture is “universal human rights” within mostly democratic frameworks. At this stage, these UHR and DF are works in progress, and much of this is subject to corruption, resistance and uncertain advancement. Yet some of our progress into the domain of probabilities has led to very complex certainties of understanding, especially of the human genome...

Algorithmic biology unleashed

By Hallam Stevens

Over a few frenzied weeks in the middle of 2000, icing his wrists between coding sessions, Jim Kent, a graduate student at the University of California, Santa Cruz, created a key software tool used in the international effort to sequence the human genome.

GigAssembler pieced together the millions of fragments of DNA sequence generated at labs around the globe, literally making the human genome. At almost the same time, Celera Genomics acquired Paracel, a company that primarily designed software for intelligence gathering. Paracel owned specially designed text-matching hardware and software (the TRW Fast Data Finder) that was rapidly adapted for sniffing out genes within the vast spaces of the genome.

Untangling the jumble of genomic letters required rapidly and accurately searching for a specified sequence within a very large space. This demanded new forms of training and disciplinary expertise. Physicists, mathematicians, and computer scientists brought methods such as linear programming, hashing, and hidden Markov models into biology. Since 2005, the Moore's Law–like growth of next-generation sequencing has generated ever-increasing troves of data and required even faster algorithms for indexing and searching. Biology has borrowed “big data” methods from industry (e.g., Hadoop) but has also contributed to pushing the frontiers of computer science research (e.g., the Burrows-Wheeler transform) (12).

The coalescence of bioinformatics and computational biology around algorithms has also given rise to new institutional forms and new markets for biomedicine. Statistically powered “data-driven biology” has configured an emerging medical-industrial complex that promises personalized and “precision” forms of diagnosis and treatment. Algorithmic pipelines that compare an individual's genotype to reference data generate a range of predictions about future health and risk. Direct-to-consumer genomics companies such as 23andMe now promise us healthier, happier, and longer ways of living via algorithms.

This presents substantial challenges for privacy, data ownership, and algorithmic bias (13–15) that must be addressed if genomics is to avoid becoming a handmaiden of “surveillance capitalism” (16). Many tech companies have begun to look toward using machine learning to combine more and more biological data with other forms of personal data—where we go, what we buy, whom we associate with, what we like. The hopes for genomics have long been tempered by fears that the genome could reveal too much about ourselves, exposing us to new forms of discrimination, social division, or control. Algorithmic biology is depicting and predicting our bodies with growing accuracy, but it is also drawing biomedicine more closely into the orbits of corporate tech giants that are aggregating and attempting to monetize data.

Science  05 Feb 2021:
Vol. 371, Issue 6529, pp. 564-569

[Jim] Kent began his programming career in 1983 with Island Graphics Inc., where he wrote the Aegis Animator program for the Amiga home computer. This program combined polygon tweening in 3D with simple 2D cel-based animation. In 1985 he founded and ran a software company, Dancing Flame, which adapted the Aegis Animator to the Atari ST,[2] and created Cyber Paint[3] for that machine. Cyber Paint was a 2D animation program that brought together a wide variety of animation and paint functionality and the delta-compressed animation format developed for CAD-3D. The user could move freely between animation frames and paint arbitrarily, or utilize various animation tools for automatic tweening movement across frames. Cyber Paint was one of the first, if not the first, consumer programs that enabled the user to paint across time in a compressed digital video format. Later, Jim developed a similar program, the Autodesk Animator for PC compatibles, where the image compression improved to the point that it could play off a hard disk, and one could paint using "inks" that performed algorithmic transformations such as smoothing, transparency, and tiled patterns. The Autodesk Animator was used to create artwork for a wide variety of video games.[4]

In probability theory, a Markov model is a stochastic model used to model pseudo-randomly changing systems. It is assumed that future states depend only on the current state, not on the events that occurred before it (that is, it assumes the Markov property).

A Hidden Markov Model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process – call it X – with unobservable ("hidden") states. HMM assumes that there is another, observable process Y whose behavior "depends" on X. Since X cannot be observed directly, the goal is to learn about X by observing Y. HMM stipulates that, for each time instance, the conditional probability distribution of Y at that instance, given the history of X, must depend only on the current state of X.
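The two-process structure described above can be sketched in a few lines of Python. The states, transition and emission probabilities below are hypothetical numbers chosen for illustration, and the forward algorithm shown is one standard way to score an observation sequence under an HMM:

```python
# Minimal HMM sketch (hypothetical numbers): the hidden process X
# (weather) is unobservable; the observable process Y (activities)
# depends only on the current state of X.

states = ("Rainy", "Sunny")                 # hidden states of X
start_p = {"Rainy": 0.6, "Sunny": 0.4}      # initial distribution
trans_p = {                                 # Markov property: X[n] depends only on X[n-1]
    "Rainy": {"Rainy": 0.7, "Sunny": 0.3},
    "Sunny": {"Rainy": 0.4, "Sunny": 0.6},
}
emit_p = {                                  # Y depends only on the current X
    "Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
    "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1},
}

def forward(observations):
    """Total probability of the observation sequence, summed over all hidden paths."""
    alpha = {s: start_p[s] * emit_p[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {
            s: sum(alpha[prev] * trans_p[prev][s] for prev in states) * emit_p[s][obs]
            for s in states
        }
    return sum(alpha.values())

print(round(forward(["walk", "shop", "clean"]), 4))  # 0.0336
```

This is the same machinery that, with nucleotide emissions instead of weather, lets gene finders score stretches of genomic sequence.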

Hidden Markov models are known for their applications to thermodynamics, statistical mechanics, physics, chemistry, economics, finance, signal processing, information theory, pattern recognition — such as speech, handwriting, gesture recognition,[1] part-of-speech tagging, musical score following,[2] partial discharges[3] — and bioinformatics.[4]

Add Bitcoin and blockchains to the list (Gus).

Information theory is the scientific study of the quantification, storage, and communication of information. The field was fundamentally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.

A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.
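The coin-versus-die comparison above is easy to check numerically. A minimal sketch in Python, using only the standard library:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

coin = [0.5, 0.5]     # fair coin: two equally likely outcomes
die = [1 / 6] * 6     # fair die: six equally likely outcomes

print(entropy(coin))  # 1.0 bit
print(entropy(die))   # ~2.585 bits: the die outcome carries more information
```

One fair coin flip resolves exactly one bit of uncertainty; a die roll resolves log2(6) ≈ 2.585 bits, which is why learning the die's outcome is more informative.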

Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL). Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones and the development of the Internet. The theory has also found applications in other areas, including statistical inference,[1] cryptography, neurobiology,[2] perception,[3] linguistics, the evolution[4] and function[5] of molecular codes (bioinformatics), thermal physics,[6] quantum computing, black holes, information retrieval, intelligence gathering, plagiarism detection,[7] pattern recognition, anomaly detection[8] and even art creation.

Here, most of us with a computer have used JPEGs. Some of us still use the Zip compression program. But how much do we think about the values of information in a picture? A RAW picture with high resolution can be far too large to be transmitted or even used in day-to-day life. Most image-editing programs cannot alter a RAW file directly. But a RAW picture can be translated into a TIFF format to retain as much of the original data as possible, which can then be modified by programs such as Photoshop. The JPEG algorithm compresses the data of TIFFs to various chosen degrees of RESOLUTION. Most cameras use a direct JPEG conversion to create pictures at various pixel resolutions. High-definition TV uses a high number of pixels… until we reach "quantumical" numbers...

All this leads to the strength of BITCOINS and the development of ARTIFICIAL INTELLIGENCE. 


Background and motivation Blockchain is one of the most popular topics discussed extensively in recent years, and it has already changed people’s lifestyles in some real areas due to its great impact on finance, business, industry, transportation, healthcare and so forth. Since the introduction of Bitcoin by Nakamoto [1], blockchain technologies have obtained many important advances in both basic theory and real applications up to now. Readers may refer to, for example, excellent books by Wattenhofer [2], Prusty [3], Drescher [4], Bashir [5] and Parker [6]; and survey papers by Zheng et al. [7], Constantinides et al. [8], Yli-Huumo et al. [9], Plansky et al. [10], Lindman et al. [11] and Risius and Spohrer [12].

Here we have systems of computation that “delete their own past” or "freeze the past in concrete” to prevent corruption of their origination. The next “block(s)” is the only item relevant to the value added. While perceptions can be altered, the blocks' definitions cannot. Here there is a certain spooky parallel with DNA. We have the DNA of our ancestors, but “they have been deleted”. We cannot be corrupted by their DNA, UNLESS WE FIDDLE with genetic revival. On the social democratic side, our history is flimsy and subject to wrong assumptions, lies and profitable delusions.

Perception (from the Latin perceptio, meaning gathering or receiving) is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information or environment.[2]

All perception involves signals that go through the nervous system, which in turn result from physical or chemical stimulation of the sensory system.[3] For example, vision involves light striking the retina of the eye; smell is mediated by odor molecules; and hearing involves pressure waves.

Perception is not only the passive receipt of these signals; it is also shaped by the recipient's learning, memory, expectation, and attention.[4][5] Sensory input is a process that transforms this low-level information into higher-level information (e.g., extracts shapes for object recognition).[5] The process that follows connects a person's concepts and expectations (or knowledge) with restorative and selective mechanisms (such as attention) that influence perception.

Perception depends on complex functions of the nervous system, but subjectively seems mostly effortless because this processing happens outside conscious awareness.[3]


Gusnote: In depression, perceptions may not be able to mesh with memory. We lose track of what we see and of the memory of identification. WE LOSE COGNITION (as explained a few times on this site). In democratic societies this can happen from a variety of causes, including conflicting opinions about events to the point of unsustainability.

OUR SOCIAL BLOCKCHAINS are thus not fully secured. OUR BIOLOGICAL BLOCKCHAINS (DNA) are relatively secure, but can be altered by interference from other blockchains (viruses, bacteria, etc.) and are individually TIME-LIMITED.
Our perceptions of these have been modified by rigid illusions (religious beliefs) instead of flowing imagination (including scientific research and stylistic art). But humanity isn’t a uniform unit of species, and there are notable social variations between "cultures". 


Our next step is to turn uncertainties into probabilities, when possible.

BITCOINS: Buying or mining?

What is Bitcoin Mining?

Cryptocurrency mining is painstaking, costly, and only sporadically rewarding. Nonetheless, mining has a magnetic appeal for many investors interested in cryptocurrency because miners are rewarded for their work with crypto tokens. This may be because entrepreneurial types see mining as pennies from heaven, like the California gold prospectors of 1849. And if you are technologically inclined, why not do it?

• By mining, you can earn cryptocurrency without having to put down money for it.
• Bitcoin miners receive Bitcoin as a reward for completing "blocks" of verified transactions which are added to the blockchain.
• Mining rewards are paid to the miner who discovers a solution to a complex hashing puzzle first, and the probability that a participant will be the one to discover the solution is related to the portion of the total mining power on the network.
• You need either a GPU (graphics processing unit) or an application-specific integrated circuit (ASIC) in order to set up a mining rig.
However, before you invest the time and equipment, read this explainer to see whether mining is really for you. We will focus primarily on Bitcoin (throughout, we'll use "Bitcoin" when referring to the network or the cryptocurrency as a concept, and "bitcoin" when we're referring to a quantity of individual tokens).

The primary draw for many miners is the prospect of being rewarded with Bitcoin. That said, you certainly don't have to be a miner to own cryptocurrency tokens. You can buy cryptocurrencies using fiat currency; you can trade them on an exchange like Bitstamp using another crypto (for example, using Ethereum or NEO to buy Bitcoin); you can even earn them by shopping, by publishing blog posts on platforms that pay users in cryptocurrency, or by setting up interest-earning crypto accounts. An example of a crypto blog platform is Steemit, which is kind of like Medium except that users can reward bloggers by paying them in a proprietary cryptocurrency called STEEM. STEEM can then be traded elsewhere for Bitcoin.

The Bitcoin reward that miners receive is an incentive that motivates people to assist in the primary purpose of mining: to legitimize and monitor Bitcoin transactions, ensuring their validity. Because these responsibilities are spread among many users all over the world, Bitcoin is a "decentralized" cryptocurrency, or one that does not rely on any central authority like a central bank or government to oversee its regulation.

To earn bitcoins, you need to meet two conditions. One is a matter of effort; one is a matter of luck.

1) You have to verify ~1MB worth of transactions. This is the easy part.

2) You have to be the first miner to arrive at the right answer, or closest answer, to a numeric problem. This process is also known as proof of work. 

The good news: No advanced math or computation is involved. You may have heard that miners are solving difficult mathematical problems—that's not exactly true. What they're actually doing is trying to be the first miner to come up with a 64-digit hexadecimal number (a "hash") that is less than or equal to the target hash. It's basically guesswork.

The bad news: It's guesswork, but with the total number of possible guesses for each of these problems being on the order of trillions, it's incredibly arduous work. In order to solve a problem first, miners need a lot of computing power. To mine successfully, you need to have a high "hash rate," which is measured in terms of megahashes per second (MH/s), gigahashes per second (GH/s), and terahashes per second (TH/s).
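The guesswork described above can be sketched as a toy proof-of-work loop. This is a deliberate simplification (a single SHA-256 pass, a leading-zeros target, and made-up block data), not the real Bitcoin protocol, but it shows why raw hash rate is what wins:

```python
import hashlib

def mine(block_data: str, difficulty: int = 4):
    """Toy proof of work: try nonces until the SHA-256 hash, written as a
    64-digit hexadecimal number, starts with `difficulty` zeros -- i.e.
    falls at or below a target. There is no shortcut; it is pure guesswork."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

# Hypothetical block contents, just for illustration.
nonce, digest = mine("block of verified transactions", difficulty=4)
print(nonce, digest)
```

Each extra leading zero multiplies the expected number of guesses by 16, which is why real mining demands hash rates measured in terahashes per second rather than the few thousand hashes per second this loop manages.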

That is a great many hashes.

The rewards for bitcoin mining are reduced by half every four years. When bitcoin was first mined in 2009, mining one block would earn you 50 BTC. In 2012, this was halved to 25 BTC. By 2016, this was halved again to 12.5 BTC. On May 11, 2020, the reward halved again to 6.25 BTC. In November of 2020, the price of Bitcoin was about $17,900 per bitcoin, which means you'd earn $111,875 (6.25 x 17,900) for completing a block. Not a bad incentive to solve that complex hash problem detailed above, it might seem.
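The halving schedule quoted above is just repeated division by two, which is easy to check in a few lines (the November 2020 price figure is the one quoted in the text):

```python
def block_reward(halvings: int) -> float:
    """BTC reward per block after a given number of halvings (2009 start: 50 BTC)."""
    return 50 / (2 ** halvings)

# The schedule described above: 2009, 2012, 2016, 2020.
for year, halvings in [(2009, 0), (2012, 1), (2016, 2), (2020, 3)]:
    print(year, block_reward(halvings), "BTC")

# Dollar value of one block at the quoted price of about $17,900 per bitcoin:
print(block_reward(3) * 17_900)  # 111875.0
```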


The increasing popularity of Bitcoin mining quickly sparked a fresh debate on the energy use—and the resulting carbon footprint—of the Bitcoin network. Bitcoin mining devices require electrical energy to function, and all devices in the Bitcoin network were already estimated to consume between 78 and 101 terawatt-hours (TWh) of electricity annually prior to the latest surge in the price of Bitcoin (Figure 1). With a growing number of active machines, the network as a whole also requires more power to operate.

Having an estimate of Bitcoin’s future energy consumption also permits a ballpark estimate of the network’s future carbon footprint. To this end, the work of Stoll et al. [11] demonstrated that Bitcoin mining had an implied carbon intensity of 480–500 g of CO2 per kWh (gCO2/kWh) consumed. Assuming this number remains constant at 490 gCO2/kWh as the network’s energy demand increases, a total energy consumption of 184 TWh would result in a carbon footprint of 90.2 million metric tons of CO2 (Mt CO2), roughly comparable to the carbon emissions produced by the metropolitan area of London (98.9 Mt CO2). This number might be higher or lower depending on the locations chosen for Bitcoin mining. Although fossil-fuel-dependent countries like Iran have recently gained popularity as mining sites [9], miners might also try to leverage “greener” sources of power.
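The 90.2 Mt figure follows directly from the assumptions quoted above, as a back-of-envelope check shows:

```python
# Back-of-envelope check of the quoted figures: 184 TWh of annual
# energy use at an assumed carbon intensity of 490 gCO2/kWh.
energy_twh = 184
intensity_g_per_kwh = 490

kwh = energy_twh * 1e9                 # 1 TWh = 1e9 kWh
grams_co2 = kwh * intensity_g_per_kwh
megatonnes = grams_co2 / 1e12          # 1 Mt = 1e12 g

print(round(megatonnes, 1))  # 90.2 Mt CO2, comparable to metropolitan London
```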

Back to the beginning...
Humanity relies on two main streams of existence in a specific environment. This has been defined in the past as nature and nurture. In more explicit terms, our DNA dictates who we are and our social constructs define the stylistic understanding of our relationships.

Now, our ability to increase the complexity of some ventures has not eliminated our basic desires for a painless life and for happiness; meanwhile, we should not destroy the ORIGINAL joint. This is our little planet, with the evolution of life into its various inhabitants over 4.5 billion years, including the air quality, global warming and the pollution of the seas. 

Today (18/03/2021), we have been told that a team of scientists in Australia have created a human embryo from a bit of skin. Think about it. 


a new life...

An Australian-led team of scientists has used human skin cells to create an embryo-like structure, in a discovery that could spark debate on what constitutes life.

The team, led by researchers at Melbourne’s Monash University, reprogrammed skin cells into a 3D cellular structure similar to human blastocysts.

The structures, known as iBlastoids, will be used to model the biology of early human embryos in laboratory settings and underpin research on early miscarriages and IVF.

Previously, studies of early human development and infertility were restricted by the need to source scarcely available blastocysts from IVF procedures.

“iBlastoids will allow scientists to study the very early steps in human development and some of the causes of infertility, congenital diseases and the impact of toxins and viruses on early embryos,” research team leader Professor Jose Polo said.

It will accelerate the understanding and development of new therapies, he said.

‘Human’ debate

However, the discovery could raise questions about what it means to be human and if iBlastoids can even be considered “human”.

The Royal Institution of Australia, a scientific not-for-profit that publishes Cosmos magazine, said it could also prompt a review of regulations governing stem cell and cloning applications.

“It needs to be understood that the Monash team has followed the existing rules concerning stem cell and embryonic research to the letter,” editor-in-chief Ian Connellan said in a statement.

“It’s just that they’ve found a new way to create what is effectively an embryonic structure, without the traditional sperm-egg model.

“That, in itself, is quite amazing and opens up significant avenues of research ... as well as forcing a review of how current rules are applied.”


FREE JULIAN ASSANGE NOW !!!!!!!!!!!!!!!!!!