Decisions

Unlike cognitive judgments, affective feelings often cannot be avoided. One can control the expression of emotion, but not the emotion itself. Affect seems to be more easily noticed and recalled than thoughts, and less controllable.

Once feelings are created, they are less likely to be changed than cognitions. We rarely change our initial impressions of something because we trust our reactions. ~ Robert Zajonc

The driver of all decisions is desire, which is an emotive agent aimed at satisfaction. The very definition of the term rational is reasoning which pleases. The idea of “objective” rationality is laughable. That people share the same desires does not render cunning about achieving them objective. This is, instead, shared subjectivity.

Determining sound judgment is an exercise in hindsight. Beforehand, any risky decision may be considered unsound.

A calculated risk is simply an anticipated risk taken where the hope of gain wishfully outweighs the estimated probability of loss. Estimates never consider system dynamics, which are beyond ken; and anticipation relies upon the availability heuristic (that whatever comes to mind is a valid appraisal).

When making decisions, we focus on what we are getting and pay scant heed to what we are foregoing. Opportunity cost is an orphan to desire.

Worldview shapes decisions. In the social realm, the most compelling decisions come from looking through a moral lens.

Moral rules bind communities together, enable trust and the division of labor and cause people to behave honestly when no one is watching. Because these rules have such a crucial role in the formation and functioning of human social groups, we are obsessed with their violation. ~ Daniel Gilbert

When behaviors are described as moral violations, apathy transforms into action. Texas highways were awash in litter until 1986, when the state promoted the slogan: “Don’t mess with Texas,” whereupon littering became an insult to pride, and thereby greatly decreased.

Anticipation of pride consistently creates higher pro-environmental intentions than anticipated guilt. ~ American psychologist Elke Weber

Heuristics

People rely on a limited number of heuristic principles which reduce the complex tasks of assessing probabilities and predicting values to simpler judgmental operations. ~ Israeli psychologist Amos Tversky & Daniel Kahneman

A heuristic is a simple, efficient rule employed to form judgments or make decisions; in short, a rule of thumb.

Heuristic efficiency owes to adaptive evolution. The heuristics that organisms rely upon emanate from innate biological faculties, including the inherent structure of their minds. Humans have hundreds of heuristics. They variously work in concert, reinforcing, regulating, or counterbalancing one another.

Experience is a common heuristic. Spiders relocate their webs to catch prey based upon what has worked in the past. Their rules for web placement are not known, but it is known that experience enhances foraging efficiency.

Many animals rely upon sequential cues to make up their minds. Female sage grouse at a lek assess males first on the quality of their songs. Only if a male is appealing to the ear does a female investigate further. This is an example of an honest signal. Animals cannot fake the quality of their voices. Cricket chirps, frog croaks, and deer stag roars are other examples where virility is honestly expressed vocally. People also rely upon such honest signals to heuristically assess.

The ability of animals to make decisions about where to live is crucial to their fitness. ~ English biologist Simon Mugford

Scouts in colonies of the British ant Leptothorax albipennis look for new nests using a probabilistic geometric technique called Buffon’s needle. A scout explores a potential nest cavity via an irregular path that covers the area fairly evenly; a pheromone scent trail helps ensure a mathematically sound route. The ant then leaves. If the first test proves satisfactory compared with the reports of other scouts, the ant returns to run a different course through the cavity. The frequency with which this second path crosses the first scent trail indicates cavity size: the larger the cavity, the fewer the crossings. Though simple, the Buffon’s needle algorithm provides a remarkably accurate assessment of a cavity’s suitability as a nest.
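
A minimal Python sketch of the classic Buffon’s needle – needles of assumed length dropped across ruled lines of assumed spacing – illustrates the underlying logic: counting random intersections recovers geometric quantities. It is offered only as an analogy, not as a model of ant behavior.

```python
import random
from math import sin, pi

# Classic Buffon's needle: a needle of length l dropped onto a floor ruled with
# parallel lines spaced d apart (l <= d) crosses a line with probability 2*l/(pi*d).
# Counting crossings therefore recovers a geometric quantity from random sampling.
random.seed(0)
l, d, drops, crossings = 1.0, 2.0, 200_000, 0
for _ in range(drops):
    y = random.uniform(0, d / 2)       # distance from needle midpoint to nearest line
    theta = random.uniform(0, pi / 2)  # needle orientation
    if y <= (l / 2) * sin(theta):
        crossings += 1
print(round(2 * l * drops / (d * crossings), 3))  # ~3.14: geometry recovered from crossing counts
```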

Honeybees choose their nest sites based upon several criteria, including volume and height above the ground. Through conferencing to communicate relative advantages from information gleaned by different scouts, honeybees make fortuitous decisions about where to relocate their hive based upon a small number of visits by a few surveyors who individually may not have visited all the sites under consideration. Enthusiasm plays a significant role in decision-making.

Social insects have especially efficient heuristics for meeting their needs, from foraging to finding a home, that take advantage of their collective cooperation. A small segment of a population is trusted to make decisions that affect the very survival of the colony. It works because self-interests coincide.

Like other animals, primates favor rules of thumb that suit their mental makeup. The simple reason is that logical reasoning is mentally arduous.

Even modifying heuristics – switching rules – makes people prone to error. Shifting from a practically subconscious heuristic to a different method of reasoning demands an effortful symbolic transformation.

Switching the rules we use is mentally taxing and costly, which leads us to pay less attention to detail, and therefore make more mistakes. ~ American psychologist Hans Schroder

The heuristics humans use to assess probability are often valid, but they can just as easily be erroneous.

We do not see things as they are, we see them as we are. ~ American author Anaïs Nin

Representativeness

The innate tendency to compare and classify inclines the mind to figuring odds in a similar way: by comparative closeness. This heuristic is termed representativeness.

People assume that “like goes with like.” ~ Thomas Gilovich

The past is often indicative of the present and future. But the experience of similarity in categories, instances, and outcomes biases judgment by ignoring outliers: features and circumstances that lead to unusual occurrences. Such clustering can be a formula for being caught out.

Self-organized criticality in dynamic systems is exemplary. Many social systems, such as relationships, groups, and societies, perform as self-organizing gyres. Once such a system passes a threshold condition, situation “normal” is supplanted by a new normal. Anticipating a threshold breach requires not seeing the future as a continuation of the past; in other words, not relying upon representativeness.

Past performance is not necessarily an indicator of future returns. ~ financial market investment maxim

Financial markets exhibit self-organized criticality. Every decade or so there is panic in capital markets.

The essence of investment in a market-based economic system is optimism; whence the use of the term depression for describing both an economic malaise and the social-psychological milieu imbuing it.

While a prescient few see an economic crisis coming, most investors are flummoxed. This is the very reason for the periodic financial panics which have plagued the markets for centuries. (Like the use of the term depression, panic has economic, psychological, and sociological significance.)

With an innate inclination toward patterns, people simply do not expect events to be random. Coin flips have a 50–50 chance of heads or tails every time. Over a large number of tosses, the proportions of heads and tails do even out. Repetition of the same result – a seeming pattern of continuity – is as common as a seemingly random mixture. There is no law of small numbers. Yet people believe that some symmetry prevails in the small as well as the large; that a run of red on the roulette wheel means black becomes more likely. This tendency to see chance as self-correcting is the gambler’s fallacy.

Chance is commonly viewed as a self-correcting process in which a deviation in one direction induces a deviation in the opposite direction to restore the equilibrium. In fact, deviations are not “corrected” as a chance process unfolds, they are merely diluted. ~ Amos Tversky & Daniel Kahneman
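
A quick simulation (an illustrative Python sketch, assuming a fair coin) makes the dilution point concrete: the proportion of heads settles toward 50%, while the raw gap between heads and tails is not pushed back toward zero.

```python
import random

# Fair-coin flips: the proportion converges (dilution), but the head-tail gap is not corrected.
random.seed(1)
heads = 0
for n in range(1, 100_001):
    heads += random.random() < 0.5
    if n in (100, 1_000, 10_000, 100_000):
        print(f"n={n:>6}  heads share={heads/n:.3f}  head-tail gap={abs(2*heads - n)}")
```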

Not knowing how a process proceeds skews expectation. Seeing somewhat consistent results in a sample readily leads to a conclusion that something non-random is going on.

Our minds naturally categorize as a learning mechanism for predictability. Categorization is one aspect of working with sets: one of the few that comes naturally. Set theory is not innate, and so its attempted application can lead to logical errors, including conjunction and inclusion fallacies.

The representativeness heuristic influences many of our daily decisions. To judge the likelihood of something, we intuitively compare it with our mental representation of the category. ~ David Myers

The representativeness heuristic is a mental shortcut based upon stereotyping: the mind assumes a likeness based upon previous categorization and proceeds accordingly. Only with unambiguous individuating information which breaks the stereotype does the mind question whether the stereotypical assessment should be made.

3-year-olds make better decisions than 6-year-olds because the representativeness heuristic has yet to take hold. Younger children are parsing all the information available to them, not relying upon learned categorizations, where stereotyping may lead to errors by failing to incorporate individuating information.

Children around 4 years old are starting to use these shortcuts. By 6 years of age they’re using them at levels as high as adults. ~ Canadian psychologist Samantha Gualtieri

 Inclusion Fallacy

The inclusion fallacy is a variant of the conjunction fallacy. In the inclusion fallacy, people judge it more likely that every member of a broad category has a particular characteristic than that every member of one of its subsets does – even though the broader claim logically entails the narrower one. For instance, people think it more likely that every lawyer is conservative than that every labor-union lawyer is conservative.

○○○

The conjunction and inclusion fallacies result from representativeness bias, coupled to the fact that statistical probability assessment of sets is not innate, and cognitively clashes with inborn heuristics.

One reason people succumb to prejudices is because representativeness bias simplifies the task of social judgment. ~ Vivian McCann et al

Availability

Another opportunity to err in estimating probability arises from assigning odds or frequency by the ease with which instances can be brought to mind. The more recent and frequent, the more common something seems. This is the availability heuristic, which is useful but statistically unsound.

In one experiment, participants heard a list of well-known people of both sexes, and were subsequently asked whether the list had more men or women. Different lists were read to different groups. Some lists had more famous men than women; others, vice versa. Regardless of actual numbers, people consistently judged a list as having more men or more women according to which gender was represented by the more famous personalities.

At least partly because of the availability heuristic, people feel less safe in a plane than in a car, even though auto travel is statistically much riskier.

A related heuristic – imaginability – employs the ease with which an object or circumstance can be imagined.

The risk of an adventure is assessed by imagining contingencies which could not be coped with. If many difficulties are vividly portrayed, an expedition seems dangerous, even though the ease of imagining has no correlation with the probability of occurrence. Conversely, the risk of an undertaking is easily underestimated by failure to anticipate trouble, or if the dangers are difficult to conceive.

Commonly encountered instances are more easily recalled than those that are less frequent, likelier occurrences easier to imagine than unlikely ones, and our mental connection between events is strengthened when the events frequently co-occur. The availability heuristic works for estimating the numerosity of a class, the likelihood of an event, or the frequency of co-occurrences – all by the ease with which the relevant mental operations of retrieval, construction, or association can be performed. Alas, this trusted estimation procedure is prone to systematic errors.

Estimation

Many estimates are based upon an initial value that is adjusted to yield a final result. The starting point may be suggested within the problem, or an initial value may arise from partial computation. In either case, adjustments are inadequate. This owes to anchoring bias: different starting points yield divergent estimates based upon the initial value.

The anchoring effect refers to the situation in which an arbitrarily chosen reference point (anchor) significantly influences the decision makers’ value estimates, and the value estimated is insufficiently adjusted away from the reference point toward the true value of the target of estimation. ~ Taiwanese information analyst Chin-Shan Wu et al

 Product Estimation

Given only 5 seconds, 2 groups of high school students individually estimated the value of a multiplicative product.

One group estimated:

1 x 2 x 3 x 4 x 5 x 6 x 7 x 8

The other:

8 x 7 x 6 x 5 x 4 x 3 x 2 x 1

To rapidly answer such questions, people perform a few computation steps, and then extrapolate from there. Because adjustment is typically insufficient, this technique should systematically lead to underestimation. Further, as the first product set begins with lower numbers, its estimation should be a lower value.

So it was. The median answer for the ascending sequence was 512, while the median answer for the descending sequence was 2,250. The correct answer: 40,320.
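
The anchoring account is easy to check numerically. In this throwaway Python sketch, a few steps of computation yield very different anchors for the two orderings, and both anchors fall far short of the true product.

```python
from math import prod

ascending = [1, 2, 3, 4, 5, 6, 7, 8]
descending = ascending[::-1]

print(prod(ascending[:4]))   # 24    - anchor after 4 steps of 1 x 2 x 3 x 4
print(prod(descending[:4]))  # 1680  - anchor after 4 steps of 8 x 7 x 6 x 5
print(prod(ascending))       # 40320 - the true product, far above either anchor
```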

 Planning

Planning often involves multiple steps or stages. Successful outcomes depend upon accurate estimation, especially when a project is a series of connected (conjunctive) tasks. Estimating sequential events also suffers from anchoring bias.

 Marbles

An experiment tested anchoring bias by type of event. Participants could gamble on 1 of 2 distinct events. 1 of the 2 events was a simple one: drawing from a bag containing marbles, 50% of which were white, 50% red. The other event was compound – a series of elementary events – and came in 2 different varieties: conjunctive and disjunctive.

The conjunctive event was drawing a red marble 7 times in a row, with marble replacement each time, from a bag of 90% red marbles and 10% white marbles.

The disjunctive event was drawing a red marble at least once out of 7 tries from a bag with 10% red marbles and 90% white ones.

The 50–50 simple event had a 50% probability. The connected series (conjunctive) event had a 48% probability. The disconnected (disjunctive) event had a 52% probability.
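
These figures follow directly from the bag compositions, as a few lines of Python confirm (values rounded as in the text):

```python
p_simple = 0.5                      # one draw from a 50/50 bag
p_conjunctive = 0.9 ** 7            # red on every one of 7 draws (with replacement)
p_disjunctive = 1 - (1 - 0.1) ** 7  # at least one red in 7 draws (with replacement)
print(round(p_simple, 2), round(p_conjunctive, 2), round(p_disjunctive, 2))  # 0.5 0.48 0.52
```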

A significant majority preferred to bet on the conjunctive event, which had lower odds than the simple event. Conversely, the strong preference was for the lower-odds simple event over the disjunctive event.

The simple event is readily understood, and so provides an anchoring for evaluating the compound events via adjustment, which proved inapt. As the results showed, people overestimate the probability of conjunctive events, and underestimate the probability of disjunctive events.

○○○

In a conjunctive project, each event in the series must sequentially transpire. The general tendency to overestimate (the probability of achieving) conjunctive events leads to unwarranted optimism of timely completion.

Even when meeting every conjunctive milestone is highly likely, the overall probability of success can be quite low if there are many events. Any estimation error can have a domino effect.

The ‘law’ of sequential choice (or decision) is that if the number of stages is held constant, the relative overestimation of the chance of guessing correctly at all stages varies directly with a power of the number of alternatives per stage. If, however, the number of alternatives is held constant, the relative overestimation varies exponentially with the number of stages. ~ English psychologist John Cohen et al

Evaluations of risk typically involve disjunction. A complex structure or system will malfunction if any of its essential components fail.

Even when the likelihood of failure for any component is slight, the probability of malfunction increases with the number of working parts. Because of anchoring, there is a strong bias to underestimate the chance of failure in complex systems.

The chain-like structure of conjunctions leads to overestimation, the funnel-like structure of disjunctions leads to underestimation. ~ Amos Tversky & Daniel Kahneman
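
A small numerical illustration – hypothetical figures, not drawn from the source – shows the asymmetry: a chain of ten highly likely stages succeeds barely more often than a coin flip, while a system of a hundred highly reliable parts is more likely than not to suffer a failure.

```python
stages, p_stage = 10, 0.95   # conjunctive: every stage must succeed
parts, p_fail = 100, 0.01    # disjunctive: any single part failing brings the system down

p_project = p_stage ** stages            # ~0.60: intuition typically expects far more
p_breakdown = 1 - (1 - p_fail) ** parts  # ~0.63: intuition typically expects far less
print(round(p_project, 2), round(p_breakdown, 2))
```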

Subconscious Input

Decisions are not just the fruitions of conscious desires. The subconscious has its say.

You do things, and only later do you see why you did them, if you ever do. ~ English writer Julian Barnes

Familiarity

People prefer symbols that they have seen before, even when they do not recall seeing them. Similarly, sounds heard before are preferred to novel ones, even though there may be no overt recognition of previous exposure.

Preference precedes inference. Repeated exposure breeds a comfortable familiarity that engenders preference.

Baby names are fads that run on subconscious affect. Certain names become popular because people hear them, and those names later come back to mind when the occasion arises to name a newborn.

If people do not recognize that they thought of the name because it has become popular, they are likely to find it pleasing and original. ~ Timothy Wilson

The familiar subconsciously resonates as safe. This is why animals prefer their natal breeding grounds, even when they are degraded compared to readily available alternatives. It is also why humans choose mating partners with character traits familiar from childhood. The familiar may not necessarily be safe, but its familiarity makes it feel predictable, which gives an impression of security.

Intention

If a conscious intention or decision to act actually initiates a voluntary event, then the subjective experience of this intention should precede, or at least coincide, with the onset of the specific cerebral processes that mediate the act. ~ Benjamin Libet

In 1983, American physiologist Benjamin Libet experimentally found that neural activity precedes conscious awareness of a decision; in other words, intention registers physiologically before one knows of it.

Voluntary acts can be initiated by unconscious cerebral processes before conscious intention appears. ~ Benjamin Libet

This finding could not have been more controversial, as it seemingly brought into question the idea of free will.

We feel we choose, but we don’t. ~ English psychologist Patrick Haggard

Thoughts simply arise in the brain. What else could they do? The truth about us is even stranger than we may suppose: the illusion of free will is itself an illusion. ~ American philosopher and cognitive scientist Sam Harris

In dismissing free will, Harris ignores that the mind rules from the subconscious where intent lurks, rising from the pool of desire filled by experience. We may not be in complete control of our mind, but that does not mean the mind does not attend to our interests. Others criticized Libet for his experimental technique; but later experiments came to the selfsame conclusion: cerebral activity precedes awareness of volition.

The mind-body is an entangled complex which is witnessed by consciousness. That the mind acts as an independent agent is well-established, as amply illustrated by nattermind. That mind-brain activity begins before our mind bothers to consciously inform is unsurprising.

What Libet’s and others’ results show is that we are not our mind-bodies. Instead, consciousness is confined within a willful mind-body.

Choices are the fruition of desires, which derive partly from biology, and partly from what our mind informs us is possible: our environmental context. That notwithstanding, we obviously make decisions which may be fateful.

To claim free will or deny it is silly: a prisoner of the mind is never truly free, but neither is one chained to a train running on a predetermined track. Life is not a black-and-white experience. The technicolor pliability of the mind’s complex workings is what makes living such an opportunistic challenge.

Affect

Opinion is ultimately determined by the feelings, and not by the intellect. ~ Herbert Spencer

Affect lays a heavy hand on the guiding rudder of decision. We choose or do what we like, and only feel the need to rationalize the decision if it does not work out as expected.

Relying on an affective impression can be far easier, and more efficient, than weighing the pros and cons or retrieving from memory many relevant examples, especially when the required judgment or decision is complex or mental resources are limited. This characterization of a mental shortcut leads to labeling the use of affect as a heuristic. ~ American psychologist Paul Slovic

The affect heuristic is emotive decision-making. Reasoning is secondary to attraction or repulsion toward alternatives. The affect heuristic is a mental shortcut: essentially “going on gut instinct.”

Positive inclinations are often adjudged as high benefit at low risk. Conversely, an averted alternative is typically perceived as risky, with marginal benefit.

Correlation & Causality

The human understanding supposes a greater degree of order and equality in things than it really finds. ~ Francis Bacon

The inclination to impute order is built into the human mind: finding patterns is one of the mind’s favorite pastimes.

While the predisposition to patterns and connections leads to discovery, it also creates a strong bias to see correlation from coincidence and causality from correlation. This creates the basis for judgments and decisions from faulty information.

Randomness is a difficult notion for people to accept. When events come in clusters and streaks, people look for explanations and patterns. They refuse to believe that such patterns – which frequently occur in random data – could equally be derived from tossing a coin. So it is in the stock market as well. ~ American economist Burton Malkiel

We can exploit predictable phenomena. Randomness bestows no such leverage. Hence the inclination to see order where none exists.

The innate proclivity to find relations is a primary vehicle for learning, as associations are the basis for augmenting facts into a framework of knowledge. The drawback to this inclination is the difficulty in assimilating novel facts or new paradigms.

Learning new schemas becomes more difficult with age, as experience cumulates. That is why radical discoveries and innovations are so often the province of the young, whose minds are not so vested in convention, even if convention is only of one’s own making.

Associative connection between events is strengthened when the events repetitively co-occur. Coincidence is discounted as correlation is strengthened. If one event precedes another, causality is foisted upon correlation, especially if some linkage can be conceived.

Our difficulty in accurately recognizing random arrangements of events can lead us to believe things that are not true – to believe something is systematic, ordered, and “real” when it is really random, chaotic and illusory. ~ Thomas Gilovich

In coming to a conclusion of causality over repeated coincidences, invisible intermediaries are easily ignored. This is part of the mind’s inherent dislike of complexity in favor of simple linkages.

Causality commonly comes from counterfactual simulation. The mind imagines what might have been had the suspected causal agent not intervened: what did not happen shapes how we explain what did.

 Global Warming

The atmospheric warming effects of industrialization and urbanization have been noticeable since the early 20th century. In this case, causality was clear.

In the 1st decade of the 21st century, the consequences of global warming became increasingly apparent, even as the rise in global air temperature languished for over a decade. This seeming pause gave skeptics ammunition to assert that climate change was part of the natural planetary cycle, not the outcome of human activity. The invisible intermediary in this instance was the Pacific Ocean, which was absorbing tremendous heat, thereby giving temporary respite to the ascent of atmospheric temperature.

Man-made climate change, most pronounced by global warming, is graduating from apparent to relentless. Its public recognition (or lack thereof) and the response (or lack thereof) speak volumes about how people can deny causality to great collective peril.

Accentuate the Positive

Willingness to base conclusions on incomplete or unrepresentative information is a common cause of people’s questionable and erroneous beliefs. ~ Thomas Gilovich

Many beliefs are formed upon affirmative association between 2 variables. Repeated coincidences confirm suspicions. A belief is born.

The issue of proof via sufficient evidence is seldom worried over. Anecdotes are enough to suggest that there is a link between 2 things. That, as children, we successfully learned new words from sparse examples reinforces our comfort with links that lack statistical foundation.

The common belief that we are more likely to need something once we have thrown it away is surely nonsensical, even as it is easy to think of times when it has been true.

The habit of letting a few instances serve as proof shows how people are comfortable with statistically inadequate thresholds to establish their beliefs. Sample-size and data-set quality are abstractions in light of experience that typically offers only a finite number of examples upon which to form a belief.

It is the peculiar and perpetual error of the human understanding to be more moved and excited by affirmatives than negatives. ~ Francis Bacon

Information which confirms is more easily assimilated than information that does not. Put conversely: information that fails to confirm is not nearly as easily assimilated as information which does.

The previous 2 sentences made the same statement. The first was easier to comprehend because it was positive.

It is easier to understand “humans are emotional” than “non-humans are not emotional.” Affirmative statements are more readily processed than negations, which is one reason confirmatory information outweighs disconfirmation framed as negation.

Placing greater value on positive instances makes it easier to make associations that are not there. This is furthered by the tendency to focus on seeking confirmation of a hypothesis: favoring confirmation over information which overturns a budding belief.

The framing of a relationship affects judgment. When evaluating similarity, dissimilarities are ignored, and vice versa.

Confirmation bias works in the instant milieu to produce an answer that affirms. Because of confirmation bias and the influence of affect, balanced evaluation which yields an accurate correlation is an arduous task.

Confirmation bias is a powerful and all-too-human tendency. ~ American psychologist Robert Johnson et al

 Proof-Positive Cards

In one experiment participants were shown 4 cards, each of which had a letter or number facing up: A, B, 2, and 3. They were told that the cards had a letter on one side and a number on the other.

Participants were then asked, by judiciously turning over the minimum number of cards, to determine whether “all cards with a vowel on one side have an even number on the other.”

The common response was to turn over the A and 2 cards. The 2 card was uninformative: a vowel on the flip side would confirm, but a consonant would be irrelevant.

The 3 card was rarely turned over, even as it could be dispositive: a vowel on the other side would disprove the hypothesis.
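
The logic of which cards matter can be spelled out mechanically. This illustrative Python sketch marks a card as worth turning only if some hidden face could falsify the rule:

```python
# Rule under test: every card with a vowel on one side has an even number on the other.
# A card is informative only if a hidden face could falsify the rule.
def worth_turning(face_up: str) -> bool:
    if face_up in "AEIOU":            # visible vowel: an odd number behind would falsify
        return True
    if face_up.isdigit():
        return int(face_up) % 2 == 1  # visible odd number: a vowel behind would falsify
    return False                      # consonants (and even numbers) can never disconfirm

for card in ["A", "B", "2", "3"]:
    print(card, "turn" if worth_turning(card) else "skip")  # A: turn, B: skip, 2: skip, 3: turn
```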

Besides illustrating bias toward confirmation, this experiment showed that the tendency to seek information consistent with a hypothesis does not stem from an interest in its truth. Instead, we blithely accentuate the positive.

Economic Decisions

If markets were rational, I’d be waiting tables for a living. ~ American billionaire investor Warren Buffett

Capitalist economic theory hinges upon the assumption that humans maximize material gain. This axiom applies only to those consumed by greed, which is a severe mental illness. Instead, people try to maximize their enjoyment, albeit tinged by fear of the future, in a world where economic security is a rarity, even in the richest countries. Practically, materiality is merely a means to avoiding hardship: a convenience.

Aversion to loss is a strong bias. This leads investors to hold on to financial assets that are losing value. Conversely, to lock in gains, assets that are gaining value are sold.

A person who has not made peace with his losses is likely to accept gambles that would be unacceptable to him otherwise. ~ Daniel Kahneman & Amos Tversky

More generally, human probability evaluation is skewed by faulty risk assessment.

In expected utility theory, the utilities of outcomes are weighted by their probabilities. Choices among risky prospects exhibit several pervasive effects that are inconsistent with the basic tenets of utility theory.

In particular, people underweight outcomes that are merely probable in comparison with outcomes that are obtained with certainty. This tendency, called the certainty effect, contributes to risk aversion in choices involving sure gains and to risk seeking in choices involving sure losses. Overweighting of low probabilities may contribute to the attractiveness of both insurance and gambling. ~ Daniel Kahneman & Amos Tversky
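
As a toy numerical sketch – not Kahneman & Tversky’s formulation, and with hypothetical payoffs – a value function that is concave for gains and convex for losses reproduces the pattern described: a sure gain is preferred to an equal-expectation gamble, while for losses the gamble is preferred.

```python
from math import sqrt

# Toy value function in the spirit of prospect theory: concave for gains, convex for losses.
def value(x: float) -> float:
    return sqrt(x) if x >= 0 else -sqrt(-x)

sure_gain, gamble_gain = value(500), 0.5 * value(1000)    # 22.4 vs 15.8 -> take the sure gain
sure_loss, gamble_loss = value(-500), 0.5 * value(-1000)  # -22.4 vs -15.8 -> take the gamble
print(round(sure_gain, 1), round(gamble_gain, 1), round(sure_loss, 1), round(gamble_loss, 1))
```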

 Lotteries

Lotteries exemplify how badly people assess risk, and how emotions are fueled by the imagination. The California lottery lowered the probability of winning in order to raise jackpots, which proved a boon to ticket sales. This stratagem became a trend among state lotteries across the nation.

One US lottery, Powerball, keeps the odds of winning minuscule by paying out the jackpot only when all 6 drawn numbers are matched. The jackpot accumulates until then. As the advertised payout increases, more tickets are sold. Fools like to dream big.

○○○

Another bias comes in valuing items in one’s possession higher than those not owned.

The prevalence of the purchase of insurance against both large and small losses has been regarded by many as strong evidence for the concavity of the utility function for money. Why otherwise would people spend so much money to purchase insurance policies at a price that exceeds the expected actuarial cost? ~ Daniel Kahneman & Amos Tversky

There are inconsistencies in risk-aversion preferences. The trade-off between insurance coverage and deductibles is exemplary.

People often choose limited coverage with a low or zero deductible over a comparable policy that offers full coverage, albeit at a higher deductible. For items worth insuring, the policy not preferred is the more logical choice: insurance matters most for losses too large to absorb, while small losses are better self-insured through a deductible.

In simplifying choice between alternatives, people often disregard similarities and focus on differences. Because such factor analysis can be done in more than one way, spotlighting differences can produce inconsistent preferences.

In many instances, expected utility overruns what a disinterested observer would call rational. Desires involve emotional attachments, biases, and subjective assessments that are beyond reason. The human world as it exists is ample testimony to that.

The reliance on heuristics and the prevalence of biases are not restricted to laymen. Experienced researchers are also prone to the same biases when they think intuitively. ~ Amos Tversky & Daniel Kahneman

As representativeness and availability are generally useful, their inclusion in the repertoire of intuitive methods is understandable, even as they can lead to errors in prediction or estimation. In contrast, more sound statistical rules simply do not sink in. Many are educated on the fundamentals of statistics: regression to the mean, the effect of sampling size on variability, and confidence intervals. But the human mind is simply not set up for statistical analysis the way it is for categorization, frequency recall, and best-guess anchoring, which can quickly render rough approximations that people treat as reliable.

Industrialization raised human environmental, economic, and social effects to a new order of magnitude, and created a need for statistical risk analysis which overwhelms the heuristics that served people well in earlier ages. The human mind is ill-equipped to deal with the consequences of what industrial technology has wrought.