A-STYLE | MEDIA MATTERS

The Taxonomic Effect of Corpse Embalming & the American Vampire Panic on Modern Pop-Cultural Vampires

10/13/2022

            In terms of our modern picture of monstrosity, a triumvirate of supernatural creatures reigns supreme: witches, werewolves, and vampires. Each has permeated pop-culture to the point that they are often discussed as if they were real creatures with specific traits that biologically define their place as a species within the pantheon of Nature. Of the three, the average layperson is most confident discussing mythic origins when the subject is vampires. The modern vampire story, they say, clearly comes from Bram Stoker’s Dracula, which in turn came from old stories and folklore regarding the real aristocrat it was quite apparently based on.
            So, is that a wrap then? The mysterious origins of one of pop-culture’s most pervasive supernatural creatures gets solved, just like that?
            Of course not; things are never allowed to be that simple in a horror story.
            It is absurdly easy to debunk the idea that Stoker’s Count Dracula is more than passingly related to the legendary roots so often attributed to him. While the ‘little dragon’ of Wallachia, a vicious prince who grew into the infamous Vlad the Impaler, lent his childhood nickname to the grand work Stoker was embarking on, Prince Vlad Tepes gave little else to the novel. In fact, Vlad’s apparent demotion from Governing Prince to backwater Count in Stoker’s notes far predates the character taking his name—according to Peter Haining, the rank was one of the first decisions Stoker made regarding his character, and it comes from Stoker’s own reading of Elizabeth Grey’s 1828 serial “The Skeleton Count; or, the Vampire’s Mistress” (Stoker, et al. 2008, p.21; Miller 2006). The name Stoker chose for his central figure is one he apparently pulled out of a book on Romanian history, sometime around the summer of 1890, and his own notes display exceedingly little awareness of the historical figure, as Stoker was under the impression that the word simply meant ‘devil’ in the Wallachian language (Stoker, et al. 2008; Miller 2006)1. The name Transylvania likely came out of the same imperfect book, plucked without context as a suitably exotic word for the story’s setting, one that Stoker found more palatable to his potential readers than ‘Wallachia’ (Miller, 1999).
            Where did our modern vampire really come from, then?


Origins of the Vampire



            The word ‘vampyre’ first appears in English in 1732, in a London periodical reporting on an Austrian investigation into a spate of strange happenings in Eastern Europe (Miller, 2006). The word derives from the Serbian ‘vampyri’, simply meaning ‘back from the dead’, and made its way west through German via reports by Hapsburg health officials like Kameral Provisor Frombald (1725) and Field Surgeon Johannes Flückinger (1731), along with the sensationalized gossip that spread through a number of adjacent countries (Wilson, 2020; Groom, 2018).
            The unfortunate subjects of these reports, men like Petar Blagojević and Arnaud Paole, were both victim and villain—victim due to a tragic death of their own, and villain as they imparted the same tragedy of sudden death upon others in their villages. Blagojević’s widow made the first claims of her deceased husband returning to “set himself upon and throttle” 10 villagers over 10 successive nights, all of whom suffered a brief and mysterious illness before dying themselves (Wilson, 2020). Likewise, Paole, after breaking his neck in a wagon accident, purportedly proceeded to spend the next 40 nights tormenting 4 villagers, all of whom then died of a mysterious illness (Doughty, 2021).
            From a literal historical perspective, there are a number of rationales behind what catalyzed the European Vampire Panic out of an epidemic of vague revenant rumors and age-old superstition. Towards the end of the 17th Century, cemeteries were filling up—leaving the dead competing for space with living inhabitants as proto-industrialization tipped the balance towards rapid urbanization—which led to longer waits before a body’s final interment, brought people already intimately familiar with death all the closer to it, and allowed for quick, direct comparisons of decomposition rates between various uninterred corpses (Miller, 1999). Furthermore, the Age of Enlightenment had brought forth the Scientific Method and generated a new sort of faith in the idea that science could solve problems once thought to be of magical origin—which meant that in the evermore rare case that science failed to explain a threatening phenomenon, adherence to superstitious customs redoubled (Wilson, 2020; Miller 1999). So when Frombald and Flückinger were sent by the Hapsburg court to uncover the disease or other such scientific cause behind this silly vampire panic, and they proceeded to fail at demystifying the issue while the folk remedies managed to generate an illusion of eliminating the threat (as in, an action was taken and, shortly afterwards, people stopped dying—close enough in time to apparently correlate the result with the action being taken)2, the folk story was elevated to near-fact.
            At the same time, the mid-1720s and early-1730s both saw particularly severe spikes within a brutal climatological shift known as the Little Ice Age (Knight & Harrison, 2014). The winters of this period were utterly ruthless and the summers weren’t much better, with sudden snowstorms sporadically causing delays and damage through July across the globe. The consequences of this global cooling are manifold and cannot be overstated, regarding nearly any inquiry of the period. In terms of vampires, this unexpected freeze had several significant impacts. Firstly, the extreme and persisting cold simply preserved bodies for far longer than generally expected. Secondly, the string of forbidding winters and poor growing seasons created a pervading shortage of raw calories and nutrition, which lowered immune responses and let disease ravage families and whole communities over the course of several weeks—a timeline that conveniently sits within the average duration of a localized vampire panic.
            This lack of nutrition additionally affected the condition of the corpses while they awaited interment. Corpses without much adipose (fatty, subcutaneous tissue) generally begin to decay more slowly than those with an abundance, but in cold conditions, rigor mortis often creates microscopic tears in the muscles being contracted with such tears being most severe in cases where the ratio of muscle to adipose is greatest, which then accelerates the subsequent autolysis (essentially, the liquefaction & dissolution) of muscle tissue (Almulhim & Menezes, 2022). This means that those who performed manual labor and failed to eat enough to gain fat deposits began to decompose more quickly than someone slightly better off.

            Yet another consequence of the Little Ice Age’s chill on decomposition at the time can be seen in delayed putrefaction due to the presence of adipocere (corpse wax). Adipocere forms from fatty tissues subjected to anaerobic environments with significant temperature differentials (Almulhim & Menezes, 2022; Doughty, 2018). Though, generally, a warm environment promotes the development of substantial adipocere leading to saponification (essentially, the petrification of an organic body into a soap-like substance rather than stone), in cooler circumstances adipocere may form in smaller amounts under the skin, leading to a grayish, waxy appearance, along with slowed autolysis, irregular presentation of livor mortis (significant discoloration of a corpse as blood settles), and delayed putrefaction (essentially, organ liquefaction and connective tissue dissolution) (Doughty, 2018). Irregular livor mortis could imitate rosy cheeks and fresh bruising, sufficient to appear ‘life-like’ to a personally offended victim who believes they ‘fought the vampire off’ and who presents the lively freshness and new bruising as evidence to an already half-convinced audience. And delayed putrefaction not only gives the appearance of a lack of decay, but may lead to uneven decay in which the abdominal cavity bloats while the skin remains taut and pliable enough to prevent immediate rupture or perforation, generating the imitation of a stomach over-full in gluttony after a massive feast (Almulhim & Menezes, 2022; Doughty, 2018). All of these elements tie into the vampire fear as ‘evidence’ because the threshold for establishing viable correlation would be exceedingly low when people attempt to shape their observations to suit a pre-determined outcome. The jump from viable correlation to plausible causation would be a comparatively small step.
            Additionally contributing to this apparent lack of decay might be the lead and arsenic contaminants in preserved food, like soldiers’ rations, and in the predominant makeup of the time—things that a merchant traveler like Blagojević or a former soldier like Paole might’ve encountered more readily than their fellow villagers. Their relative positions within society and the comparatively itinerant lifestyles of their occupations both constitute yet another factor behind what isolated them as individuals more likely to be viewed as vampires. Folklorists like Jeffrey Jerome Cohen have made extensive study of where the concepts of monstrosity come from and what leaps of logic take such an idea from vague shadows haunting the background to a solid incarnation of reality. Cohen notes that “monstrosity dwells at the gate of difference”—it’s significant enough an idea that it earns the fourth spot among Cohen’s Seven Theses (1996, p.7). Neuroendocrinologists investigating the social consequences of psychology and neural biology, like Robert M. Sapolsky, cite the Us/Them-ing instinct as one of the most fundamental pieces of how nearly all biological entities with higher-order intelligence formulate even the most basic kinds of cooperative groups. Sapolsky elaborates that humans take the Us/Them-ing notion to a level beyond the pale, constructing view-points of pseudo-speciation on a regular basis, such as when wartime combatants refer to the opposing side only as ‘the enemy’ or ‘vile dogs’, etc. (2017). Pushing pseudo-speciation a step further by taking the Other, already discussed in terms rendering them as less than human, and declaring them to be a literal monster in the form of a vampire, is honestly par for the course.
Cohen asserts this notion forcefully, explaining that not only is the very concept of the monster an exacerbation of instinctive othering, but it exists as a reflexive fear-mongering within a given community to such an extent that they will proactively generate an Other in times of stress, saying, “the monster is difference made flesh, come to dwell among us” (Cohen, 1996, p.7).
            Blagojević and Paole and countless other accused vampires like them may have only been considered outsiders at a stretch, but when their communities were facing an intangible, unexplainable stressor without a clear cause or enemy to fight, the boundaries of the communities’ ‘Us’ contracted to leave them as the definitive Other. The rise of mass media, and the rapid dissemination of consistent, quotable accounts that it enabled, allowed the individual cases of Blagojević and Paole to permeate the collective awareness of society in a manner previously unimaginable. Thus, these cases gave rise to the Vampiric Condition as a distinct malady, which in turn created the vampire as a discrete creature, out of a simple word for ‘back from the dead’ in the local language, eclipsing all other forms and traditions of variegated revenant as its specific sensationalism grew.

Taxonomic Considerations

            This sort of rapid-transmission language divergence can be seen in the codifying of plenty of other mythical creatures, as words for what were once considered nearly the same condition of being back from the dead began to pop up in English as a proliferation of discrete, uniquely definable entities. The homogeneous ‘unquiet dead’ of countless cultures became distinctive creatures by name, as can easily be demonstrated by running the English words for them through as inelegant a tool as Google Translate. Banshees, poltergeists, ghosts, revenants, and numerous others have all become distinct entities, but their original qualitative variation was minimal and, quite likely, simply layered over with discerning features by English speakers in a process of semantic nomination (Sapolsky, 2017; Burke 2000). This is particularly notable within the English language, and of particularly prolific occurrence during the mid-18th Century, due to the rise of codified taxonomic conventions and the concurrent impetus for encyclopedic surveys with detailed categorization (Burke, 2000). While the categories are arbitrary and the delineations are formed of criteria determined seemingly at whim, the sorting scheme of the mystic zoological diaspora came into being as part of this explosion of scientific specificity, and the predominance of English reflects the increasing dominion of the British Empire over an astronomical number of aspects of daily life for the average person regardless of national origin.
            Sticking exclusively with vampires, however, it must be noted that while the case studies of Blagojević and Paole do somewhat resemble our modern pop-culture vampire, they do so only in the broadest strokes. Frombald’s report on Blagojević records the corpse as ‘completely fresh’ and notes that traces of blood were observed about his mouth, apparently freshly sucked from the victims—despite the fact that Blagojević’s victims reported that he’d attempted to strangle them (Wilson, 2020). The earliest mention of a true vampire’s bite, with the bite itself being the primary (though not yet exclusive) means of transmission for the vampiric condition, is the Flückinger report on Arnaud Paole, wherein the surgeon’s secondhand account describes that Paole, while alive, had bragged about his clever means of protecting himself from a vampire bite, suffered while serving in the army abroad in Kosovo, by consuming dirt from the vampire’s grave and smearing himself with its blood (Doughty, 2021). Like Blagojević, Paole’s corpse appeared fresh and had smears of blood around his mouth and nose—despite, also like Blagojević, having been reported to strangle his victims rather than bite them (Doughty, 2021). Both Blagojević and Paole were staked through the heart, though Paole was additionally decapitated and burned (Wilson, 2020; Doughty, 2021).

            While the means of dealing with a vampire, once identified, fits mostly within our modern picture of a vampire, the trifecta of staked, decapitated, and burned serves to kill just about everything that goes bump in the pop-culture night. Many cases of vampires from the European panic didn’t even require such involved disposal procedures—plenty of vampires were simply re-interred after having their bones ceremonially rearranged, often by having their skulls or feet repositioned and inverted (Bell, 2014; Groom, 2018). Furthermore, the vampiric strangling of victims is certainly a bit of a swerve from our modern pop-culture picture. The list of identifiers used to find a vampire is also peculiar. While a corpse that doesn’t appear to decay and seems to have fresh blood at the mouth does suit the modern picture, the sum total of the qualifiers list is rather light to modern eyes.

            What makes a modern vampire is a much longer list of attributes. Stoker’s own notes pull in a couple of significant details, such as the fact that vampires have no reflections (or potentially even shadows), must be carried or led across thresholds into a residence (i.e., with direct consent), never eat or drink normal sustenance, demonstrate enormous strength, frighten traditional predators like wolves, can only cross running water if it’s tidal (and even then, only at full slack or high tide), and have gleaming white teeth. Some attributes of the vampire created within Stoker’s notes, such as Count Dracula’s control over rats and his ability to grow or shrink in size at will, don’t jibe with the fully modern vampire, but those are traits that can mostly be dismissed as facets of Stoker’s own invention, things that never managed to assimilate into the broader folk story as curated by pop-culture.
            Beyond Stoker’s notes for the character of Dracula, the modern pop culture vampire displays a few other critical traits. They have immense wealth and the grand style such wealth supports, along with aristocratic manners and aloof, worldly airs. They are youthful and beautiful, with flawless skin and perfect figures and particularly alluring features—to a degree that engenders an almost hypnotic thrall, if not simply gifted with the magical ability of mental enchantment. They have an insatiable, ravenous need for human blood. They can transfigure themselves, most typically into the form of a vampire bat. And for some reason, they’re dramatically averse to garlic.
            None of these features stem from the reports made by Frombald and Flückinger, but they are specific and ubiquitous enough that they must have come from somewhere to have so thoroughly pervaded the modern understanding.

American Exceptionalism

            The answer to where exactly the modern features first developed can be found in the isolated hamlets of the American Northeast. New England had its very own spate of vampire panics, beginning just shy of a century after the uproar began in Eastern Europe. Running from Sarah Tillinghast in 1799 to Mercy Brown in 1892, the American Vampire Panic was rather short-lived (Doughty, 2022). It was also confined to the rural, predominantly white communities of Rhode Island, Maine, Connecticut, and Pennsylvania, with a smattering of other incidents within the confines of New England (Bell, 2013). Despite the apparently limited scope and scale, however, the American Vampire Panic contributed tremendously to our modern vampiric ideal.
            Within America’s rural, northeastern communities, the ‘intrinsic Other’ Sapolsky discusses as the contingent of a group most likely to be designated as blame-holder in times of strife wouldn’t have been the traveling merchant or the retired soldier. Merchants were the lifeblood of such communities, and soldiers were still predominantly viewed as the protectors that allowed the fledgling Republic to remain free of England’s royal rulership. Instead, the ‘Other’ in these communities would have been the odd person who could afford to have European tutors brought in to personally school their children, as was the case of the Brown Family in the 1880s (Bell, 2013). This shift of automatic blame to those with posh, Europeanized upbringings is likely the bulk of what has contributed to our modern view of vampires as wealthy, stylish, and puffed up with aristocratic mannerisms3.
            Contact with American First Nations added rejuvenating spark and new flavorings to the run-of-the-mill revenant of European vampires. The plethora of tribes present in New England, such as those of the Iroquois Confederacy, the Mi’kmaq, the Algonquin, the Wampanoag, and the Mohican tribes, presented an array of stories with overlapping details that were primed for intermixing with the new Americans’ old stories of the vampire. The Wendigo stories are perhaps the most well-known to modern American audiences, having featured in some variation within everything from TV shows like Buffy the Vampire Slayer (1997-2003) and the CW’s Supernatural (2005-2020), to movies like Ravenous (1999) and Blade (1998) (and all their spin-off media). The Wendigo is a mythological creature that, as best we can tell, originates with Algonquin-speaking First Nations Tribes, such as the Woodland Cree and the Ojibway, and it stands as a supernatural evil bent on the consumption of human energy, flesh, and lifeblood, haunting the night with an insatiable hunger and an indelible cunning, unable to survive on, or even tolerate consuming, other forms of sustenance such as bread, fruit, milk, or water (Lenhardt, 2016). Prior to the appearance of American Vampires, there appears to be no mention of such ravenous desire for human anything in the vampire lore—the European vampires appear to have terrorized their communities for more petty and nebulous reasons such as personal revenge or generic, Satan-spurred evil-doing, if for any reason at all (Groom, 2018). The cunning of the Wendigo also seems to have added the concept of keen intelligence, compounding the notion of worldly learnedness brought in via the othering of those educated by European tutors.
            Beyond Wendigos, the lore of First Nations Tribes is peppered with creatures that resemble vampires, with far more attributes in line with the modern picture than the European vampires alone present. The non-profit behind native-languages.org specifically details several vampire-like creatures specific to the New England region, including the Chenoo and the Giwaka (both of which are sometimes considered Wendigo variants), along with the legendary Rolling Head (with the powers of flight and an exclusive weapon of a bite), the Skadegamute or Eastern Two-Face (with super-strength, nocturnal habits, and hypnotic thrall), and Skin-Walkers4 (with transfiguration abilities, hypnotic powers, a stark intolerance of normal food leading to a ravenous thirst / hunger for human-sourced nutrition, and a daylight corpse-state).


            Furthermore, First Nations legends extensively represent streams and rivers as uncrossable barriers to supernatural creatures and evil spirits of all kinds. Robert Hall explores all sorts of sacred barriers, but he lingers most obviously on the concept that such entities “would not cross a stream […] ‘no matter how small’,” as reported by Alice Fletcher and Francis la Flesche in 1911 (Hall, 1976, p. 361). The running water element of modern vampires seems to derive from this First Nations belief, including the manner in which native lore holds that most lakes and still waters are home to disquiet spirits rather than serve to protect humans from them. The ability of such creatures to cross tidal reservoirs at extreme high and low tides may derive from a number of fishers’ tales, of both European tradition and American invention, possibly with the additional influence of First Nations’ seaside tales, though none I could find were isolated or exclusive enough to conclusively determine any direct interaction.
            All of these elements infuse the vague whispers of the European vampire with the specific traits and particular taxonomic specificity familiar to us in the modern pop-culture picture of a vampire.
            Additionally, while in the midst of categorically blaming a well-off, Europeanized upbringing for inviting vampirism, and shamelessly poaching local lore, American townsfolk pulled bits of European lore, vampiric and otherwise, apparently at near random, and simply attached them to the new taxonomically discrete entity of the vampire. The aspect of soulless creatures lacking a reflection can be found in folklore from all over Europe and even well beyond it, from the Swedish tradition of turning away mirrors, through all manner of coverings, veils, and drapery present in Japanese funerary traditions. Likewise, garlic has been a repellent of all manner of unspecified evils since the early days of Egypt—by 3700 BCE, it was a staple in both medicinal employ and religious ceremony (Petrovska, et al., 2010). And being that garlic is a natural anti-coagulant that contributes to the removal of fat from the human bloodstream, with an ability to genuinely provide immune support and well-documented antibacterial properties, it’s plausible that copious use and consumption of garlic legitimately reduced the odds of vampiric observation by improving the health of communities and simultaneously diminishing the chance of an uninterred corpse experiencing delayed putrefaction.
            Garlic could not completely remove the risk of vampires, however, as the frigid ravagings of the Little Ice Age were still in full swing as late as 1883. Records from Pennsylvania, from the time leading up to the majority of American vampire incidents, paint a foreboding picture of frankly miserable winters, with horse drawn sleighs regularly crossing the solidly frozen-over Delaware River from December to nearly April, occurring at sporadic intervals amid more normal weather patterns (Pennsylvania Weather Records, 1644-1835).
            And, as if the presence of what factors instigated the European Vampire Panic wasn’t enough to catalyze an American Redux, 19th Century America was facing a new wave of experimental innovation that bred a mistrust of science as a whole and encouraged a doubling down of faith in folk traditions. This aspect of the American Vampire Panic is the critical final piece that introduces the truly modern vampire by way of integrating the lore with the fear of the newfangled science of embalming corpses.

The Historical Development and
Impact of Chemical Embalming


            The modern practice of chemically embalming dead bodies began to take shape in the 1780s, though it began more as a science experiment than as a shift in spiritual or cultural practice. In fact, it appears that as part of the effort to reduce over-crowding within urbanizing areas, there was a “rural cemeteries” movement that connected the body more thoroughly to nature and the natural process of decay than ever before, especially among Protestant communities of the American Northeast, which effectively ostracized the concept of embalming before it was even properly introduced (Laderman, 1996). While much of Europe began to integrate chemical preservation of a body into funerary practice, stark resistance flared up across the entire spread of American settlements, and reliance on “the simple rudiments of refrigeration” lasted well into the Civil War (Habbenstein 1962, p.316). This resistance appears to stem largely from the purity demanded of a “proper Christian funeral”, considered tantamount to living without willful sin in regards to granting permission for the deceased to enter Heaven. Within the rural American communities most affected by the Vampire Panic, up to 85% of the population was considered ‘unchurched’ by Puritan standards, and the hybridization of Christian values with ‘folk superstitions’ lent a uniquely dire weight to the propriety with which rituals were accomplished (Bell, 2006). The need for a truly “proper” funeral supported by purity and piety combined with the newfound impetus to understand decay as Godly Nature at work, as forcefully promoted by city officials seeking to alleviate urban over-crowding.
Together, these motivations reinforced the idea that chemical preservation was a corruption of the flesh—though, this idea would later be spun by entrepreneurial embalmers to convince the public that rot was the actual corruption of the flesh, and that “without any mutilation or extraction” professional embalming via proprietary chemical injection would “preserve bodies forever, or at least as long as stone”, as advertisements by Dr. Thomas Holmes regularly bragged, using the natural imagery of stone to muddy the waters of resistance while leaning hard on sentimentality (Habbenstein 1962, p.324, p.330). Likewise, unscrupulous undertakers and proto-funeral directors, angling to squeeze a bit more cash out of their grief-addled customers, spun the concept of “proper” away from the notion of ‘pure’ and into a matter of providing adequate pomp and circumstance.
            But even without religion skewing the perception, the process of embalming a corpse was new, and ‘new’ almost universally equates to ‘scary’—especially as some of the early embalming experiments did not go well. The deleterious results were regarded by the pioneers of embalming science as mere embarrassments, but to the general population, such failures reinforced a fear that was already prevalent: undertakers were a dubiously honorable lot, often considered little better than body-snatchers in that they could (and almost certainly would) profit from both ‘care-taking’ of a family’s loved one and also providing study materials to the very abusive anatomists that an undertaker was hired to protect a body from (Scandura, 1996, p.2). The very entrepreneurial spirit that endeavored to spin chemical preservation away from ‘a corruption of the flesh’ and into ‘a preservation of God’s perfection’ bred a deep mistrust of anyone peddling the specialized skills of deathcare.
            This mistrust led to a unique rise in the importance of preemptory agreements and contractualized obligations. As Paul Fitz comments, “from its inception, the under-taking trade was a solely profit-driven enterprise”, and in order to ensure those profits were made efficiently, detailed, itemized contracts were created and signed upfront; such contracts could be brought to bear in Court if disgruntled clients refused to pay—with rulings generally favoring the undertakers (Fitz, 1994, p.249; Scandura, 1996, p.5). Contracts such as these became more prominent in the realm of undertaking than in almost any other industry of the era, especially as most comparable ‘learned’ and science-based occupations, particularly those involving human bodies, such as doctors and surgeons, were considered vocations almost entirely “removed from the crass sphere of capitalist trade” (Scandura, 1996, p.5).



            These contracts, in addition to explicitly and overtly linking the undertaking trade to capitalist endeavor, removed full legal control over the deceased’s body from the hands of the Family, compounding the fear bred by any such loss of control with the pervasive distrust of unscrupulous mad scientists, and allowing the thought of ‘something evil has occurred’ to fester within grieving minds. When combined with all the other triggers present to incite a Vampire Panic, it seems to be this straw that tipped things over into a fully-fledged firestorm of fear.
            Such contracts are plausibly the root of the modern vampiric trait of requiring consent to enter a private residence, as no other link to vampires requiring permission to cross thresholds bears up under any scrutiny. It was not until contracts became as irrevocably prominent as they did in the midst of the American Vampire Panic that any mentions of permissions were attached to the vampire’s taxonomic lore. Stoker’s own notes for Dracula, apparently jotted down around 1891-4, only reference the permissions in passing, using language that echoes reports from the 1884-92 accounts of the young Miss Mercy Lena Brown’s exhumation due to her vampiric condition (Stoker, et al., 2008; Miller, 2006). These accounts come from papers that picked up the story and carried it all around the world over a number of years following the incident, but the earliest reports, like those from the Pawtuxet Valley Gleaner, the Providence Journal, and even the Boston Globe, reference the locals’ vague beliefs that Mercy-the-Vampire was being carried home or led back to the Brown Family by Mercy-the-Person’s blanket permission to enter the Family home, often with tense references to explicit permissions being granted to undertakers in both the burial and the subsequent disinterment and reinterment of her body. The influence is clear in how Stoker’s character of Lucy is almost certainly linked to Mercy, being that ‘Lucy’ is a portmanteau of ‘Mercy’ and ‘Lena’, and it seems quite apparent that her legend influenced a great deal of what became Count Dracula and, in turn, evolved into our modern vampire (Miller 2006; Doughty 2021).

            Aside from the potential of catastrophic failure in attempted embalming, and the general mistrust of undertakers as legitimate, honorable professionals, chemical embalming was purported to render a corpse ‘perfect’, with bright white teeth and supple, flawless skin, keeping a body almost more beautiful than the person had been in life—which was considered creepy on its own among communities that understood decay as a natural, normal process (Laderman, 1996). This pervading fear of the artificial perfection of the dead body is almost certainly what gave vampires their alluring beauty, and the replacement of blood with toxins is likewise what gave vampires their venom, rather than a simple thirst that might drain a victim without any chemical-like blood-replacement—and the exact placement of an embalmer’s cut at the jugular gave the vampire’s thirst a target to bite (as opposed to the blood-drinking vampires of the animal kingdom, which tend to bite victims at the wrists, knees, thighs, and ankles) (Scandura, 1996, p.9, p.15). Particularly when infused with the local stories of Skin-Walkers existing as apparently normal corpses during the day and becoming cunning predators at night, with the power of hypnotism to enthrall an unsuspecting victim4, this enchanting hyper-perfection became the stuff of nightmares.
While Stoker’s final descriptions of Dracula lean more heavily on the European imagery of a waxy, grey-skinned creature, the 1890s were the height of the embalming controversy, and this newly catalyzed fear of perfection in a dead body may not yet have felt old enough to Stoker to truly suit his perception of ‘authentic’ lore—he may very well have dismissed it as a modern projection over the lore referenced in George Stetson’s 1896 paper on “the Animistic Vampire in New England” for American Anthropologist, written in reaction to the Mercy Brown case’s rising notoriety (a paper which simultaneously disparaged the Rhode Islanders’ belief in vampires while ironically legitimizing the concept of older ones, if only as an attempt to understand the world pre-science).

Conclusions & Caveats



            In conclusion, there is a perfectly rational explanation for why a few cases of panic regarding the potential reanimation of corpses intent on murdering their fellow villagers occurred in the urbanizing small towns of Eastern Europe. Likewise, there’s a direct correlation between the rise of mass media and the concept of encyclopedic study that helped turn the Serbian word for a monster ‘back from the dead’ into the unique entity of the vampire within the monster indexes of English-speaking communities. Furthermore, that European idea of the vampire as a discrete entity and a unique creature was straightforwardly adapted and transformed by its transition to an American context. Throughout the evolution of the legend, the presence of the press, the new emphasis on science and specificity, and the categorical dominion of the English language all contributed to the sculpting of a nebulous, folkloric musing into the solidified, well-defined entity of modern pop-culture.
            However, before we patently accept that the American Vampire Panic took the almost unrecognizable lore of the European vampire and transformed it so totally into what we now think of as the definitive version of a vampire, we must address the fact that the American Vampire Panic was actually quite limited in scope and scale. The maximum stretch possible to consider as the duration of the American Vampire Panic was less than 90 years, and well before the end of it, the communal fear had transfigured into a few overly-loud fear-mongering voices and a resigned willingness to simply placate the badgering of annoying neighbors. In a very “ok, Boomer” sort of snide pantomiming, George Brown allowed the exhumation of his daughter Mercy, but he didn’t attend it—and Dr. Metcalf was an outspoken non-believer who only oversaw the procedure to ensure that a semblance of respect was maintained throughout (Doughty, 2021). The American Vampire Panic was also extremely limited in terms of its location and direct-spread influence. The area affected by the Panic was hardly enough for the incident to be named for all of America; it was relevant to barely 10% of the landmass—and even less of the population.
            If the American Vampire Panic was truly so instrumental in creating the modern vampire as the entire world knows it today, why wasn’t the panic that catalyzed such a resoundingly significant transfiguration more widespread than a few dozen rural towns in backwater New England?
            There are several factors that come into play in answering that question. The first, and potentially most prominent, is that the majority of American landmass was decidedly too warm for many of the triggering factors of a vampire panic to take root. Without the full effects of the Little Ice Age, most of the pressures that initially made people start looking for vampires simply were not present. Nor were the atmospheric conditions conducive to creating a strange delay in decay via de facto refrigeration. Additionally, America was a big open space of endless possibility—including the possibility for burying bodies with plenty of room to breathe. The over-crowding of cemeteries that led to longer waits for proper interment and therefore easy comparisons of decomposition rates simply wasn’t a factor in all but the oldest and most tightly knit of rapidly urbanizing towns.
            Beyond the fact that the South was simply warm enough to hasten putrefaction, vampires were not present in southern communities in part due to the lack of flexibility in the scaling of the Us/Them-ing structure—with a rigidly defined ‘Us’ equating to the Family alone, plus perhaps a few close friends, all white, rich, and land-owning, ever allowed within the tiny circle. It is very hard to contract a cooperative ‘Us’ circle, as Sapolsky outlines, if the circle is already so small. Furthermore, the bodies of Southern communities were most often buried on Family land, and even when embalming did enter the equation, the practice of deathcare often needed to remain within the Family’s view for the simple reason of space requirements. While over-crowded cities required bodies to be removed from the home to have adequate space for embalming, the large areas of southern plantations and adjacent-estate ‘towns’ meant that it was more efficient for embalmers to do their work where the body was already located—which presumably allowed Familial control and oversight to be maintained to a more substantial degree than in the more urbanized areas (Laderman, 2003).

            Even with accepting that the Panic was nearly guaranteed to be isolated to the rural Northeast, there is still a significant number of communities that theoretically should have been affected by the firestorm of fear: the free Black communities. African Americans in the Antebellum North were better off than those enslaved (or even Free) in the South, but they still faced tremendous hardship and resistance as they worked to achieve success in any number of ventures. The rise of the African American Funeral Home was something to be celebrated in these communities; it came about as part of a social push towards asserting their right to gather and be visibly well-off and independent (Cann, 2020). The pomp and ceremony of a grand funeral also allowed Black Families to achieve a level of classy notoriety and respect in death that was often denied to them in life, even among communities in Free States (Cann, 2020). Furthermore, as plenty of such Black-owned funeral establishments were already in the midst of breaking social dictums—such as how the first funeral home in Philadelphia was run by Henrietta Smith Bowers Duterte in a time when even white women could barely own property, let alone own and run a business—the introduction of embalming innovations was seen more as Black Excellence finally being recognized in mainstream science than as a mad experimenter’s means of supporting a body-snatching proclivity (Doughty, 2018; Cann, 2020). This proactive determination in Black communities to see funerals and deathcare innovations as a triumph, mixed with the backlash of African Americans so often being left outside the contracting Us/Them circles of white Americans, led to far more unified and cohesive cooperative group structures within those communities, rendering irrelevant the factors that would otherwise generate the sort of pseudo-speciation required to ignite a vampire panic.
            As a backdrop to all of this, the turn of the 20th Century was just around the corner, bringing with it a fully-fledged version of modern science which made it difficult for folkloric superstition to seize hold of communities. Additionally, with the modernization of medicine and the more self-aware, premeditated urbanization of communities as planned cities began to spring up, death was beginning to move out of the home for a large percentage of the population, separating the average person from an intimate understanding of death and exacerbating the wrongness felt in allowing a loved one to decay (a perspective being pushed hard by the rising Funeral Industrial Complex) (Laderman, 2003, p.1). And with the repercussions of the American Civil War, the landscape underneath the Vampire Panic had shifted dramatically. Embalming, due to the sudden need to preserve bodies long enough to send dead soldiers home to be buried, was suddenly not half as terrifying as it once was, though it maintained a decided slant of the uncanny for another decade or two (Laderman, 2003, p.6). The very real existential crisis the Civil War presented to the American Union made silly little things like vampires seem a lot less threatening, while simultaneously providing ample reason for frightened Americans to want to fixate on literally anything else besides the war and the debacle of politics around it. The wider public’s engagement with the vampire was plausibly as much a broad-scale push of displacement aggression as it was a genuine panic triggered by legitimate belief. Robert Sapolsky repeatedly explains how displacement aggression, even in the form of light communal mockery of a small contingent of superstitious backwater outsiders, can reduce stress to the point of literally preventing ulcers in subjects experiencing high anxiety.
Therefore, newspapers were encouraged by continuous public engagement to write yet more about the American Vampire Panic, which spread the tale further and further, which codified the lingering vagaries of the European myth into a truly distinct taxonomic entity complete with all the features that we know of pop-culture vampires today.
            It is because of the American Vampire Panic’s proximity to true modernity and the presence of accessible mass media that it was both quite limited and short-lived and yet so dramatically well-publicized and widely influential. The narrow limit of the Panic’s actual direct impact is plausibly the very thing that allowed the notoriety of its story to grow and spread with such exponential speed. The pop-culture vampire of our modern imagining is derived directly from the historical progression of tales of the undead from Serbia through Europe and over to America, with each step in the journey providing key elements of the mythic creature’s modern identity as a discrete and monstrous species.

 
Notes: 
  1. Stoker’s assumption of ‘Dracula’ translating to ‘evil’ or ‘devil’ from the Wallachian language may have been based on a genuine connotation of the word, due to the unfavorable association with Vlad the Impaler, but no reasonably accurate direct translation viably establishes the denotation sketched in Stoker’s notes.
  2. By the time the best efforts of medical science had been exhausted and people within communities experiencing a vampiric affliction were willing to attempt superstitious means of addressing the threat, the life-cycle of a plague would already have been nearing its natural end as a given population developed herd immunity. Therefore, the wide array of anti-vampire actions taken (covering everything from rearranging a suspected vampire’s bones to burning their remains and feeding the ashes to the afflicted) could easily have been linked to the apparent abatement of disease, confirming the presence of a vampire threat despite the inconsistencies in the methods used to resolve it. The exact methods were less relevant to the spreading of the story than the fact that implementing anti-vampire actions actually worked (Bell, 2013; Doughty, 2021).
  3. The modern view of vampires as wealthy, stylish, and ingrained with aristocratic mannerisms has evolved as vampires have been ‘domesticated’ and reframed as sexual temptation over the last several decades, allowing the association with wealth to create an ever more intricate presentation of sex-appeal in a positive feedback cycle, as shown within the publications of Anne Rice, Stephenie Meyer, & Tracy Wolff. There is plenty of scholarship on this domestication/sexualization process, as vampires shifted from monster to love-interest, but delving into the nuances of the shift requires more of a focus on the cultural/moral shifts of the 1980s than the scope of this paper is intended to cover. The presence of the vampire as monster in 2022 is still significant enough that the presence of the vampire as love-interest is conclusively a cultural re-framing of morality rather than any substantive change in the taxonomic definition.
  4. The concept of Skin-Walkers in American First Nations Lore is not to be confused with the Irish Skin-Walker of Faerie origin, though there is a potential connection between the two (and a likely conflation of the stories) which, as discussed later in this paper, cannot be fully investigated given the scale of this investigation.

 
 
References Cited: 
  • Almulhim, Abdulaziz M., and Ritesh G. Menezes. “Evaluation of Postmortem Changes.” In StatPearls. Treasure Island (FL): StatPearls Publishing, 2022. http://www.ncbi.nlm.nih.gov/books/NBK554464/.
  • Bell, Michael E. Food for the Dead: On the Trail of New England Vampires. Wesleyan University Press, 2013.
  • ———. “Vampires and Death in New England, 1784 to 1892.” Anthropology and Humanism 31, no. 2 (2006): 124–40. https://doi.org/10.1525/ahu.2006.31.2.124.
  • Burke, Peter. “Classifying Knowledge.” In A Social History of Knowledge: From Gutenberg to Diderot, pp. 81–115. Polity Press, 2000. ISBN 978-0-745-62485-3.
  • Cann, Candi K. “Black Deaths Matter Earning the Right to Live: Death and the African-American Funeral Home.” Religions 11, no. 8 (August 2020): 390. https://doi.org/10.3390/rel11080390.
  • Cohen, Jeffrey Jerome. “Monster Culture (Seven Theses).” In Monster Theory, edited by Jeffrey Jerome Cohen, NED-New edition., 3–25. Reading Culture. University of Minnesota Press, 1996. https://doi.org/10.5749/j.ctttsq4d.4.
  • Doughty, Caitlin. America’s Forgotten Vampire Panic. Ask a Mortician, 2021. https://www.youtube.com/watch?v=teFCP69trOI.
  • ———. ADIPOCERE Aka CORPSE WAX, Ask a Mortician, 2018. https://www.youtube.com/watch?v=gi0Gi0sqXwg.
  • Fritz, Paul S. “The Undertaking Trade in England: Its Origins and Early Development, 1660–1830.” Eighteenth-Century Studies 28, no. 2 (1994–95): 241–53.
  • Groom, Nick. The Vampire: A New History. Yale University Press, 2018. https://doi.org/10.2307/j.ctv6gqxp2.
  • Habenstein, Robert Wesley. The History of American Funeral Directing. Milwaukee, WI: Bulfin Printers, 1962. http://archive.org/details/historyofamerica0000habe_q7s4.
  • Hall, Robert L. “Ghosts, Water Barriers, Corn, and Sacred Enclosures in the Eastern Woodlands.” American Antiquity, vol. 41, no. 3, Society for American Archaeology, 1976, pp. 360–64, https://doi.org/10.2307/279525.
  • Knight, Jasper, and Stephan Harrison. “Mountain Glacial And Paraglacial Environments Under Global Climate Change: Lessons From The Past, Future Directions And Policy Implications.” Geografiska Annaler. Series A, Physical Geography 96, no. 3 (2014): 245–64.
  • Laderman, Gary. Rest in Peace: A Cultural History of Death and the Funeral Home in Twentieth-Century America. Taylor & Francis, 2003.
  • Laderman, Gary. The Sacred Remains: American Attitudes Toward Death, 1799–1883. New Haven: Yale University Press, 1996. http://archive.org/details/sacredremainsame0000lade.
  • Lenhardt, Corinna. “Wendigos, Eye Killers, Skinwalkers: The Myth of the American Indian Vampire and American Indian ‘Vampire’ Myths.” Text Matters: A Journal of Literature, Theory and Culture, no. 6 (2016): 195–212.
  • Miller, Elizabeth. “Back to the Basics: Re-Examining Stoker’s Sources for ‘Dracula.’” Journal of the Fantastic in the Arts 10, no. 2 (38) (1999): 187–96.
  • ———. Getting to Know the Un-Dead: Bram Stoker, Vampires and Dracula. Brill, 2006. https://doi.org/10.1163/9789401201469_004.
  • Native American Vampire Characters of Myth and Legend (American Indian Vampires and Vampire-Like Monsters). http://www.native-languages.org/native-vampires.htm. Accessed 11 May 2022.
  • “Pennsylvania Weather Records, 1644-1835.” The Pennsylvania Magazine of History and Biography 15, no. 1 (1891): 109–21.
  • Petrovska, Biljana Bauer, and Svetlana Cekovska. “Extracts from the History and Medical Properties of Garlic.” Pharmacognosy Reviews 4, no. 7 (2010): 106–10. https://doi.org/10.4103/0973-7847.65321.
  • Sapolsky, Robert M. Behave: The Biology of Humans at Our Best and Worst. Penguin, 2017.
  • Scandura, Jani. “Deadly Professions: ‘Dracula,’ Undertakers, and the Embalmed Corpse.” Victorian Studies 40, no. 1 (1996): 1–30.
  • Stetson, George R. “The Animistic Vampire in New England.” American Anthropologist 9, no. 1 (1896): 1–13.
  • Stoker, Bram, Robert Eighteen-Bisang, and Elizabeth Miller. Bram Stoker’s Notes for Dracula: A Facsimile Edition. McFarland, 2008.
  • Wilson, Karina. “Decomposing Bodies in the 1720s Gave Birth to the First Vampire Panic.” Smithsonian Magazine, October 23, 2020. https://www.smithsonianmag.com/history/decomposing-bodies-1720s-gave-birth-first-vampire-panic-180976097/.


Ukraine || Наша Україна | Global Solidarity vs Ethnocentric Pain, a Breakdown

3/7/2022


 
What's happening in Ukraine is an absolute travesty of the very worst and very best Humanity can offer. Ukraine is rising up and standing strong in the face of inconceivable terror. Russia's act of invasion is vile and horrible and nothing but pointless, arrogant, unbridled aggression perpetrated against a nation perceived to be weak. Make no mistake, however: this is Putin's war. And it has been his war for more than 2 decades.

This IS WW3. 

And it has been for a while, just playing out in slow motion. (More on that later.)
To start with, I want to address the most divisive issue:

Why Ukraine?

Why has the world chosen to rally so dramatically around Ukraine? It cannot be ignored that blatant racism does indeed play a part in why Ukraine is striking such a chord in global media when there's such horror as exists in Yemen, in Syria, in Afghanistan, in South Sudan, among the Hmong in China, etc.
According to the Global Peace Index (which I don't think has actually been fully updated for the whole actively on-going nuclear-adjacent war thing), there's a solid 20 countries in the world worse off than Ukraine for active State violence and humanitarian horror stories. But Western Media doesn't seem to care.

What's happening with Israel and Palestine is particularly grievous, in my opinion, because the State's violence is essentially just endorsed by the EU and NATO (and near-explicitly encouraged by the US). That Irish MP had it absolutely right when he ranted about how utterly hypocritical it is to feel such overwhelming empathy for Ukrainian refugees and victims of violence while just condoning the horrific cruelty of Israeli treatment of Palestinian civilians. Aljazeera has several great articles on how this conflict has exposed a startling degree of straight up vile ethnocentrism (like this one, or this one), but I don't think it's really so simple as the West is hella racist.

I think it's very hard to talk about without sounding racist because most people, even educated people and journalists, have never had to sit down and think about why Ukraine and why not Palestine. I want to make it VERY clear that racism is a BIG part of it, but I also want to ensure that my fellow Westerners don't just succumb to defeatist thoughts like 'realizing' that they themselves are more racist than they thought.
​
There IS a valid, non-racist reason that Ukraine matters to the world in a way Yemen doesn't.
Actually there's three.
Reason #1 : Shared Cultural Heritage.

This is the closest to a just-plain-racist idea, but anthropologically, it bears out as legitimately non-racist and not simply ass-covering for a racist argument.
Western Media and Western nations have been invested in propagating the Western Canon of art, language, literature, music, architecture, etc for centuries. The West is the West because of an inflated and artificially reinforced self-assessment of being from a shared and glorious historio-cultural background. Beyond the fact that we do 'look like them', there's a shared religion that's either explicitly the State religion or simply silently but institutionally endorsed by the State (i.e., all the US federal holidays being Christian). We value the art and buildings being destroyed in Ukraine because we've had 2 centuries of indoctrination that such buildings are Historical and Art and have Innate Cultural Value, in contrast to the Temples in Jordan and Turkey and Afghanistan (which we can logically conclude have value, but such hasn't been spoon-fed to us for generations).
We've been convincing ourselves that Europe is essentially composed of One People, a single large Family, with the US, Canada, and Australia as Europe's next-door neighbors, since about the 16th century. We've only been actively trying to include the rest of the world in our estimate of that Family since like the late 1940s, with the founding of the UN... And really, that wasn't an altruistic endeavor; it was a wow, WWII was bad and it was totally, obviously preventable, let's try to prevent the next one...

We've only really been One World, One People since like... idk, the mid 2010s? I distinctly remember the 2008 Olympics in Beijing, when commentators were discussing a new sort of cultural coming-together vs the old purely nationalistic pride thing, and athlete performance then was still a direct reflection on national power rather than this newfangled 'it's the Sport that matters, not the Flag' strangeness of the last few Games (historically, the Olympics have been demonstrations just shy of active warfare, soooooo this whole 'for the Sport' thing is real new).

Anyway, back on point.
Why Ukraine and why not Yemen?

Because the West has over 4 centuries of institutionalized, pervasive, aggressively reinforced empathy with Ukraine and has spent maybe 20% of that time learning to empathize with Yemen--and that time has been spent under far less aggressive efforts that do not include institutionalized doctrines of State-endorsed empathy. Acknowledging that is NOT racist, not inherently. In neurological terms, watching a bomb hit Ukraine is like watching someone kick a dog--emotionally untenable. Whereas watching a bomb go off in Yemen is like watching someone kick a baby crocodile--yes, we know, logically, that it's still animal abuse, and that technically crocodiles are more at risk of extinction than dogs, but that logical argument doesn't make any difference when the fact is that watching a dog be kicked just hurts more.

We ARE trying, as a human species, to achieve universal empathy, but we aren't anywhere near actually accomplishing that yet, and that isn't a criminal offense. Hypocritical? Yes. But worthy of such derision and venom as the Internet's dredged up? No.
  • (Though, I totally understand where the fury is coming from, particularly as those with ties to places like Yemen and Syria and Palestine watch the world unite behind a cause that is not them and when their suffering is just as grievous as Ukraine's.)

So, that explains a bit of why the common western citizen cares so much about Ukraine.

But why does it seem like governments also care so much more about Ukraine than about Palestine?
The reason for that is two-fold, one is pragmatic to the point of cringe-y and the other is only slightly less so...
Reason #2 : Resource Dependence.

We've all noticed the inflation, and the rising gas prices around the world. The governments of the world care more about Ukraine than about Yemen because Ukraine has stuff we want and is being invaded by a country that also has stuff we want. In terms of resources that Europe and the US want, Israel and Palestine and Yemen and even Afghanistan are so low-stakes it's embarrassing. As an economic consideration, the most significant thing about that whole miserable Israeli conflict is how many of a given congressional district's constituents have ties to Israel and how likely they are to donate to a campaign that condemns Israeli actions... Israeli-linked constituents are more likely to donate (or even to be able to donate) than any Palestinian-connected citizen.

While still truly abhorrent to reduce a humanitarian crisis to 'the numbers', it IS just math.

US economic concerns favor Israel. And that's only true when directly confronted on the issue or during an election cycle. If there's no election cycle in play, the US basically doesn't care at all, economically speaking.

But Russia and Ukraine? Between the raw metals, the natural gas and oil, and the heavy machinery manufactured in both Ukraine and Russia (not to mention the Vodka and such smaller scale imports), the US and Europe are in deep with Russia and have been for decades. The governments of the world care a lot more about this conflict because it's a helluva lot more genuinely impactful to them.
That's a tragic statement, but it's not racist to say that losing Russia's oil imports affects Germany FAR more than any nonsense in Yemen ever could. Nord Stream 1 and 2 were each multi-billion-dollar, long-term investments ($8.8 billion and $9.5 billion, respectively). And Germany just wrote them off forever in the last few days. Nord Stream 2 was never even turned ON; it was an investment that was supposed to begin making up for the cost of itself this year, now gone out the window as a debt that will never recover what it took to build. That's a whopping 20 BILLION dollars in taxpayer money just poofed out of play.
Yemen's direct economic impact on Germany? *shrugs* It's barely a footnote, with Germany dropping an obligatory 100-something million dollars (approximately half of 1 percent of the cost of just the Nord Stream projects), given as humanitarian relief, with zero other economic connection between the two countries at all.

So, seriously. Tragic as that distinction is, it's not racist; it's evil, psychopathic capitalism.
And the third, most objectively significant reason for Government attention:
World War Three.

If any of you reading this imagine for just one nanosecond that Putin's war is going to stop with conquering Ukraine, you might wanna sit down and grab a comfort blanket.

Putin has been trying to reestablish the old borders of the USSR (and to expand them) since before he officially came to power. Masha Gessen relates in clear, brutally straightforward terms how Putin's 'Forever Wars' aren't going to stop and haven't been effectively thwarted by any Western power yet (appeasement is essentially permission, in his mind). From his early-2000s crackdown in Chechnya and utter refusal to relinquish USSR control of the territory, to his 2014 illegal annexation of Crimea and his 2016 string of war crimes in Syria, right up to this current nonsense ploy of 'denazification' in Ukraine, Putin wants to destroy the rest of the world.
NATO is on alert because NATO is under threat. If you notice, the West is not actually helping Ukraine. Not really. It's sending some money and a couple of weapons. And it's putting a couple of scary sanctions into play... But NATO's got 8,000 troops just chillen out in Poland right now (the 5,000 usually there, plus 3k more from the US alone) while, just over 100 miles away, civilians prepare to face Russian soldiers as Lviv burns...

[Do you have any idea how effectively Putin's war here could be STOPPED if Zelenskyy had 8,000 more troops to work with?]
If Putin eventually crushes Ukraine, he will move on to somewhere else. This is WW3 already in motion, just playing out more slowly than Hitler's version of the very same game. But Putin's not going to be distracted by ethno-cleansing. He's not going to devote resources to an organized removal of undesirables. He's just going to bomb all citizens unilaterally until he controls enough of the world to declare himself god.

Personally, I think NATO needs to intervene NOW.

Because we may as well help Ukraine instead of just watching it burn when it is inevitable that Putin will not stop there. Putin has already declared that the sanctions being levied on Russia count as declarations of War. HE already considers the West active combat parties. Using NATO troops won't escalate it further than Putin already has.
Yes. Nuclear war is a risk.
But Putin has captured and is bombing THREE NUCLEAR POWER PLANTS (and counting) in Ukraine already, so we've kinda already hit that depressing mile marker.
We NEED to do more to intervene.
​
THAT is what makes this different than Yemen, than Palestine, than South Sudan...
None of the aggression in those places can spill over into worldwide nuclear war. It just can't. The aggressors simply don't have the resources to truly threaten the current world order in any realistic, significant way.
Putin's Russia, however? THAT is WW3 already in motion.

BONUS: Reason 3.5... Hearts & Minds in a Modern World...
There IS one more crucial thing, arguably the MOST crucial thing, that makes world solidarity with Ukraine more straightforward in terms of empathy-mining than any connection to Yemen or North Korea ever could be:
Volodymyr Zelenskyy.

This is a war of hearts and minds, and Zelenskyy is the hero such a story needs to truly hit home within a fully modern, Western mindset. His consistent, eloquent, coordinated use of social media and his multi-platform English-language outreach all being threaded through a unified focal point... It's Heroics 101, catered directly to Westernized story-telling tropes, and executed with a degree of genuine sincerity and heartfelt earnestness that we haven't seen in a politician... Honestly? Since like our O.G. Washington.

His role in making Ukraine a focal point of world sympathy cannot be overstated.

Zelenskyy IS Ukraine, and the fact that he humbly thanks every single citizen who's helping the fight, and that he expresses clear and genuine gratitude for every tiny shred of aid the rest of the world gives him... It just makes it that much easier for public sentiment to coalesce around him, his people, and their plight.

There's no one leader in the Palestinian struggle who has this kind of reach or charisma or this clear understanding of how to effectively utilize social media as a tool of the State to impact the citizens of the entire world. There's no single general leading a unified people in South Sudan who can be looked to as an unadulterated hero at a glance by anyone around the world. There's no President of the Resistance in North Korea who can use their personal Twitter to touch the hearts of kids in Cali with rescued-cat photos and then formally thank the Queen of Denmark in the same hour.

It's not a criticism of those other movements. It's simply praise for Zelenskyy's masterful demonstration of utilizing all the resources he has available.

So, no. It's not just abjectly racist to empathize with Ukraine and not Palestine.


There are legitimate reasons why it just hurts more, why we can invest more in the observable heroics, and why governments are paying so much more attention (which, in turn, allows the average person to witness so much more empathy-inducing horror)...

Should we care about Palestine, and Yemen, and North Korea?

Yes. Absolutely.
It's an utter travesty that we haven't gotten invested in stopping the inhuman violence occurring in those places.

But that does not detract from the reasons why Ukraine just matters so much more to us all right now. And we shouldn't make ourselves feel bad about the fact that we do care more about Ukraine than North Korea.


The people who want us to feel bad will achieve nothing but to render us ineffectual if we let them make it so. Instead, embrace the empathy and DO something.


And then hold onto that feeling, and when Ukraine's crisis has resolved, DO SOMETHING ELSE. Work to help Palestine, or Afghanistan, or Illegal Immigrants in your own town.


Selective empathy might still be a problem, but isn't it better to be selectively empathetic in a way that motivates real change than to be entirely uninvested and defeatist?


Instead of saying to yourself 'I don't care about XYZ' and beating yourself up because you think you should care, tell yourself 'It's okay that I can't currently care about XYZ' and remember that it's okay to focus on one cause at a time.

As long as you hold onto the virtuous triumph inherent in that little bit of save-the-world oomph, it's okay if the only person you successfully do anything to save is yourself. You just gotta keep on going with that save-the-world vibe inside you. Individuals CAN change the world. We just have to believe it can be done for it to work.

Shark Week 2021 !!

7/10/2021

       As you may have noticed from last year (and wow, it's hard to believe that it's already been nearly a year)... I like Sharks. And Discovery's Shark Week is one of my very favorite weeks of the year. It's certainly the most important TV event of my year. Shark Week has officially been running for 34 years now, and I've been watching for at least 20 of them. Even with the Olympics gearing up (which I will address in another post shortly), Shark Week still takes the absolute cake for me.
         Because the athletes of the Olympics are fine. People like them. They are obscenely well paid. And they get multiple chances to do their super-elite sports-ing thing.
         Meanwhile, Sharks are pretty well universally feared and culled by the millions, murdered out of raw fear and cruelly butchered for psychotic levels of capitalist profit. Between the horrors of ineffectual 'protection' measures like drum lines and the vile brutality of shark fin soup, we kill WELL over 100 million sharks per year (Smithsonian).
         Conversely, you are actually NOT very likely to even encounter a shark within a 100 sqft area of ocean, let alone be injured by one.
SERIOUSLY. Sharks aren't honestly all that dangerous:
          For every single person in the world who gets bitten by a shark, 25 get bitten by New Yorkers. Yes. You read that correctly. You are 25 times more likely to be bitten by a New Yorker than by a shark. You're about 100 times more likely to go to the hospital due to something related to paint than because of a shark attack, and you're 1,000 times more likely to go because of an attempt to install a toilet--and you're 10,000 times more likely to require emergency medical attention due to nails, bolts, and tacks (I wish I were kidding, but no, these are real stats from the museum run by the University of Florida).
           500% more people DIED from bicycle accidents in Florida alone last year than went to the hospital for shark encounters worldwide between 1990 and 2009.

        More dramatically, the current sort of 'shark prevention' measures that most beaches use, like drum lines, kill everything that wanders into them, including people. More people and 'nice' animals like whales, dolphins, and sea turtles were killed by drum lines last year in Queensland, Australia, alone than humans on the whole planet encountered a shark. Annually there are ~63 billion pounds of by-catch, much of it from passive 'shark prevention' measures (Oceana).
       Dying on a drum line is an awful death.
    Animals are stressed and struggling as they drown or starve over the course of several hours, sometimes even several days to weeks. And if a diver finds a trapped animal, they cannot rescue it without potentially facing a fine upwards of $200k.
        Researchers documenting such atrocities may even be fined simply for publicizing the horrors, with film crews being placed under gag orders with $26k penalties.

       Also, this by-catch is basically a dinner bell for sharks. It signals an easy meal of newly dead animals, drawing sharks closer to the beach than they would've come otherwise. Beaches with drum lines generally have MORE negative-outcome shark encounters than beaches without.

It's just all around awful.
Baby Humpback Whale caught, inches from the surface, in a Queensland, Australia drum line, 2018.
    There ARE legitimate shark deterrent tools, including ones that are safe, genuinely effective, way less expensive, and astronomically less dangerous to humans and by-catch animals. But politicians keep supporting shark culling plans because their constituents are misinformed and driven by pre-emptive vengeance...
        Two of my very favorite scholars working to change that are Dr. Neil Hammerschlag, over at the University of Miami and Save Our Seas, and Dr. Craig O'Connell at the O'Seas Conservation Foundation in Montauk, NY.
Shark Safe Barrier™ as deployed at La Réunion Island in 2019, designed largely by Dr. Craig O'Connell.
Those stats are remarkable largely because sharks kind of are hella dangerous:
         The average size of all sharks in the ocean is about 20 feet long. But that's including the massive Whale and Basking sharks, regularly measuring well over 40 and 30 ft, respectively (comparable to the average American school bus, which is about 35 ft long and 40k lbs), and little Lantern sharks coming in at just over 6 inches (about two Hot Wheels diecast models, set end to end).
          A better statistic is the size of mid-size, in-shore species. The ones that actually encounter humans with any sort of aggression or feeding-curiosity are typically Bulls, Tigers, Hammerheads, and Reef Sharks (Nurses, Lemons, and a few others DO encounter people, but humans don't notice 99% of those encounters, and they almost never result in so much as a sandpaper scrape).
          The sharks that interact with humans average 10–12 feet in length and weigh around 1,000 pounds. The average car is about 12 ft long and weighs about 2,000 pounds. (Smithsonian Institution)

Shark skin is used as sandpaper:
The dermal denticles (literally, 'skin teeth') are made of a durable and sharp material that can cut through stone (as seen in Egyptian records of obelisk refinement alongside sandstone), and it has been used as industrial-strength sandpaper in woodworking since the age of the Vikings and China's golden age of piracy. (Ocean Conservancy)
Shark Teeth are scalpel-sharp:
       They were so effective as blades when embedded in weapons that ancient Hawaiians and Australian Aboriginal peoples skipped straight over the developmental delays of the Bronze and Steel Ages, resisting 18th- and 19th-century imperialism partly because their stick weapons tipped with shark teeth could cut through most plate mail armor. They only lost out on dominating their homelands due to guns and disease. (Hawaii Alive)
Sharks don't have feet.
That might seem obvious, but it's significant because a shark's primary sensory organ is its nose and mouth (you know, the nose covered in industrial sandpaper and filled with teeth that pierce plate mail armor). If they see or smell something weird, they don't have a limb to poke it with. They have to bite it. (Smithsonian Institution)
Sharks don't have brakes.
Again, obvious, but also, think about it: this is a fish the size of a small car coming at you out of vague curiosity at 10–20 mph, with no brakes. So once it commits to investigating you at speed, it seriously commits, and then it can't stop if it realizes you're not what it was expecting. And if you move or thrash, you make the shark excited, because suddenly the Weird Thing in the water might be alive, and then it assumes that if you're alive you might be yummy, and if you're yummy you'll run away, so it should grab you before you flee.
Summary:
         A small car, covered in industrial sandpaper, with pointy tippy bits that slice through steel, is coming at you at 10 mph, with terrible eyesight, no brakes, and no pokey sensory appendages to nudge you with in any less-lethal way than a full-on super-chomp. Human skin can be pierced with 3 pounds of blunt pressure. An infant human has the strength to kill a full-grown man if the man offers no resistance.
       Honestly with how dangerous Sharks are just by existing, and how unbelievably fragile we pitiful humans are... Sharks are very clearly taking pains not to hurt us when they come up slow to investigate what the devil we even are.
Sharks truly do not deserve to be maligned in the way that they have been.
          And the victims of negative-outcome Shark encounters are their primary advocates. Most people who are bitten by a shark not only survive without blaming the shark for their injury, but they step up to actively protect sharks. An astounding number of people who should theoretically be the ones calling for shark culls have instead become shark ambassadors, researching them or aiding researchers. Hell, a solid half of Discovery Shark Week's camera people are former 'shark attack victims' who want people to stop culling animals 'on their behalf'.

Sharks do NOT want to eat us.
        They just DON'T. As demonstrated by the wonderful (and mildly insane) Paul de Gelder in Laws of Jaws from 2019. Sharks don't react to human blood as a predation trigger. They just don't. de Gelder releases a pint of human blood in a cloud around him while swimming with Reefies and Oceanics, and they don't even display a 'surprise' response. The blood may as well not be there... But a teaspoon of fish oil in the water? That draws in Oceanic Whitetips from over a mile away within about 5 minutes.
         I just weep at how misunderstood Sharks are, and Shark Week is one of the best ways in which global culture is working to correct the mistakes of politicians and fear-mongers. And 2021 will be an EPIC year for it.
Robert Irwin (yep, son of my personal hero Steve Irwin) kicks the week off tonight at 8pm! And the rest of the week will continue to showcase the awesomeness of sharks!
I've acquired Discovery+ for the express purpose of watching Shark Week.
     I will be eating nothing but cereal and cheap frozen pizza this month to support that decision, but I've got no problem with that. And I'll be maintaining that habit for as long as Discovery leaves the Shark Week content up, so that there's a direct link in the in/out flux of payment attached to the presence of Shark Week content.
      It's not much, but that IS still something extra I can do to signal shark-support to the might of Capitalism.
          If you would like to contribute to Shark Preservation in a non-monetary way, check out this year's top-priority endeavor: Shark ENVOY, dedicated to removing drum lines from Australian coasts (particularly in Queensland). There are a couple of petitions and email campaigns being run right now to help push the government into acknowledging that public opinion has shifted.
         I may write up a few more Shark-themed posts this week, but I will definitely be pushing myself to get more fanfic chapters out, to literally bribe what audience I have into thinking differently about sharks. I sincerely hope that all of you come to see these animals as the beautiful, critical creatures that they truly are.
🌊🦈🌊🦈🌊🦈🌊

A Court of Silver Flames - SJM | A Disgustingly Problematic display of Shameful Mental Health Handling...

6/23/2021


I wanted to like it, I truly did, but there is simply nothing here worth pretending is acceptable...

              Well, I have to say unequivocally that the very best thing about this book (and maybe even about the series as a whole) is the making of the super-sassy House of Wind. The way it happens, how it becomes a sentient BFF that reads romance novels and gossips with the girls and mimics up tiny pegasi for kicks and sleep-over whimsies... It's GREAT. Everything else about this book? Not so much.
            There WILL be spoilers in this review, and crude language (not all of it me cursing at the thing, but some actual quotes & paraphrases that address SJM's tone)... But for now, let me say, this felt like a knock-off stage production housed in a low-rent theater, created by a psychotically invested artiste of a wanna-be director and acted in by 2 people who kind of care about the director and therefore managed to drag their half-resentful friends along to auditions...
          The prose is limp and lame, her dialogue-tagging conventions are irksome, and the metaphors she uses are very galaxy-brain bits of nonsense that feel like symptoms of a dangerous drug high. Also: there's no plot. Like, at all. (I'm all for character-driven stories, but seriously, for character-driven narratives to be at all effective, the characters have to be slightly more exciting than damp cardboard.) Final non-spoilery points: there was too much sex that didn't do anything to contribute to the narrative, the mental health handling (read: dismissal) was pretty insulting, and it was way too frickin' long a lead-up for a predictable, boringly obvious ending that rendered all of the hyper-attentive foreshadowing utterly useless.
           Beyond all that, the new covers are hideous.
       I liked the House and the blossoming of female friendships. But that's about it...
            Strap in, folks, because this one's a ride. I nearly hurled a few times, definitely gagged.
          I wanted to like this one, I really did. It's just been a few years of SJM not writing up to her potential, but still showing that potential... I walked into this with low expectations, high hopes, and a careful detachment that would let me read this without my dislike of Nesta getting in the way of my enjoyment.
       Don't get me wrong, I don't hate Nesta. I've just always found her rather irksome; she was annoying but interesting enough that her purpose as a foil to her sisters worked rather well. She was never main character material because she was never meant to be. SJM actually did alright with fleshing her out enough to hold up a main character role, but Nesta just isn't a piece on the board. She's not significant enough to the world to NEED to be anything more, which means she's just there far too much of the time to be interesting as a central figure. 
           This isn't necessarily a bad thing. Any story needs its side characters. To promote a side character to MC status is difficult, but can be done, so long as you reframe the rest of the narrative along with them. SJM didn't.
Which means that all the elegant character building and world crafting she put into the first three (*cough* Frost and Starlight was a travesty that I will forever ignore *cough*) just evaporates. Because to make Nesta seem fleshed out and solid, the rest of the characters became pitifully flimsy cardboard cutouts of themselves.
         Like, even Mor and Azriel were barely there and only the slightest bit participatory. It was tragic. Az had a few good quips, but it was an SNL skit of Clinton with Benghazi compared to the world-shaking political impact of the Shadowsinger's true glory.
           That was the main thing for me. This was a character-driven story without any truly great characters... And zero plot. 700-something pages of hints and preamble, but with zero follow-through.
Again, here there be spoilers. Very direct, spoilery spoilers:

Read More

the Summer Country by Lauren Willig | Slavery is not the Sith you're looking for...

6/19/2021

Originally Published Sept 3rd, On Patreon.
          This was an excellent summer-story beach-read. I deeply enjoyed it. The pacing was not my favorite, but it was the expected vibe from a grown-up purportedly High Literary historical fiction novel.
      In a few too many ways, it was your standard 'slavery is bad' story, and it took the grown-up route of utter ignorance in pushing that line further to say 'slavery is evil and it's inconceivable that anyone ever thought otherwise unless they too were evil humans,' which is just plain presentist and pathetically essentialist.
    Yes. Slavery (particularly in the New World) was atrocious. 
    And Yes. To say otherwise today makes a person pretty much straight up evil.
      But that is not a truism. It was not always the case that to believe in slavery as a sort of 'natural order' made you a morally bankrupt person.
     That is putting a modern lens on things that modern people cannot fully comprehend. Just like a grown-up does NOT genuinely remember what it was like to be 16 or to be 10, a 21st-century person cannot even recognize that they are viewing history through a very rosy lens. The truth of the matter is that in 1812, slavery in some form or other had been a fact of how society functioned for over six thousand years, and it was honestly weirder for someone to say, 'You know what? This is probably bad...' A lot of that realization even had to come from how the New World took the established order and heaped an unbelievable list of extra abuses onto it.
          Slavery, in most nations, came to a natural end as the societal system it supported evolved. In the Americas, that societal system was artificially and intentionally maintained by a sort of aggressive racism unique to the West. In most slave-holding nations prior to the West's development, slaves were not deemed racially inferior or species-separate or anything like that. People of all races owned people of all races. It was just a money thing.
          This is not to say racism wasn't a Huge Thing (Imperialism is a very terrible thing itself, and the subjugation of others based on country of origin is a long-standing terror of our humanity), but that's a separate statement. Racism and Slavery were not intrinsically bound together outside of direct and immediate conquest. Once a new place was conquered by the Empire, the old lowest people rose up in the ranks, and if you had enough money you could buy a slave of any race, even ones theoretically higher up the ladder than you.
         That does, admittedly, vastly over simplify things. But I'm not trying to make a nuanced argument (at least no more nuanced than to make it clear I find both racism and slavery abhorrent).
         What I'm saying here is that this story should have had a few characters, both black and white, who believed in the institutions they were raised within, without that belief automatically forcing them into villain roles. Fear of change, belief in the status quo, confusion about why it mattered so much to some people... All of that should've been more prevalent in this novel.
         The fact that there wasn't a single character in 1812 Barbados that fully believed in the current Natural Order who was not ultimately painted as an utterly depraved and immoral individual was just plain creepy.
        The concept of slavery didn't survive for SIX THOUSAND YEARS because everyone always knew, deep down, that it was wrong. We aren't a species of creatures so heinous that we can look at something we know is wrong for SIX THOUSAND YEARS without doing anything about it.
      We didn't do anything because we didn't see it as wrong.
      Honestly, at the heart of it, we sorta still don't.
   The concept of free labor hasn't gone away. Unpaid internships are the modern indentured servitude. The requirement of X years of experience to allow you access to a workforce you need to be involved with in order to survive is heinous.
     Yes, interns have things like rights and safety-recourse guarantees and legal backstops, but they're pretty basic rights. Your employer isn't even required to feed you; they're just required to give you a bit of time not-working to feed yourself, buying food with money you aren't allowed to make (only 10 minutes minimum if you're working less than 5 hours, and none at all if you're working less than 4).
      Slavery, through most of its history, included ASPCA levels of animal-abuse-type protections: food, provided freely and regularly; body security and autonomy (i.e., no direct injury or sexual abuse); recognition for good service; and the ability to be a person with a name and a backstory and HEALTHCARE (instead of just an employee number with the last-line protection of company liability pay if you get grievously injured on the job you don't get paid for).
      Were there abuses? Yes.
      Was most of human history a string of abuse after abuse? No.
    People voluntarily sold themselves into slavery, or at the very least, term-indentured themselves, pretty dang regularly throughout history.
      Because sometimes, the promise of regular meals and decent healthcare was legitimately preferable to starving to death. Like right now.
     Not kidding. There are legitimately countless studies out there on how MODERN PRISON is a preferable state of being for a human than getting shunted into an unpaid internship. (Some of them are even legitimately academic and peer-reviewed, but those take longer to find for free viewing than I want to spend right now, so: here, here, here, and here will have to do, though most of these are just about the poor ethics of the Unpaid Internship concept.)
      And yet, thousands of people in America alone don't see the problem with it.
     So, likewise, thousands of people in 1812 Barbados should've been unable to see the problem with it. And as a pretty well-researched author, Willig should have known that and accounted for it by including such characters.
       It is NOT COMFORTABLE to be led through a story where slavery is just okay, I wholly admit that (and am, frankly, glad for it).
      But literature is not supposed to be comfortable. It's supposed to make you FEEL what the author thinks you should be able to SEE, because it's right in front of your face, and wrong, but not acknowledged.
      I am not saying, in any way, that a slave-believer should have been the hero. But someone should've been at least sympathetic to the Status Quo.
      Also, there was just such a fixation on the 'slavery as an evil institution' thing that the little love stories didn't get much attention, which made them feel cute but rather hollow. I loved the moments we got to see the two couples being cute, but they were so few and far between that I got lost.
       I LOVED the comments on being so unsure of your own feelings that you make the mistake of wanting your partner to be sure enough for the both of you, but that was the only message in the story besides 'slavery is bad'.
         It was good, and a great beach read. Really good. I deeply enjoyed it.
        To be perfectly frank, while reading, I couldn't put my finger on why I didn't love it. And I couldn't figure out why I didn't want to review it until I sat down and started reviewing it. (I actually read this over my little vacation in the second week of August and put off reviewing it until a few days before you guys see this post.)
       But it wasn't literature, and I'm pretty disappointed in the lack of legitimate social commentary.

[Write Life] Struggles with Superpowers | the Ups & Downs of ADD

5/30/2021

Originally Published on Patreon, Nov 28, 2020

         I have a pretty severe case of ADD, also known as Type 1 Attention Deficit Hyperactivity Disorder. And by 'pretty severe' I mean utterly savage, to the point that the dose of medication I need to function like a moderately normal human being is substantial enough that we have to monitor my heart so it doesn't explode (not kidding; my average resting heart rate is ~113 bpm, while a normal person's average is between ~66 and ~86).
         ​While ADD is dramatically different for everyone, there are some consistencies that are used as diagnosis tools. The obvious one is a lack of ability to pay attention, even when the desire to pay attention is present.
         The best way I can put it is that ADD is like the last 3 days of being out sick with a really bad cold. You're better enough to be pissed off at how exhausting it is to simply exist, and while your brain is functional enough to recognize that it's not stuck in the sick-fog sleepiness, you also have to spend five minutes staring at your TV remote to figure out what button turns the dang thing on. ADD is a wall of that feeling that doesn't ever really go away. With meds it gets better, like staring at a room through a window instead of through an opaque closed door, but it never goes away.
         And yet, there is a converse to that: hyper-fixation. Not everyone with ADD has this, but a lot of us Type 1s do; we hyper-fixate instead of exhibiting hyperactivity.
         Let me tell you, it is a goddamn SUPER POWER. Sorta. A lot of people with ADD hate to have this piece valorized because it's actually far more dangerous to our direct health than any other aspect of our condition. That's a very fair assessment. I have hyper-fixated to the point of forgetting to eat for 48 hours. I have forgotten to drink water for almost 24 hours. I have forgotten to breathe for about 5 minutes and only managed to avoid passing out because I was already lying on the floor. I chew through my fingernails and rip out my hair and dig furrows into my skin that bleed for ten minutes before I notice them--because the motion is somehow helpful to my focus-clarity, and my fixation means the part of my brain that feels pain is just turned off for the moment.
         ​Hyper-fixation is legitimately dangerous.
         ​But it's also really really cool.
         I have a partially eidetic memory, which means if I have heard a song, I know it. If I've read a book, I remember almost all of it. If I have handled an object at any point in my life and no one else has touched it since, I can navigate to the geographic coordinates of it, blindfolded. I can research like a BAMF, and I can instant-recall all of it on the spot and synthesize it into a legitimate argument at the drop of a hat. I can give college-level dissertation oral defenses on topics I've only skimmed the readings for, in after-hours presentations that I only remembered to attend because I left my water bottle behind after lecture... My brain is frickin' MAGIC.
         ​At least on good days.
         Honestly, on good days, the fact that this will probably kill me very young is a trade I count as entirely worth it. If this is my deal with a devil, man, I'm a-okay with the payoff.
         On bad days, it's extremely difficult to remember why walking backward in the rain across a 6-lane highway is a rather poor life decision... Seriously, most of what keeps me grounded on bad days is remembering that I have written records showing that, on good days, I think it's totally worth the bad days.
         ​ADD, like everything else in the Universe, is cyclical. ADD is unique in that it has 3 separate layers of things operating on an influential cycle. There are some Psych Journal research articles on the topic, but I don't like how most of them describe the issue, so I'm gonna break it down for you myself:

- Engagement vs Disengagement
Am I able to legitimately engage with whatever I'm working on? Or am I looking at it through a plexiglass wall? This is the piece that is most easily influenced by medication. On bad days, the meds mean I'm still trying to play Operation while wearing wool mittens, but on good days, the meds let me manipulate my mental puzzle pieces on an atomic level to make them fit together perfectly. This is both the shortest cycle and the longest cycle. This is because there's the daily circuit (the pre-caffeine, pre-meds fog of nothing, then the productive bliss of extended-release capsules, and then the nightly 'oooooh, yeah, everything helpful has worn off') and a longer cycle where I'll just hit a mental block that keeps me from engaging with anything for a random month or two.

- Positive Feedback vs Negative Feedback
Is what I'm looking at interesting enough to make me ask another question, and is the answer to that question enough to spawn yet more? 'Positive' here doesn't necessarily mean 'good'; it's used in the scientific sense of 'increasing acceleration.' Anything that doesn't increase the interest/fixation level contributes to it slowing down. This cycle is the most obvious to me while I'm moving through these processes, and it's the one I can affect. The moment I feel the acceleration of interest slowing down, I can stop, pull back, and take a break. Pushing against negative feedback does nothing but hasten burnout, which is AWFUL. Avoiding burnout is why I have queues set up and why working 8.5 hours in a boring office job would probably kill me.

- Present vs Dissociative vs Concurrent Mindset
A Present mindset has me fully aware of myself, my environment, and the intricacies of my current task. A Dissociative one has me exclusively aware of my current project and all its permutating aspects; my body is not a real object, and the outside world could burn for all I care. Both of those are productive, though only one is at all close to being 'healthy'. A Concurrent mindset has both awareness of what I should be doing (eating, homework, etc.) and what I want to be doing (current project) occurring within my brain at the same time, without either of them really clicking into place properly. This is the mindset that has me staring at walls and into water bottles and just off into the middle distance. It's horrible because I can't focus on the thing I want to focus on, and I feel guilty for wanting to focus on a thing that is not the thing I should be focusing on, and we spiral into angst from there.

         ​And then all of that is further complicated by Focus Fatigue. Also known as Hyper-Exhaustion or Burnout, this state of being happens when all the lowest points of the other three cycles line up with each other. It comes at random, though I can usually feel it creeping in a few weeks before it really hits, and it stays for a random duration. The only effect I can have on it is to not prolong it by trying to just push through. Accepting that I'm in a down-beat and just riding it out without attempting to DO anything is usually the best bet I have to hasten the recovery process.
         ​This is what I've been dealing with from the end of October and all through November.
         ​I've just got nothin' left to call on. I've been playing lots of Solitaire and a hidden-object game called SeekersNotes. I've just been letting my queue run itself out while hoping that I get my mojo back before it hits the end of its schedule.
         It's possible that things will get back on track soon, but it's hard to tell. The best way I can characterize recovery is to compare it to alcohol consumption. When you're just sitting at the bar, you don't feel much different between your first shot and your fourth; a little buzz turns into a little buzzier, but nothing major. But then you try to get up for some reason and you realize that the room is spinning and gravity is gone...
         ​By the time you realize your drunk, your very drunk.
         ​Likewise, by the time I realize I've got my focus back, I've been doing focus-related things for a while already and doing them successfully.
         ​I'm hopeful that this Burnout will be over soon (being able to succeed in writing this is a very good sign), but I can't guess how long it'll take me to go from tipsy with a drip of focus to full on smashed with Hyper-Focus... 
It could be a matter of days, or it could take a few more weeks. We'll just have to see.
         ​Wish me luck!


(Fortunately, while I'm waiting, I've got plenty of little objects left to find in Seekers).

​May 31, 2021 Edit:
Since this post was originally published, I HAVE since had several periods of high productivity, one of which is currently still on-going!
0 Comments

The Last of Us: Part II - Triggering, Cyclical, and Just plain Pathetic...

5/23/2021

0 Comments

 

I enjoyed the first one, wanted to like this follow-up... but just could NOT.

     My Associated Human and I have been playing through this together the last couple of weeks, and it has overall been a deeply enjoyable experience, though two big sticking points kind of ruined it for me. This review will have specific spoilers, so there's a short review here and then a detailed one below the 'read more' jump.
      Long story short, I DO NOT recommend this one. At least not buying it. Find a friend and borrow it, but do NOT give Naughty Dog more money for this vile piece of triggering shit. 
    It's a huge shame that I hate it like I do. The gameplay was fun, the graphics were astoundingly gorgeous, and most of the story was deeply enjoyable and extremely well-written... I fricken LOVED most of it. But I fucking HATED the ending. 
       Firstly, I just disliked it. Secondly, it was cheap-ass storytelling.
       Thirdly, it was an example of a grown ass man foisting his own damn trauma onto a barely-hanging-on teenage girl (whom his dead brother loved like a daughter) and explicitly asking her to do something he knows will be traumatizing because he is too physically disabled to do it himself.
       And finally, that ending was a display of grossly misogynistic violence that the player is forced to carry out actively and personally, not watch as a cut scene.
      All of which combined into an absolute, visceral hatred of the ending for me.

      Reminder, from here on out, there will be explicit spoilers.



Baby Boy - the Cuter than Cute Philosophy of a KMV Explained.

6/5/2015

0 Comments

 

'Aggressive' and 'Adorable' are NOT Mutually EXCLUSIVE: 

           High4 is not a band I've been terribly enamored with. I loved their debut, but most of what they've done since just hasn't struck my fancy. This one is extremely different; I like it so much that it's even jumped the queue of MVs I should be reviewing!
           From the moment I first saw it, my only reaction has been "Yep. That's it. We're done here. The most adorable song humanly possible has been produced. We can all go home, now". And that feeling hasn't dissipated at all in the two days I've been watching it while pretending to care about other things. And now I just HAVE to explain the full extent of its intricate weave of imagery, because there's much more than meets the eye to this release and it would be a shame for anyone to miss even the tiniest detail. What seems at first like a mess of contradictions is actually an array of perfectly inter-locking pieces that create a dynamic and accurate depiction of an enchanting reality.
           Kpop has done things where 'cute' and 'gangsta' overlap, and there have been other examples of aegyo mixing with straight-up awesome, but nothing else I've ever seen does it quite like this, or nearly this well. Most other versions of an attitude resembling this one use aegyo to straightforwardly mix in a dose of adorable, but this release takes that a leap forward by mostly avoiding straight aegyo and instead using a more sophisticated weave of traditional imagery to convey the idea that the object of the narrator's affection is a singular point of exception in his world. Every single detail in the visuals plays off the notions in the lyrics, leading to an achingly cute final product that takes its message more seriously than I would have ever guessed just by glancing at the colorful screencaps.
           It also has something very important in common with Beyonce (but we'll get to that later).


Just Tell Me - The (Surprisingly) Sweet Philosophy of a KMV Explained.

5/14/2015

3 Comments

 

True Love Is Hard to Define, But It's Pretty Easy to See...

        With their lovely So Too Very Much release in February, and their March activities in Japan, I wasn't expecting anything new from these guys for a while yet. I've been pleasantly surprised by their Korean return, and the release in question is itself absolutely fantastic. And when I say fantastic, I mean FANTASTIC. This is easily going to stand up as one of the very best releases of the entire year; it's going into my history books as one of the most fabulous releases EVER.
          From the set aesthetics and the basic styling, it looks like Just Tell Me can't possibly be anything too terribly spectacular. MYNAME isn't really a band with a track record of deeply meaningful or profoundly impactful releases, so combined with a concept that looks to be more of the same old hiphop-grunge-but-still-clubbing-chic that's been flooding the market lately, there was really no reason to suspect that this release would be incredible. Honestly, there were a dozen pressing reasons to think this release would be anything but awesome, including the fact that MYNAME has already had a comeback this year, plus activities in Japan, so theoretically they haven't had much time to spend on this; not to mention that the concept seems blandly unoriginal and the choreo from the teasers has some moves that stand out as painfully obvious instances of recycling.
          HOWEVER, to anyone who believed the teasers, or who concluded on a cursory examination that it really is just another sex-focused, grungy hiphop club thing: YOU GOT PLAYED. The whole thing is one big joke on us, and the best part is that our expectations were used against us: we were prepped for something mediocre and handed something FABULOUS. Unfortunately, there are still a ton of people taking the release at face value, and as it's easily the best thing MYNAME has ever released, I will not suffer it to be maligned by people who won't give it more than a cursory look and let its true elegance, sweetness, and complexity unfold.


Xiah Junsu's "Flower" ... Crazy Mess or Cultural Critique?

3/6/2015

2 Comments

 
THE PRODUCTION COST OF "FLOWER" IS THROUGH THE ROOF, BUT DID JUNSU GET ENOUGH BANG FOR HIS BUCK? THE MESSAGE IN IT IS A HAZY, SILENT SCREAM TO THE VOID, MADE WITH FLASHY SYMBOLS AND GLITZY VISUALS, BUT WHAT IS IT REALLY SAYING?
          There is a HELL of a lot going on in this MV, and some of it is really not all that slick. There's a ton of good stuff, but there's a lot that I'm disappointed in as well, and I keep waffling back and forth. Junsu's voice is unquestionably one of the best in Kpop, but its usage here doesn't make that obvious... And while there's a lot of gorgeous symbolism in the visuals, the way they're strung together without exposition leaves me less than impressed... Arguably, it's a flashy but poignant critique of the whole nature beneath Idol culture, one I personally find unduly antagonistic. 

           But in order to critique it fairly, I think I should start by explaining what on earth is happening in it:


    Categories

    All
    Business
    Capitalism
    Ecology & Conservation
    Gender Issues (Fem/Mas/Misc)
    Historical Accuracy
    History Is NOT Your Buddy
    International Media
    Internships
    Mental Health
    Musicianship
    Mv Explanation
    Mythology & Folklore
    Narrative
    Personal Issues
    Philosophy
    Pop-Culture
    Response
    Rookie Groups
    Seoulbeats
    Sexuality Issues
    Shark Week!
    Slavery
    Socio/Geo Politics
    This Is WAR.
    Trainee Systems
    Transmedia
    Triggering & Problematic
    Vampires
    Video Games
    What Makes It Literature?
    Write Life
    Writing Stage: Burnout / Reset
    Year End Assessments

    The name, layout, opinions, and analysis displayed on the Music Matters® website belong to A-style® and are the intellectual property of Alexandra Swords.

    Archives

    October 2022
    March 2022
    July 2021
    June 2021
    May 2021
    June 2015
    May 2015
    March 2015
    February 2015
    January 2015
    February 2014
    January 2014
    December 2013
    September 2013
    August 2013
    July 2013
    June 2012
    March 2012

