The Ubiquity Of Dark Management Is Partly Sustained By Prolonged Bad Uses Of Technologies

Dark Management
May 15, 2022

The core insight of all this is the following:

Good uses of technologies enhance and support the healthy uses of human abilities and inner resources while integrating and simplifying healthy human needs and wants; bad uses of technologies replace and weaken the healthy uses of human abilities and inner resources while complicating and fragmenting healthy human needs and wants.

As an example, according to this insight, using an abacus is a good use of technology as long as it’s not used for very complex calculations, because its users are still the ones actually performing the calculations, merely augmented by the tool; whereas using an electronic calculator for anything other than very complex calculations is a bad use of technology, because it does all the calculations for its users when they should be able to do them themselves, meaning their ability to perform simple calculations is being eroded by the tool.

Another example following this insight is automated automobile production: even though it replaces rather than augments humans in the concrete production steps involved, it’s still a good use of technology, because many of those steps would be far too dangerous and error-prone for humans to perform themselves, meaning that performing them manually wouldn’t be a healthy use of human abilities to begin with.

As mankind relies more and more on the bad uses of technologies and less and less on the good ones, human abilities and inner resources become ever more puny and depleted, while human needs and wants become ever more complex and inconsistent. This leaves mankind even less able to rely on the good uses of technologies, forcing it to rely still more on the bad ones, so the whole process becomes a vicious loop that grows harder and harder to break.

Implications

One of the most direct and severe consequences is that human activities and organizations will have to become more and more complicated and fragile as well, causing the number of single points of failure and the total complexity of mankind as a whole to skyrocket, eventually to the point where such fragility and complexity will be far beyond what mankind can understand, let alone handle well. This means that someone has to (probably unknowingly) handle far more complexity and far more external failures than that someone should ever have had to handle.

Normally, those most able to handle such extra complexities and failures should be the ones handling them. But because they’re also the ones most able to push those complexities and failures onto the less able, and it’s only human nature to choose what seems easier and simpler for oneself right now, the extra complexities and external failures will eventually be pushed down to the least able, leaving them on the verge of breakdown.

However, if those least able ones do break, then those who were second least able before the break become the least able after it, so the latter have no choice but to do whatever they can to stop the former from breaking down, or they’ll soon face the same fate.

As the same will eventually apply to the even more able ones if they do nothing, all this means that the least able ones have to absorb all the extra complexities from everyone else on one hand, while everyone else does whatever they can to keep them from breaking down on the other. This kind of perpetual-motion fantasy is the source of the paradox in which the rest rely on the least able (as perpetual motion machines) yet treat them as if they were enemies of society (for failing to achieve the impossible that the others rely on).

That’s one of the most dominant motivations for a human in this twisted system to ascend the food chain: the further they are from the very bottom, the lower their risk of having to face the real dangers of breakdown, even when it means someone else will have to break instead.

Using Dark Management

But the biggest problem is this: as long as humans rely more and more on the bad uses of technologies, then no matter what mankind does, quite a large portion of the population has to sit at the very bottom of the food chain, and the entire civilization can outright collapse if they refuse to handle those extra complexities and external failures. The society therefore needs a seemingly sustainable mechanism to ensure that those people will do what the rest want them to do, and that’s where dark management comes in.

More specifically, the dark managers need to accomplish these goals:

  1. Rationalize the necessity of this absurd food chain
  2. Justify the ridiculous claim that those at the very bottom of the food chain are themselves the cause of being there
  3. Emphasize that someone has to handle those extra complexities and external failures or the society won’t even run anymore
  4. Convince the others that it’s better for those at the very bottom of the food chain to handle those extra complexities and external failures
  5. Shift all blame for the extra complexities and external failures that aren’t being properly handled from the over-reliance on bad uses of technologies onto those at the very bottom of the food chain

When all these goals are accomplished, mankind will on one hand think that it’s the fault of those at the very bottom of the food chain for failing to realize the perpetual-motion fantasy, but on the other hand fail to realize that it’s just a fantasy after all, regardless of how much they rely on it.

As for how to accomplish these goals, societies already have well-established systems to do that, even though this pandemic has undone much of their work by clearly exposing just how brittle our civilization really is:

  1. Hide the fact that we have far more bad uses of technologies than good ones, by pretending that we’re evolving in how we use technologies while we’re in fact deteriorating in how healthily we use our abilities and inner resources (which become more and more dormant and limited) to fulfill our healthy needs and wants (which become more and more colossal and complex)
  2. Focus on the fact that everyone has a chance (no matter how slim) to ascend the food chain while evading the fact that someone has to be at the very bottom no matter what
  3. Elude the truth that this fact is actually caused by mankind’s over-reliance on excessive bad uses of technologies for far too long
  4. Exploit the fact that this motivates everyone to ascend the food chain and keeps the most able from refraining to be on top, while never mentioning that those most able to handle the extra complexities and external failures should do that job themselves instead of forcing the less able to do it for them
  5. Persuade mankind that perpetual motion is only a fantasy in physics but a practical reality in human psychology, by strengthening unfalsifiable beliefs rather than listing concrete and sound scientific evidence

Counters

So, for those who hate dark management and dark managers so much, what should be done to end this madness? While you’re likely to have an extremely strong urge to use loads of undeniable facts and solid reasons to demand that those dark managers stop using dark management and take responsibility for trying to sustain such an insane food chain, this just won’t work at all.

That’s because those dark managers are among the few who benefit from such a crazy system, and as long as they’re sure the system will only shatter long after they’re dead, they’ll have absolutely no incentive to change anything. Since they’re also the ones currently holding the most power, any direct approach will be an incredibly costly and tough uphill battle that will end very badly.

Instead, the system should be weakened by its victims decreasing their bad uses of technologies while increasing their good uses of technologies, because only then can they weaken the very foundation of the system’s reasons for existing, and thus the apparent necessity of dark managers employing dark management, all without having to directly confront a far greater power.

You may think this is very unfair to those victims, because those dark managers are the ones who should be held accountable, and I felt the same several years ago. But now I believe that indirect approaches are the only ones that can actually lead somewhere meaningful in such cases, and that the best revenge on those bastards is to become independent enough that they can no longer directly control their victims.

More specifically, the aforementioned indirect approach can be broken down into the following key points:

  1. Realize just how much we’re using technologies badly and how much wisely
  2. Realize what healthy needs and wants we’re trying to fulfill when we’re using technologies badly
  3. Try to use technologies wisely instead of badly while fulfilling those healthy needs and wants, while remaining aware of every bad use that can’t yet be replaced by a wise one
  4. Realize which healthy and unhealthy needs and wants we’re trying to fulfill
  5. Try to convert those unhealthy needs and wants into healthy counterparts while remaining aware of every failed conversion

For instance, if you’re not a native English speaker and you’re learning English effectively through playing video games in English (even though that’s not your primary goal), then as long as you’re doing so in moderation, it’s still a good use of technology, because learning a useful language is a healthy need and using language is a healthy use of a human ability.

Whereas, if you’re playing video games all the time merely as a form of addictive drug to escape various forms of mental discomfort, without learning anything useful, then it’s very likely a bad use of technology, because the underlying needs could have been met through mindfulness exercises like meditation; such mental escape is an unhealthy want, and the addiction weakens self-awareness, which is a healthy use of human abilities.

Another example: when you feel hot, do you just turn on the air conditioner in your home without a second thought? Or is it really so hot that you need it to sleep well tonight? Are you aware that needlessly running it for a needlessly long time will reduce your body’s ability to sense the temperature of the environment and adapt to a wider range of temperatures? So even when you do need to turn it on, you should at least really know what you’re doing, otherwise the apparent convenience of using technology without a second thought can actually reduce your consciousness by making your actions more and more automatic over time.

Expectations

Of course, it’s impossible to never ever act on unhealthy needs and wants, and it’s impossible to never ever use technologies badly; such perfectionistic obsessions are themselves unhealthy in the extreme. But the point remains that, as one gradually acts less and less on unhealthy needs and wants (at least not without awareness), and gradually uses technologies more and more wisely (growing more and more aware of the bad uses), that person becomes less and less entrapped in the absurd food chain, while growing more and more aware of the dark managers behind the dark management, and thus harder and harder to keep deceived.

There’s no need to fear that those dark managers will directly control our abilities and inner resources, let alone our needs and wants, as long as we know what they’re trying to do and don’t let them do it. If they could do that, they’d have already done it to those at the very bottom of the food chain to actually realize the perpetual-motion fantasy, thus perfectly covering all the problems in this absurd food chain and removing the need for any further dark management on anyone else.

Needless to say, it’s impossible for an isolated person to change anything in this twisted system, but as more and more people within it slowly make the aforementioned changes, mankind’s needs and wants will become more integrated and simpler, while the abilities and inner resources for fulfilling those healthy needs and wants become more abundant and stronger. The total complexity of societies can then decrease instead of increase, and the reduction of unnecessary reliance on bad uses of technologies will make the civilization less fragile.

If this continues, there may eventually come a time when the total complexity is small enough for humans to understand it in its totality again and thus handle it well once more, and the civilization may become robust enough to have few enough single points of failure (even though it’s impossible to eliminate them all). That would be the point where the absurd food chain starts to slowly crumble on its own, because the need to push away those extra complexities and external failures, and thus the need to ascend at the cost of keeping someone else at the very bottom, will gradually shrink.

Of course, this itself won’t be even close to a utopia, because bad uses of technologies are far from the only factor sustaining dark management, and those dark managers will still try to sustain the insane system even when it’s no longer needed. But it would still be a significant step forward, since more and more victims would realize what’s really going on, and it would become harder and harder for those dark managers to find excuses or lies to sustain their dark management.

When those dark managers have to find new excuses and lies to sustain their dark management, it would actually be a chance to discover the underlying genuine global problems mankind must face. If similar indirect yet workable approaches can be found and implemented, those dark managers will have fewer and fewer excuses and lies available, eventually to the point where they fail to sustain their dark management, or are at least forced to evolve into the least harmful kind of dark managers, which would be much, much less harmful to mankind than what they currently are.

Bad Uses Of Technologies

As for why over-reliance on excessive bad uses of technologies causes the total complexity of mankind to skyrocket and civilizations to be much more fragile, with many more single points of failure, just think of how complicated current human needs and wants are compared to those just before the first industrial revolution, and how brittle the modern society is (mainly due to globalization driven by technological advances) compared to that era in a global sense.

If civilization were treated as a software system, the codebase of the modern society would be much, much bigger than that of the Renaissance era, and the former would have far more advanced abstractions and sophisticated constructs than the latter. So the number of software engineers (powerful figures who can directly shape the world) in the former would also have to be much larger, meaning there will inevitably be more power struggles and more complicated teamwork communication, followed by higher communication costs.

Since it’s very hard for nontrivial abstractions not to leak in nontrivial ways, and there are so many abstraction layers, many such leaks occurring at nearly the same time can effectively cause a perfect storm that might even crash the system and force a reset, especially when many of the software engineers involved neither know nor care about those leaks, because it’s unlikely they’ll suffer from the leaks themselves.

For instance, fossil fuel power stations once seemed like flawless advanced abstractions with very sophisticated constructs back when they were just invented, because they made electricity, an extremely convenient form of energy, available to most people around the world, all without anyone knowing that they would eventually pollute the world so much as to become one of the biggest factors in global warming, a much bigger problem (the risk of human extinction) than not having a convenient form of energy. Such abstractions are so leaky that they create more problems than they solve.

Unfortunately, some newer technologies are invented just to solve the problems caused by older technologies, without anyone knowing that those newer technologies will create even greater problems many decades later. It’s like novice programmers using even more problematic abstractions to cover existing leaky abstractions that are actually less problematic, all without trying to figure out where the leaks are coming from, and more importantly, the root causes behind this problematic programming approach.
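The “covering one leaky abstraction with an even leakier one” pattern can be sketched in a few lines of code. Everything below is a hypothetical toy example invented for illustration (the class names and the cache scenario are assumptions, not any real library): a cache that leaks staleness, “fixed” by a wrapper that periodically flushes it instead of fixing the root cause, which is the missing invalidation.

```python
class LeakyCache:
    """A cache abstraction that leaks: it silently returns stale
    values once the underlying data store changes."""
    def __init__(self, source):
        self.source = source   # underlying key -> value store
        self._cache = {}

    def get(self, key):
        # Leak: entries are never invalidated, so later changes
        # in self.source are invisible to callers.
        if key not in self._cache:
            self._cache[key] = self.source[key]
        return self._cache[key]


class PeriodicFlushCache(LeakyCache):
    """The novice 'fix': wrap the leaky cache in another layer that
    blindly flushes everything every N calls. This hides the staleness
    some of the time, but adds new problems (stale reads between
    flushes, bursts of load on the source) without fixing the root
    cause, which is proper invalidation when the source changes."""
    def __init__(self, source, flush_every=3):
        super().__init__(source)
        self._calls = 0
        self.flush_every = flush_every

    def get(self, key):
        self._calls += 1
        if self._calls % self.flush_every == 0:
            self._cache.clear()   # crude, global flush
        return super().get(key)


source = {"temp": 20}
cache = PeriodicFlushCache(source)
print(cache.get("temp"))   # 20
source["temp"] = 25        # the underlying data changes...
print(cache.get("temp"))   # 20: the leak is merely hidden, still stale
print(cache.get("temp"))   # 25: only the periodic flush surfaces the change
```

The second layer makes the symptom intermittent rather than gone, which is exactly why stacking such layers raises total complexity instead of reducing it.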

Better Approach

So, if back when fossil fuel power stations were just invented, the stakeholders had tried to thoroughly assess the impact of the invention, taking a more conservative approach of first launching some pilot schemes and then gradually bringing it to more and more places, instead of quickly pushing worldwide adoption, the problems caused by the invention would have been noticed much earlier, and they would have been much, much easier to mitigate than what we now have to face.

Of course, what I’ve just said would be far too ideal to be pragmatic, not only because mankind is well known for its unconscious instinct to needlessly sacrifice future generations to unsustainably benefit itself right now, but also because many such inventions involve hefty up-front investments with unbelievably high risks, meaning the stakeholders involved had to get their returns on investment as quickly as possible, or almost no one (sometimes not even the government itself) would be interested in investing in inventions anymore, which would cause the even greater problem of severe technological stagnation.

But this only proves the dangers of unknowingly bad uses of technologies. Technologies should only be invented to solve problems in satisfying healthy human needs and wants, or in enhancing and supporting healthy uses of human abilities and inner resources, all without creating even greater such problems later on if possible; the more mankind uses technologies badly, the more likely this rule of thumb is to be unnecessarily violated, making the vicious loop of using ever more problematic technologies to cover the problems of less problematic ones even harder to break.

Yet this doesn’t mean our ancestors are to blame, because what I’ve said demands that mankind have a solid understanding of how healthy human needs and wants work in detail, and it’s clear that it would have been tough for them to reach that level of understanding back then. So instead of scolding them for sacrificing us just to serve themselves, it would be much more constructive for us to try to change course now, by slowly converting more and more of our bad uses of technologies into good ones, in the hope that we’re still not too late.

Otherwise, if we let those dark managers utilize our bad uses of technologies to sustain their dark management, mankind might one day destroy itself by obliterating the very ecosystem it has relied on for so long, just as cancers kill themselves by killing their host. Those dark managers won’t care about this a bit, since the current ones will all have died long before it happens, and it’s only natural for us too not to sacrifice our instant comfort for mankind, since on such issues the best interests of our future generations are completely opposite to ours.
