Let’s say there are two people talking: X and Y. X has three kinds of knowledge: Basic, Pertinent, Abstracted. Y has only two kinds of knowledge: Basic, Pertinent.
S1: If Y argues that X’s knowledge of the abstracted does not qualify him to be more knowledgeable than Y on qualitative terms because it is not useful knowledge,
S2: If X argues that Y’s lack of knowledge of the abstracted qualifies X to be regarded as qualitatively more knowledgeable than Y, and refutes Y by claiming Y cannot judge the usefulness of knowledge of the abstracted because Y does not have it,
A1: If Y counter-claims that X’s refutation of Y is simply because X possesses some other kind of knowledge and hopes that it will be useful,
C: Then Y’s independence of the knowledge of the abstracted and X’s dependence on the knowledge of the abstracted are either
C2: Meaningful; if meaningful, then cannot be established in terms consistent with the other’s perception because there will always be reasonable circumstances in which the claimant can be tautological and the defendant, contradictory (i.e., incompleteness).
Effectively, this conclusion signifies the incapacity of anybody, through any logical means, to establish that there exists an absolute perception that everyone adheres to concerning an arbitrary object.
Here, the assumptions are
V1: That the object being perceived may be engaged with the human senses through the knowledge of the object’s function and purpose
V2: That the act of perceiving is contingent upon pre-existing knowledge and isn’t therefore a “learning experience”
V3: That there is no way to demonstrate the usefulness of any knowledge independent of the perception modality (i.e., if there is no way to establish literally and meaningfully the significance of some knowledge – in the form of grammatically secured sentences or actions – then that knowledge pertains to a logically inconsistent hypothesis. E.g., sentiments.)
Q: Are truths simply objective reasons whose truth-values may or may not be verifiable?
This question seems to possess a native paradox, but that simply arises from a logical error in the semantics: we can’t address unverifiable statements as “truths”. Instead, they are logically contingent statements.
Even so: as Wittgenstein says in the preface to his Tractatus Logico-Philosophicus, “in order to draw a limit to thinking, we should have to be able to think both sides of this limit.” Similarly, in order to establish the objectivity of a statement, its subjectivity must be conclusively denied and its independence of subjective considerations verified.
The attainment of these conditions can be explored through A.J. Ayer’s verification principle, the tenets of which were established in his 1936 opus, Language, Truth and Logic. However, it must be noted that Ayer denied, reasonably, that unempirical hypotheses may be formed on the basis of empirical engagements with reality. By extension, there exists an inherent denial of any transcendent reality, which in turn eliminates the possibility of any objective truths.
At the same time, however, there exist objective literal truths, which are closer to being tautologies than truths themselves simply because they are a repetition of meaning whose propositional variables are actually fixed and whose truth-value is also fixed.
During an argument, negation and affirmation are used to establish the value of a propositional formula. The formula could be any statement whose propositional variables can assume different values. For instance, the statement S has an unverified propositional value.
S: Smoking is disagreeable; drinking is agreeable.
To some, S will make sense while, to some others, S won’t make any sense at all. In order to establish the truth-value of S, we explore the existence of a logical system that is consistent with the value of S being both true and false. This is unlikely because it contradicts our logical framework itself. Then, the next step is to understand the structure of a logical system in which S is either true or false and such that the value of one propositional variable impacts the value of the second propositional variable directly.
In other words, we make S a formula with two variables, X and Y, and find out how the values of X and Y are consistent/inconsistent with each other while they exist in the framework of the same set of logical principles.
S: X • Y
If we now hypothesize that X cannot retain its value while Y’s value is held fixed, then we pursue the negation of this hypothesis in order to establish that S is true. If we affirm the hypothesis, then we will prove that S is false. In the course of either of these arguments, we repeatedly hypothesize and evaluate the truth-value of each hypothesis, and proceed until we arrive at one that corroborates or denies the parent hypothesis and so renders the statement either true or false.
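This procedure amounts to scanning the space of assignments to X and Y. A minimal sketch in Python makes that space explicit (the function and variable names here are my own, purely illustrative):

```python
from itertools import product

def truth_table(formula, variables):
    """Evaluate a propositional formula under every assignment of its variables."""
    rows = []
    for values in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        rows.append((assignment, formula(assignment)))
    return rows

# S: X . Y - the conjunction discussed above
S = lambda a: a["X"] and a["Y"]

for assignment, value in truth_table(S, ["X", "Y"]):
    print(assignment, "->", value)

# No single assignment makes S both true and false, which is the
# consistency requirement ruled upon in the first step of the argument.
```

The hypothesize-then-negate-or-affirm loop described above is, in effect, a walk through these rows: each hypothesis picks out a subset of assignments, and the argument ends when one subset either exhausts or excludes the cases that make S true.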
However, if a rhetorical tautology cannot be assumed to constitute a reason (because it is a repetition of meaning), and if Wittgenstein’s proposition that tautologies are statements deducible logically and therefore meaningless is true, then the tenets of propositional logic are neither tautologies nor analytic truths.
Moreover, no literal significance can be assigned to logically valid statements, according to Ayer! In this context, the existence of any literal significance of logically valid statements depends not on their analytic proposition but on their synthetic proposition – as Ayer affirms. (Here, per George Berkeley: “esse est percipi”!)
Curiosity can be devastating on the pocket. Curiosity without complete awareness has the likelihood of turning fatal.
At first, for example, there was nothing. Then, there was a book called The Feynman Lectures on Physics (Vol. 3) (Rs. 214) in class XII. Then, there was great interest centered on the man named Richard Feynman, and so another book followed: Surely You’re Joking, Mr. Feynman! (Rs. 346) By the time I’d finished reading it, I had been introduced to that argumentative Scottish coot named David Hume, whose Selected Essays (Rs. 425) sparked my initial wonderment at logical positivism as well as a torpor induced by his verbosity (in these terms, his only peer is Thomas Pynchon (Against the Day, Rs. 800), and I often wonder why many call for his nomination for a Nobel Prize in literature. The Prize is awarded to good writers, right? Sure, he writes grandiose stuff and explores sensations and times abstract to everyone else with heart-warming clarity, but by god do you need a big attention span to digest it! In contrast: Vargas Llosa!).
I realized that if I had to follow what Hume had to say, and then Rawls, and then Sen (The Idea of Justice, Rs. 374) and Kuhn (The Structure of Scientific Revolutions, Rs. 169 – the subject of my PG-diploma’s thesis) and Kant, and then Schopenhauer, Berkeley and Wittgenstein, I’d either have to study philosophy after school and spend the rest of my days in penurious thought, or I’d have to become rich and spend the rest of my days buying books while not focusing on work.
An optimum course of action presented itself. I had to specialize.
But how does one choose the title of that school of thought one finds agreeable without perusing the doctrines of all the schools on offer? I was back to square one. Then, someone suggested reading The Story of Philosophy (Rs. 230) by Will Durant. When I picked up a copy at a roadside bookstore, I suspected its innards had been pirated, too: the book would have been better suited to the hands of someone in need of a quick-reference tool; the book didn’t think; the book wasn’t the interlocutor I was hoping it would be.
I wanted dialogue, I wanted dialectic in the context of Heinrich Moritz Chalybäus’ thesis (Systems of Speculative Ethics, as translated by Alfred Edersheim, 1854 – corresponding to System of Speculative Philosophy by G.W.F. Hegel). I wanted the evolution of Plato (The Republic, Rs. 200), Aristotle (Poetics, Rs. 200), Marcus Aurelius (Meditations, Rs. 200). That was when I chanced upon George Berkeley’s Principles of Human Knowledge (Rs. 225) and Three Dialogues Between Hylas and Philonous (Rs. 709). Epistemology began then to take shape; until that moment, however, it was difficult to understand the inherently understood element as anything but active-thought. Its ontology started to become clear – and not like it did in the context of The Architecture of Language by A. Noam Chomsky (Rs. 175), which, to me, still was the crowning glory of naturalist thought.
Where does the knowledge, “the truth”, of law arise from? What is the modality within which it finds realization? Could there exist an epistemological variable (empirically speaking) the evaluation of which represents a difference between the cognitive value of a statement of truth and that of a statement of law? Are truths simply objective reasons whose truth-value may or may not be verifiable?
Upon the consumption of each book, a pattern became evident: all philosophers, and their every hypothesis, converged on some closely interrelated quantum mechanical concepts.
Are the mind and body one? Does there exist an absolute frame of reference? Is there a unified theory at all?
Around the same time, I came to the conclusion that advanced physics held the answers to most ontological questions – as I have come to understand it must. Somewhere-somewhen in the continuum, the observable and the unobservable have to converge, coalesce into a single proto-form, their constituents fuse in the environment afforded them to yield their proto-reactants. Otherwise, the first law of thermodynamics would stand violated!
However, keeping up with quantum mechanics would be difficult for one very obvious reason: I was a rookie, and it was a contemporary area of intense research. To get around this, I started with the subject’s most pragmatic parts: Introduction to Quantum Mechanics by Powell & Crasemann (Rs. 220), Solid State Physics by Ashcroft & Mermin (Rs. 420), Quantum Electrodynamics by Richard Feynman (Rs. 266), and Electromagnetic Systems and Radiating Waves by Jordan & Balmain (Rs. 207) were handy viaducts. Not that there weren’t any terrors in between, such as Lecture Notes on Elementary Topology and Geometry by Singer & Thorpe.
At the same time, exotic discoveries were being made: at particle colliders, at optical research facilities, in deep space by probes and ground-based observatories, and within the minds of souls more curious than mine. Luckily for me, the literature corresponding to all these discoveries was to be found in one place: the arXiv pre-print servers (access to which costs all of nothing). These discoveries included quantum teleportation, room-temperature superconductivity, supercomputers, metamaterials, and advancements in ferromagnetic storage systems.
(I also was responsible for discovering some phenomena exotic purely to me in this period: cellular automata and computation theory – which I experimented with using Golly and Mirek’s Cellebration, and fuzzy logic systems and their application in robotics – experimented with using the Microsoft Robotics Developer Studio.)
What did these discoveries have to do with Hume’s positivism? That I could stuff 1 gigabyte’s worth of data within an inch-long row of particles championed empiricism, I suppose, but beyond that, the concepts’ marriage seemed to demand the inception of a swath of interdisciplinary thought. I could not go back, however, so I ploughed on.
A Brief History of Time (Rs. 245) did not help – Hawking succeeded splendidly in leaving me with more questions than answers (Gravitation and Cosmology: Principles and Applications of the General Theory of Relativity by Steven Weinberg (Rs. 525) answered some of them). The Language Instinct by Harvard-boy Steven Pinker (Rs. 450) charted the better courses of rationality into sociology and anthropology, whereas my intuition that Arundhati Roy would reward governance with a similar fashion of rational unknotting was proved expensively very right: The Algebra of Infinite Justice, at Rs. 302, lays bare all the paradoxes that make India India.
For literature, of course, there were Orhan Pamuk and Umberto Eco, Lord Tennyson and Sylvia Plath, de Beauvoir, le Guin and Abbott to fall in love with (My Name is Red (Rs. … whatever, it doesn’t matter!), The Name of the Rose, and The Mysterious Flame of Queen Loana are to be cherished, especially the last for its non-linear narration and the strange parallels waiting to be drawn with hermeneutics, such as the one delineated by E.H. Carr in his What Is History?). Plath’s works, of course, were an excursion into the unexplored… in a manner of speaking, just as le Guin’s imagination and Abbott’s commentary are labours unto the familiar.
Learn to like ebooks. Or turn poor.
Ultimately, that was all that I learnt. Quite romantic though being an autodidact may sound, the assumption of its mantle involves the Herculean task of braiding all that one learns into a single spine of knowledge. The more you learn, the farther you are from where you started; the farther you travel, the more you have learnt and the more ambitious you become… I cannot foresee an end.
Currently, I am reading One Day in the Life of Ivan Denisovich by Soviet-era exile Alexander Solzhenitsyn (war-time dystopian fiction became a favourite along the way after reading a history of firearms in Russia, a history of science and technology in Islam, How Things Work gifted to me by my father when I was 11, and Science and Civilisation in China by Needham & Gwei-Djen (Rs. 6,374 – OK, now it matters)) and Current Trends in Science: Platinum Jubilee Edition – Indian Academy of Sciences, lent to me by Dr. G. Baskaran. At each stage, a lesson to be learnt about the universe is learnt, a minuscule piece told in the guise of one author’s experiences and deductions to fit into a supermassive framework of information that has to be used by another’s intelligence. A daunting task.
Incorrigible, indeterminable, the stately constant walks alone: there are none to surmount her prevalence, none in whose company she may sit and chat and sip some tea. Of course, there was e, but e was a few worlds away at 2.71. She was truly by herself, a severe face of changelessness to poets and adventurers, a reassuring one of constancy to mathematicians and thinkers, a staid figure in an arena of labouring laws.
According to those loyal to her, however, the most beautiful of the daughters of Nature.
Today is World Pi Approximation Day. It’s a day celebrating the value 22/7, which is as close as a crass fraction can get to beauty, a purveyor of simulacra, vile manufacturer of nostalgia, of polyurethane histories and plastic memories. Pi… cannot ever be fully understood, and isn’t meant to be. She walks in quiet grace, abandoning perfect arcs meandering in her wake, and it is there that mere mortals such as ourselves discover her shadow seeping into the infinite omnipotence.
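Just how close the crass fraction gets, and where it parts ways with her, is quick to check – a toy comparison in Python:

```python
import math

approx = 22 / 7                  # the Pi Approximation Day fraction
error = approx - math.pi         # how far it strays from the real thing

print(f"22/7 = {approx:.10f}")
print(f"pi   = {math.pi:.10f}")
print(f"off by {error:.10f}")    # overshoots pi by roughly 0.00126

# 22/7 agrees with pi only to two decimal places; 355/113 does far
# better, but then 355/113 never had a day named after it.
```

Two correct decimal places: that is the whole of the simulacrum being celebrated.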
The history of Gotham city is not unlike that of many American cities during British colonial rule. It was founded in 1635 by a Norwegian mercenary and was later taken over by the British, changing hands various times over the years. According to Alan Moore, the famous comics writer and creator of such titles as Watchmen and V for Vendetta, Gotham city was the site of many mysterious occult rites during the American Revolutionary War (Swamp Thing #53).
A separate history was provided for by Bill Willingham (Shadowpact #5): an evil warlock has slept for 40,000 years under the place where Gotham city is built, with his servant Strega claiming the “dark and often cursed character” of the city was inherited from the warlock’s nature. Going by either story, the city assumes a post-Apocalyptic mood that is also Gothic at the same time, and accords it an ambivalence that invites literary exploitation.
This mood has since been open for modification by writers, more so after the chain of events set off by the villain Ra’s al Ghul. He introduced a virus called the Clench, impacting the city greatly. Just as it was recuperating, it was hit by an earthquake measuring 7.6 on the Richter scale, prompting the federal government of the United States to cut Gotham off from the mainland because it saw no hope of rehabilitating it. However, respite arrived in the form of assistance from the brilliant billionaire Alexander “Lex” Luthor, Superman’s archenemy.
In this regard, there are many comparisons to be made to Mumbai, which is itself a set of seven islands, is constantly assaulted by terrorists, and often finds support not from the government but from unexpected quarters (but, it must be said, not as unexpected as Luthor). By extension, the residents of Gotham city are also likely to be more resilient and resourceful than the residents of other cities, and possibly quite cynical, too.
Everything about Gotham city is rooted in its mysticism-ridden history and the fights fought between the region’s native tribes and evil powers. The first signs of modern civilization arose in the 19th century when, after the tribes abandoned the region – infested, they claimed, by evil powers – Gotham Town was born as a reputable port.
Around the same period, in 1799, Darius Wayne profited from his labours on the port and started the construction of Wayne Manor, one of the precursors of the city’s cocktail of Gothic, Art Nouveau and Art Deco architectures. The manor itself is what one would call “stately”. It is located toward the northeast of the city, removed from the clamour of urbanism and allowing Batman, or Bruce Wayne – Darius Wayne’s descendant – to plan his adventures in peace.
Exclusivity v. Justice
The isolation of the manor parallels the isolation of Wayne’s personality from Batman’s: the former is portrayed as a dilettante indulging in the wealth of his forefathers whereas the latter is portrayed as a vigilante the city seems to subconsciously need. At the same time, however, it is hard to say what the difference might have been had Wayne Manor been situated inside the city. In this regard, there is a notion of social exclusivity in terms of the spaces occupied within the city.
A good case in point for this would be the older part of Gotham, which is situated to the north of the city and generally considered a part of the city itself. Old Gotham is where Crime Alley (which includes the Bowery, the worst neighbourhood in all of Gotham), Arkham Asylum (albeit as an island – visible to the east of a forked New Trigate Bridge), and Amusement Mile (the stalking grounds of the Joker) are located. Therefore, the new city, developing on the principles of reformation and citizen-vigilantism, grew southward and away from its traditional centres of trade, finance, and commerce.
Disregarding the depiction of Gotham’s architecture in the Burton and Nolan movies and the TV series: another of Wayne’s ancestors, Judge Solomon Wayne, was, according to Moore, the inspiration for the city’s unique architecture. Solomon’s intention to reform the city and rebrand it, so to speak, resulted in his commissioning the young architect Cyrus Pinkney to design and construct the city’s financial centre. Moore’s choice of this explanation coincides perfectly with the period of Gothic Revivalism (around the early 1890s).
Growth v. Justice
Justice within a city is not administered in a court of law nor does it arise out of the adherence to rules and ethics. It is a product of many of the city’s provisions, their accessibility, and how well they work together to give rise to a sense of social security and provide a livelihood. For instance, Gotham’s common man could be working a nine-to-five day-job at some company in One Gotham Centre, just down the road from Wayne Tower, living in the suburbs around the Knights Dome Sporting Complex, within swimming distance of Cape Carmine off Old Gotham, and supporting a family of three.
However, this is not social justice. The need for social justice arises when aspirations, income and social liberty don’t coincide: if the nearest amusement park is haunted by a psychopathic serial killer, if a trip to the airport requires a drive past Arkham Asylum, if affordable housing comes at the price of personal security, and, most importantly, if there is the persistent knowledge that a masked vigilante must be relied upon for any measurable sense of appeal against the odds. It is as if the city was carefully misplanned: the Gotham city everyman is someone forced to live in a dangerous neighbourhood for lack of other options for shelter.
In other words, social justice implies a perfect city and, therefore, by definition, can be neither omnipresent nor omnipotent, especially since Gotham city falls under the umbra of laissez-faire economics. As a corollary, to understand social justice within a city, we must understand where the city’s priorities lie. How has the city been developing in the last few years? Is economic equality rising or falling? Who within the city has ease of access to valuable resources and who doesn’t?
The problem with studying Gotham city is that it was conceived as a negative space, a battleground where the forces of good and evil meet. It has deliberately been envisioned as a child of the industrial revolution entombed within walls of steel and stone, overwhelming those living within it by the enforcement of a systematic way of life that allows for the exercise of few liberties. This is what effectively paints the picture of Gotham city as a failed one. In fact, this very way of thinking is paralleled in the image of the metropolis in Blade Runner (1982), whose Modern-expressionist production design was borrowed inefficiently by Barbara Ling for Joel Schumacher’s Batman Forever (1995) to imply a wildly whimsical side to the city. Anyway, this is how we understand the need for Batman, and how that need has been and is created.
It begins with the blighting of the police force: the superhero can become a societal fixture only if there is something fundamentally wrong with the one other body that is responsible for keeping crime in check. The Gotham City Police Department (GCPD) was corrupt for a long time, especially under the leadership of Commissioner Gillian Loeb, who had his hands in the pockets of the Falcone, Galante and Maroni crime families amongst others. The social scene inspired by such a network could be compared to the conspiratorial mood in the movie L. A. Confidential (1997).
By the time Commissioner James Gordon took over after Loeb’s successor Jack Grogan, the GCPD was overrun with lawlessness. Because of such a poor tradition, public authorities who should have been present to assuage the suffering of the historically discriminated were instead present to exacerbate, and profit from, the discrimination. Seeing that the GCPD couldn’t be cleaned from the inside, Gordon enlisted the skills of Batman, a veritable outsider, a deus ex machina.
Once the cleansing was complete, the city could formally begin on its path of reformation. Here is where the question of economic equality arises: when weeding out criminals, did the police department assume a rehabilitative approach or a retributive one? If the movies and TV series based on the comic may be trusted, then retribution was the order of the day, perhaps born out of an urgent need to do away with everything that has plagued the city and start anew.
At the same time, retribution also implies that enforcers of the law – and Batman – were willing to show no patience toward how the city itself was creating many criminals. This lack of patience is also reflected in many of the urban development projects undertaken by the city’s planning commission, especially such ill-conceived ones as the Underground Highway, as if the officials decided that desperate measures were necessary. (The ultimately-abandoned Underground Highway later went on to become the hideout of Killer Croc, apart from becoming the home for many of the city’s homeless – an indication that the forces of corruption at work were creating poverty.)
It can be deduced from all these threads that Gotham city is not simply a product of its history, which continues to influence the way outsiders think of it, but also its inability to cope with what it is fast becoming: a kennel for superheroes to flourish in. There are many decisions at work in the city that collude to create injustice in many forms, and the most significant ones are geographic exclusivity, a retributive mindset in the ranks of the executive, restriction on the exercising of social liberties based on past mistakes, and the presence of Batman himself.
This update is 6 days old, but it hasn’t made any more sense with time. Perhaps it was the way it was written – my opinion: the stress on the financial benefits of offsetting local plutonium storage with monetary compensation is alarming. That Germany will pay the UK to store this ridiculously dangerous material, that the UK will risk political backlash because the “financial benefits from the title transfer will exceed the long-term costs of the material’s safe storage and management”, that France will then supply processed MOX fuel for use in German reactors, that the UK will then argue that it is glad it has been spared the trouble of shipping plutonium while implying that it is comfortable being the site of nuclear waste storage… are all alarming developments.
Why? Because, even though I’m pro-nuclear, the backlash that could arise out of this could negate years of progress in developing MOX-processing technologies and installing them in the middle of the energy policies of three countries. One problem is already obviously foreseeable: Germany’s reluctance to continue its reliance on nuclear power is simply short-sighted. If it requires any more power in the future, it will have to purchase it from France which, amid the knee-jerk shutdown of NPPs worldwide after the Fukushima incident, is displaying enough sense to keep relying on NPPs. By then, I hope, monetary advantages will not suffice to mask the reality that Germany would be paying to have France purchase its troubles. Unless, of course, there is some other agreeable form of risk-transfer.
The way ahead for particle physics seems dully lit after CERN’s fourth-of-July firecracker. The Higgs announcement got everyone in the physics community excited – and spurred a frenzied submission of pre-prints all rushing to explain the particle’s properties. However, that excitement quickly died out after ICHEP ’12 presented nothing else of significance – nothing even a fraction as significant as the ATLAS/CMS results.
Even so, I suppose we must wait at least another 3 months before a conclusive Higgs-centric theory emerges that completely integrates the Higgs mechanism with the extant Standard Model.
The spotting of the elusive boson – or an impostor – closes a decades-old chapter in particle physics, but does almost nothing to point the way ahead apart from verifying the process of mass-formation. Even theoretically, the quadratic divergences in the mass of the SM Higgs boson prove to be a resilient barrier to correct. How the Higgs field will be used as a tool in detecting other particles and the properties of other entities is altogether unclear.
The tricky part lies in working out the intricacies of the hypotheses that promise to point the way ahead. The most dominant amongst them is supersymmetry (SUSY). In fact, hints of the existence of supersymmetric partners were recorded when the LHCb detector at the LHC spotted evidence of CP-violation in muon-decay events (at 3.9σ). At the same time, the physicists I’m in touch with at IMS point out that rigid restrictions have been instituted on the discovery of sfermions and bosinos.
The energies at which these partners could be found are beyond those achievable by the LHC, let alone the luminosity. Moreover, any favourable-looking ATLAS/CMS SUSY-results – which are simply interpretations of strange events – are applicable only in narrow and very special scenarios. Such a condition is inadmissible when we’re actually in the hunt for frameworks that could explain grander phenomena. Like the link itself says,
“The searches leave little room for SUSY inside the reach of the existing data.”
Despite this bleak outlook, there is still a possibility that SUSY may stand verified in the future. Right now: “Could SUSY be masked behind general gauge mediation, R-parity violation or gauge-mediated SUSY-breaking” is the question (gauge-mediated SUSY-breaking (GMSB) is when some hidden sector breaks SUSY and communicates the products to the SM via messenger fields). Also, ZEUS/DESY results (generated by e-p DIS studies) are currently being interpreted.
However, everyone knows that between now and a future that contains a verified-SUSY, hundreds of financial appeals stand in the way. 😀 This is a typical time of slowdown – a time we must use for open-minded hypothesizing, discussion, careful verification, and, importantly, honest correction.
If publishers never imagined that there are people who could teach themselves particle physics, why price preliminary textbooks cheaply and advanced textbooks so ridiculously high? Learning vector physics for classical mechanics costs Rs. 245, while progressing to analytical mechanics costs Rs. 4,520. Does the cost barrier exist because the knowledge is more specialized? If so, such books should have become cheaper over time. They have not: Analytical Mechanics, which a good friend recommended, has stayed in the vicinity of $75 for the last three years (now, it’s $78.67 for the original paperback and $43 for a used one). This is just a handy example. There are a host of textbooks that detail concepts in advanced physics and cost a fortune: all you have to do is look for those that contain “hadron”, “accelerator”, “QCD”, etc., in their titles.
Getting to a place in time where a student is capable of understanding these subjects is cheap. In other words, the cost of aspirations is low while the price of execution is prohibitive.
Sure, alternatives exist, such as libraries and university archives. However, that misses the point: it seems the costs of the books are kept high to prevent their ubiquitous consumption. No other reason seems evident, although I am loth to reach this conclusion. If you, the publisher, want me to read such books only in universities, then you are effectively requiring me either to abstain from reading them if my professional interests reside elsewhere or to depend on universities and university–publisher relationships, not myself, for my progress in advanced physics. The resulting gap between the layman and the specialist eventually evades spanning, leading to ridiculous results ranging from misreading the “God” in “God particle” to questioning the necessity of the LHC without quite understanding what it does and how that helps mankind.