Individual Americans met here in NZ are virtuous, but the country as a whole is not. They are voracious in the way they consume and pollute vastly beyond their share. They are violent; without any reflection they vitiate all around them, constructing war after war, on poverty, cancer, drugs, terrorism, to no effect. They are vain, ventose, and vacuous in their belief that they are ‘special’ while revealing their stupidity and ignorance. Their business model is based on vice, with corruption being vital for the rich because, for them, money is all. They are vandals because they use bombs and violence before diplomacy. When countries are victorious over US invasions and domination they are vindictive and vengeful, and will pursue a vendetta forever. In their way of communicating they are vulgar and vitriolic. Their use of torture is vile. They are venal in the way they do not keep promises and treaties. They would rather pay more for private business services than vindicate a public service available to all. We must avoid being a vassal, give them a valediction, and go our own way with independent valour.
In the ideology of capitalism and the free market, an enterprise is seen as economically justified and efficient provided that external costs are absent or are internalised as costs on the enterprise. There must be no externalities. The externalised costs of the small tinkers, tailors, cobblers, etc. were never great; a bit of local social pressure could take care of them. But nowadays the externalised costs imposed on a public that does not benefit from the industry can be enormous, now that such large and remote corporations have so much influence on economies. We only have to think of the Bhopal disaster and the evasion of responsibility that followed to recognise this. In obviously doubtful situations we are encouraged to take a precautionary approach. I see a need for the public that might be on the receiving end of significant externalities to have some influence in avoiding them. I conceive of an option a bit like a legal injunction, but without the high legal cost. People should be able to put a business that is seen to be avoiding the care of such an externality on formal notice that it is responsible for the foreseen problem. The managers would then be unable to avoid being labelled as reckless, and would be individually responsible for damaging consequences. There should then be no escape from responsibility for the consequences.
Most teachers of economics concern themselves only with capitalism as the economic model of interest. They support with enthusiasm the form generally known as neo-liberal economics. Its features include the worship of what they call the free market, and the idea that in the name of freedom there should be minimalist government and deregulation. The privatisation of government assets is held to be the most important way of making progress, because of the belief that free markets always result in more efficient business than state-run activities.
Economists consider themselves scientists, but that is false. They do not use data that they ought to establish, but depend on assumptions and assertions. The idea of markets that reach an equilibrium where supply and demand naturally meet is claimed to give economic stability, yet the experience of booms and busts belies that prediction. This economic hypothesis has been falsified convincingly by recent experience: the orthodox economists did not foresee the global financial crisis of 2008.
The ideology that desires minimum government demands tax cuts, so the state is forced to reduce social support. The propaganda claims that the retention of money by capitalists will mean that they create jobs, so that wealth will trickle down to the working non-capitalists. This has not happened, falsifying that hypothesis. The ability of capital to borrow more capital is a positive feedback mechanism that causes inequality to grow in favour of those who already have capital. Mathematically, the most efficient distribution of wealth and income is an egalitarian one.
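The positive-feedback claim can be put in arithmetic with a toy sketch. This is my own illustration with entirely hypothetical figures (wage saving, starting capital, rate of return), not a model from any economics text: give two savers identical wage savings, but let one start with capital that earns a compound return, and the gap grows geometrically.

```python
# Toy sketch of the positive-feedback mechanism described above.
# All figures are hypothetical; this is an illustration, not a model.

def simulate(years, wage_saving=5_000, starting_capital=500_000,
             return_on_capital=0.07):
    """Two savers put aside the same wage saving each year; one also
    earns a compound return on the capital already held."""
    worker = 0.0
    capitalist = float(starting_capital)
    for _ in range(years):
        worker += wage_saving
        capitalist += wage_saving + capitalist * return_on_capital
    return worker, capitalist

w, c = simulate(30)
print(f"after 30 years, worker has {w:,.0f} and capitalist {c:,.0f}")
print(f"the gap has grown to about {c / w:.0f} times")
```

Because the capitalist's growth term is proportional to capital already held, the ratio between the two keeps widening for as long as the simulation runs.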
That de-regulation is the harbinger of freedom is a noxious idea. Time and again de-regulation has resulted in deaths and injuries to workers and in the decay of financial ethics. This is another falsification for those who think scientifically.
The idea that privatisation will bring efficiency to industrial and social activities is another hypothesis that has been falsified. Many times the state has had to intervene in some way to repair the failures of this hypothesis.
There is a significant faith in the idea that if businesses act selfishly, all will benefit (the trickle-down idiocy again). Time and again the selfish actions harm others. The adulteration of food and production shortcuts have not given the consumer safety and product reliability. Where public assets are used for profit, the assets are over-exploited and everyone loses (the Tragedy of the Commons).
Humans are naturally social beings. By working together for a common outcome, advantages accrue for everyone. Competition is seen as good, ignoring that it tends to produce more losers than beneficiaries. The goal of competition is to win by taking over the losers, creating a monopoly with free rein to exploit that position. Co-operation bypasses the need to create losers. True co-operation benefits all.
Many activities of the state have a social purpose. The direction of these activities on a commercial basis has produced many failures. Rather than put business graduates in charge of these operations, we need people whose careers have been spent advancing the social intentions to move into leadership. It is nonsense to put some fancy economist in charge of education, for instance.
Economic theory in textbooks is conducted on the basis of nice graphs. They are usually given no quantitative graduations on the axes. Often a table of economic relations is used which is based not on actual measurements but on figures imagined to make a theoretical point. The supply ‘curve’ is an example: often a nice straight line that sometimes intersects the origin of the graph, and even occasionally indicates a positive quantity of supply when the price is zero. The graph shows a relation in which the price is a fixed amount plus a factor multiplied by the quantity supplied. Reverse engineering this, the total cost of production is the price multiplied by the quantity of production, so it becomes the fixed amount multiplied by the quantity of supply plus the factor multiplied by the square of the quantity of supply. That the cost of supply should depend on the square of the supply quantity seems nonsensical.
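The reverse engineering described above is simple arithmetic. With hypothetical numbers for the intercept and slope of a straight-line supply ‘curve’ (my own made-up figures), the implied total cost is quadratic in the quantity:

```python
# Hypothetical straight-line supply "curve": price = a + b * quantity.
a, b = 2.0, 0.5

def price(q):
    return a + b * q

def implied_total_cost(q):
    # Reading the supply price as the unit cost of supplying q units:
    # cost = price * q = a*q + b*q**2, quadratic in the quantity.
    return price(q) * q

for q in (10, 20, 40):
    print(f"q={q:3d}  price={price(q):5.1f}  total cost={implied_total_cost(q):7.1f}")
```

With these figures, doubling the quantity from 10 to 20 more than triples the implied total cost (70 to 240), which is the oddity the paragraph points at.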
A different graph is presented in the chapters on marginal costing in competitive markets, with a curve of average cost against quantity that is close to a parabola. When I reverse engineer it to get the total cost of production, the first, downward part of the curve gives a cost curve close to reality: the cost is the sum of a fixed cost and a variable cost multiplied by the quantity of production. But reverse engineering the rest of the upward-trending curve gives a sharply rising variable cost. One graph I have just reverse engineered shows the variable cost rising to about eight times its initial value. As the materials and services part of the variable costs should not change, this exaggerates the labour part of the variable costs. Economics textbooks depend on believing that the productivity of labour declines at scale, but surely not by a factor of more than eight.
The cost structure of producers is something that a scientist could obtain but economists depend on assumptions rather than a scientific approach.
I have a thought that economics is more like a religion than a science.
Most economics textbooks take a similar form. They start with an introduction that proposes a “production possibility” advantage to an economy. They make the assumption, with two alternative products graphed on a convex curve, that a mixture gives a better result than just specialising in either one. They justify this with the assertion that there are diminishing returns to labour, and that avoiding specialisation therefore stops this.
They overlook the efficiency gains of large capital investments when specialising. Capital costs of manufacturing plant usually follow a power law, which means the capital costs do not expand at the same rate as the plant capacity expands. In chemical plants the index of the power law varies. For an aluminium production plant the power index is 0.80, meaning that to double the production capacity, the plant will cost about 75% more. A caustic soda manufacturing plant will cost only 30% more for a plant of twice the capacity. Other chemical plants fall between these extremes. Doubling the plant size does not always mean doubling the labour force either.
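The scale-up rule referred to is conventionally written cost2 = cost1 × (capacity2/capacity1)^n. A quick check of the figures quoted above (my arithmetic, not the author's):

```python
import math

def scaled_cost(base_cost, capacity_ratio, index):
    # Power-law scale-up: cost grows as capacity_ratio ** index.
    return base_cost * capacity_ratio ** index

# Aluminium plant, index 0.80: doubling capacity costs about 74% more,
# consistent with the ~75% quoted above.
extra = scaled_cost(100.0, 2.0, 0.80) - 100.0
print(f"aluminium plant: {extra:.0f}% extra cost for double capacity")

# "Only 30% more for twice the capacity" implies an index of about 0.38.
implied_index = math.log(1.30) / math.log(2.0)
print(f"implied caustic soda index: {implied_index:.2f}")
```

Any index below 1.0 means unit capital cost falls as the plant grows, which is the economy of scale the textbooks' diminishing-returns story leaves out.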
Do economists exaggerate the diminishing returns to labour? They depend on assumptions rather than collecting data. They never seem to know that in some cases larger material orders can receive a discounted price. My thought is that they have no experimental backing for their assertions. And this is only an introductory assertion. If economists are going to pretend to be scientific they had better do more measurements, stop assuming, and test their theories with real data.
When it comes to trade they produce the idea of “comparative advantage”, which supports specialisation and thereby contradicts the phoney production possibility theory.
Using the same computer and the same program and the same data there was a different result. How can that be? Was I responsible?
It was the IBM computer at Shell, on which I had just updated the IBM operating system because bugs in the database processing had been fixed. Shell had a big computer with 64K of memory while Caltex and BP had only 32K. That meant that Shell had the responsibility for doing the refinery planning using IBM’s Linear Programming package, which required at least 44K. The oil companies shared the one oil refinery. The package would find the optimum output for the refinery operation given a number of technical constraints.
The constraints can be represented by a polygon in many dimensions, within which the answer resides at an extremity that maximises the profitability of the output. The program climbs along the edges of this imaginary multi-dimensional polygon, choosing the best route until it can go no further, giving the best result while staying just inside the limiting polygon. A lot of number crunching is involved, best described as matrix calculations. When a computer does divisions the results are usually imprecise, as they get truncated, losing a little precision on the way. The program kept storing a list of numbers called eta-vectors. These were used in a matrix re-inversion process to recover the lost precision. Because the new operating system took up more memory, this matrix re-inversion happened at a different stage, a different choice was made of which edge of the polygon to follow, and the run ended at a different answer. Nobody in NZ was specialised enough to understand all this, but the Shell headquarters in The Hague promptly telexed the explanation. In the original printout there was an ‘A’ which had been disregarded: a message to say that there was a valid alternative solution, which we uncovered in the second run of the optimisation. Panic over! I have kept the telex as a souvenir. I think this is something to remember with computer modelling: precision can be lost.
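The precision loss is not peculiar to a 1960s LP package; it appears in any floating-point arithmetic. A classic illustration (my own, assuming the IEEE double precision used by modern Python, not IBM's old arithmetic) is adding 0.1 ten times; compensated summation then plays a role analogous to the package's periodic matrix re-inversion, recovering the precision that plain accumulation loses:

```python
import math

# Naive accumulation: each addition rounds the running total to the
# nearest representable binary fraction, and the tiny errors accumulate.
total = 0.0
for _ in range(10):
    total += 0.1
print(total == 1.0)        # False in IEEE double precision
print(abs(total - 1.0))    # the accumulated rounding error

# Compensated summation (math.fsum) recovers the exact result,
# analogous to the LP package's periodic matrix re-inversion.
print(math.fsum([0.1] * 10) == 1.0)   # True
```

The drift here is tiny, but in an optimisation that repeatedly pivots on such numbers, a drifted value can tip the algorithm down a different edge of the polygon, exactly as happened with the refinery run.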
The Christian God is usually claimed to be a loving God who is concerned with people, their morals and their welfare. Since we do not really know the mind of God, we must consider alternative hypotheses. God, if one really exists, could be completely indifferent to the welfare of the residents of our little planet. Or it could be the opposite of the Christian God concerned with the good of people: it could actually be evil, play games with us, and amuse itself with human disasters.
In the Lisbon earthquake of 1531 it was the devoutly faithful who were killed by the church collapses. If the idea of a God being able to affect events is true, then the killing of the faithful would be the work of an evil God. Lightning used to kill a lot of church bell-ringers. Earthquakes, floods, landslides, and other natural disasters have caused the premature deaths of many people, good and bad alike.
As an atheist I have survived trying to breathe water, being washed away in a river, dodging an avalanche by a few metres, falling into a crevasse, being entombed while the oxygen was depleted by a primus, and being caught in a fatal avalanche, and yet every opportunity for a God to punish me was passed up. Considering this as evidence, the Good God hypothesis is not supported, and the other hypotheses seem more likely.
The first idea of a big bomb appeared in a novel written by a NZ ex-prime minister, Julius Vogel. His novel “Anno Domini 2000; or, Woman’s Destiny”, published in 1889, makes many extraordinary predictions about the future, including a “means of unleashing a cataclysmic explosion”. Vogel predicted the internet, jet aircraft, women Prime Ministers, and more.
H G Wells’s novel “The World Set Free”, written in 1913 and published in 1914, was based on a prediction of nuclear weapons of a more destructive and uncontrollable sort than the world had yet seen. Wells’s “atomic bombs” have no more force than ordinary high explosive; they consist of “lumps of pure Carolinum” that induce “a blazing continual explosion” with a half-life of seventeen days, so that the bomb is “never entirely exhausted” and “to this day the battle-fields and bomb fields of that frantic time in human history are sprinkled with radiant matter, and so centres of inconvenient rays.” The novel is dedicated “To Frederick Soddy’s Interpretation of Radium”, a volume published in 1909. Soddy was a chemist and assistant to Rutherford in Canada when Rutherford developed his theory of transmutation, which got Rutherford a Nobel Prize. Soddy got a Nobel Prize of his own for his work on isotopes. Soddy is also remembered for some books on economics, which makes him an economist who got a proper Nobel Prize.
Most chemical explosives are some form of chemical nitrate. Sodium or potassium nitrate supplies the oxygen source in gunpowder. There are ammonium nitrate, nitrocellulose, nitroglycerine, trinitrotoluene (TNT), cyclotrimethylenetrinitramine (RDX), pentaerythritol tetranitrate (PETN), and so on. These materials detonate at different rates: RDX has a detonation velocity of 8,750 m/s while ammonium nitrate has a velocity of only 5,270 m/s.
The Munroe effect, created by concave spaces in the explosive material, is an important factor in the use of explosives for demolition, as the explosive force becomes focussed. It is the reason that explosives can puncture armour plate: a conical space focusses a plasma jet.
The biggest chemical explosion ever was at Halifax, Nova Scotia, Canada, on the morning of 6 December 1917. SS Mont-Blanc, a French cargo ship laden with high explosives, collided with the Norwegian vessel SS Imo, resulting in a fire and then an explosion rated at about 2.9 kt of TNT. The Richmond area of the city was destroyed, killing about 2,000 people.
Wells’s novel may even have influenced the development of nuclear weapons, as the physicist Leó Szilárd read the book in 1932, the same year the neutron was discovered by Chadwick, a co-worker of Rutherford (who had predicted the existence of the neutron). Chadwick got a Nobel Prize too.
Szilard was a Hungarian refugee helped by Rutherford’s Academic Assistance Council, which was set up to find jobs for Jews fleeing Europe. He had read an article in The Times summarising a speech by Lord Rutherford in which Rutherford rejected the feasibility of using atomic energy for practical purposes. The speech remarked specifically on the recent 1932 work of his students, John Cockcroft and Ernest Walton, in “splitting” the lithium atom into alpha particles by bombardment with protons from a particle accelerator they had constructed. In London, where Southampton Row passes Russell Square, across from the British Museum in Bloomsbury, Leo Szilard waited irritably one grey Depression morning for the stoplight to change. A trace of rain had fallen during the night; Tuesday, 12 September 1933, dawned cool, humid and dull. Drizzling rain would begin again in early afternoon. He often walked to think. In any case another destination intervened. The stoplight changed to green. Szilard stepped off the curb. As he crossed the street, time cracked open before him and he saw a way to the future, death into the world and all our woes, the shape of things to come. Szilárd had conceived the idea of the neutron chain reaction: an atom fissioned by a neutron might break apart, producing even more neutrons. He filed for patents on it in 1934.
Rutherford went on to say: “We might in these processes obtain very much more energy than the proton supplied, but on the average we could not expect to obtain energy in this way. It was a very poor and inefficient way of producing energy, and anyone who looked for a source of power in the transformation of the atoms was talking moonshine. But the subject was scientifically interesting because it gave insight into the atoms.”
Szilard went to see Rutherford. It is recorded that Rutherford “threw him out”, although this probably means that Rutherford did not support his idea. Rutherford is quoted as saying: “Fortunately at the present time we had not found out a method of so dealing with these forces, and personally I am very hopeful we should not discover it until man was living at peace with his neighbour.”
At the beginning of 1939, Niels Bohr brought news to New York of the discovery in Germany of nuclear fission, in which neutron impacts produce more neutrons, by Otto Hahn and Fritz Strassmann, and of its theoretical explanation by Lise Meitner and Otto Frisch. When Szilard found out about it on a visit to Eugene Wigner at Princeton University, he immediately realised that uranium might be the element capable of sustaining a chain reaction.
Szilard drafted a confidential letter to the President, Franklin D. Roosevelt, explaining the possibility of nuclear weapons, warning of the German nuclear weapon project, and encouraging the development of a program that could result in their creation. With the help of Wigner and Edward Teller, he approached his old friend and collaborator Einstein in August 1939, and convinced Einstein to sign the letter, lending his fame to the proposal. That famous letter initiated the Manhattan Project which developed two types of atomic bombs.
On 9 October 1941, President Roosevelt approved the atomic program after convening a meeting with Vannevar Bush and Vice President Henry A. Wallace. The Manhattan Project got fully underway under Major General Leslie Groves of the U.S. Army Corps of Engineers in 1942. Ironically, this was about the same time that Hitler decided not to pursue a German program; there was no intelligence about the German program, just assumptions.
Enrico Fermi, who was responsible for the first atomic pile, determined that a fissioning uranium atom produced 1.73 neutrons on average. Because the central nucleus is a very small proportion of an atom’s space, neutrons can easily escape without interaction in small volumes of uranium. Rutherford described the nucleus as like a fly in a cathedral (St Paul’s in London), with the electrons taking up most of the space. A volume in which the production of neutrons balances the neutrons initiating fissions plus the neutrons escaping defines a critical mass.
This led to one type of bomb in which two sub-critical pieces of uranium are shot together to form a critical mass, which explodes. 64 kg of enriched uranium was used: a cylinder of uranium smaller than the critical mass is fired into a set of hollow uranium cylinders to create a solid uranium cylinder above the critical mass. In a small piece, neutrons escape before hitting enough uranium nuclei to maintain fission, but when the two parts come together the extra neutrons can expand the fission. This became the design of the “Little Boy” nuclear weapon used on Hiroshima on 6 August 1945. The energy released was about equivalent to 13 kilotons of TNT, through the conversion of about 0.6 grams of matter. Since this idea was proven, many new devices have been constructed.
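The two figures quoted can be checked against each other with E = mc². This is my own arithmetic, using the conventional value of 4.184 × 10¹² joules per kiloton of TNT:

```python
C = 2.998e8              # speed of light, m/s
KT_TNT = 4.184e12        # joules per kiloton of TNT (conventional value)

energy_joules = 13 * KT_TNT        # the Hiroshima yield quoted above
mass_kg = energy_joules / C**2     # E = m * c**2, rearranged for mass
print(f"mass converted to energy: {mass_kg * 1e3:.2f} g")
```

The result comes out at about 0.61 g, agreeing with the “about 0.6 grams of matter” in the text.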
The Nagasaki bomb was the type in which a sphere of fissile plutonium is compressed by shaped explosives to form a critical mass that explodes. This is the basis of many later bomb developments; it is used as the trigger for fusion H-bombs.
I think that these weapons are so terrible that it is time to stop and dismantle them and have a nuclear free world. Having these weapons of mass destruction is a sign of a psychopathic state.
The growing stress in modern jobs and its effect on people’s health is something to be concerned about. Some people are expected to work long hours, under stress, and sometimes in dangerous conditions. Robots are taking over human work, but that should not mean that people are treated as machines. What is needed is the requirement that work conditions and expectations be determined by experts in human physiology alone.
Sleep is a human necessity, so experts in human physiology should tell us how many hours we should work, taking into account what kind of work is done. They know how much sleep we need and when. They know what harm is done by interrupted sleep, and so can set rules for having time for sleep. They should be able to determine how often we need food and toilet breaks. I think they could also have an opinion on the limit to how many days should be worked before a break.
They will know what lighting levels are required for various types of work. As well as the light levels needed for safe human working, they should be able to tell us what temperature range is suitable and how clean the air must be, free from dust and chemicals. They must have a view on what weight-lifting limits should be specified for people, and how much physical work should make up a full day.
Some modern jobs, like working at a computer, need consideration of the danger of RSI and the need to be mobile, away from the computer from time to time, as we know that sitting unmoving is bad for us. Physiologists should be the arbiters of such conditions. Work that requires a high level of concentration, as for pilots, traffic controllers, and drivers, can need more frequent breaks and spells to limit the concentrating work. It is physiologists who should determine that. The science of human physiology should be the arbiter of work conditions and hours.
Work at night, when the body is at its lowest, should be managed according to what the physiologists determine is reasonable. They should decide how changing shifts should accommodate human needs. While they are about making work rules, perhaps they should specify at what stage people performing different types of work should retire. Could they decide when people should stop work for their health?
Am I alone in my thoughts?
Peter LaRuffa, one of the staff at the Grace Fellowship Church (USA), has said: “If, somewhere within the Bible, I were to find a passage that said 2+2=5, I would believe it, accept it as true and then do my best to work it out and understand it.” If we counted 1, 2, 3, 5 or 1, 2, 4, 5 then the proposition would be true, but we count (in English) 1, 2, 3, 4, 5, etc. Some ancient hunter-gatherer peoples counted one, two, two-one, two-two, two-two-one, or one, two, three, one-three, two-three, three-three. But the Pastor is putting his faith in something known to be wrong. That means we cannot accept anything that Pastor says. Faith is the denial of observation so that a blind belief can be preserved.
I was sent to Bible classes but I must have seen through it early on, because my parents dropped my attendance. So I have not been indoctrinated and lack much knowledge of Biblical content. I note that some clergy who read the Bible thoroughly are turned off and become atheistic. But even with my poor knowledge of it, I know the Bible is wrong.
For someone who evaded the propaganda when young, there are many ridiculous things in the Bible. A human formed from clay? Any change in the constituent atoms of the clay would not yield the majority of the elements in a human body. Or a human (Lot’s wife) transmuting the other way, into another kind of matter. A person formed out of a human rib? A talking snake, which has no vocal cords? No wonder there are so many humourists and stand-up comics who can have so much fun with religion. The Bible is a big resource of things to ridicule.
The earth does not contain enough water to cover the whole earth. The story about Noah is full of holes concerning how big the ark was and how it could be made to hold and feed animals for a long time. Still, there is geological evidence that floods could have occurred to fill the Mediterranean and the Black Sea, so long ago that the truth could have become corrupted. The original Bible passed through so many translations and copyings before printing was invented that it cannot be trusted to be perfectly accurate. People with a good knowledge of the Bible have analysed it and found such a considerable number of contradictions that it can only be treated as ancient stories written by people who had very limited knowledge or understanding of what was real. In mathematics, a proposition can be proved false by the existence of a contradiction.
A man swallowed by a fish without suffocating? That is not possible. A person going away for 40 days without food and water could not survive. A human could not live for 900 years; maybe it was 900 months, as I am over 900 months of age myself. Then Jesus Christ (if he really existed) did some magic tricks which have since been done by modern magicians. I think the Bible is purely mythological.