Economic Computation and SFEcon Model 0
2016
69 pages
Abstract
Neoclassical theory holds that economic order and stability owe to a spontaneous tendency toward general optimality. Formal descriptions of this tendency are variously subsumed under the ‘economic calculation’ (or ‘socialist computation’ or ‘Vienna’) problem, which is presumed to be complicated beyond any possibility of solution. This paper and its related desktop prototypes are presented as counterexamples to the non-computability premise. Together they demonstrate a general, international I/O structure’s advance through all the chaotic physical states and disequilibrium prices leading to Pareto optimality. Stated in the familiar terminology of DSGE modeling, the SFEcon algorithm maximizes value subject to the constraint that mass be conserved. Conservation of mass is imposed by loci of technical indifference that are everywhere expressive of diminishing marginal utility. SFEcon models emulate the continuous building-up and working-off of physical assets via ordinary engineering dyna...
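The abstract's central claim — maximizing value subject to conservation of mass — can be illustrated in miniature. The sketch below is not SFEcon's actual algorithm (which the abstract only outlines); it is a hedged toy instance in which value is a weighted logarithmic utility, a functional form exhibiting diminishing marginal utility, and the Lagrangian first-order conditions w_i / x_i = λ give a closed-form allocation. The function name `allocate` and the weights are illustrative assumptions.

```python
# Hedged sketch, not SFEcon's algorithm: maximize a value function
# sum(w_i * log(x_i)) -- diminishing marginal utility -- subject to the
# mass-conservation constraint sum(x_i) == total_mass.  The Lagrangian
# first-order conditions w_i / x_i = lambda yield x_i = w_i * M / sum(w).
import math

def allocate(weights, total_mass):
    """Closed-form optimum of sum(w*log(x)) s.t. sum(x) == total_mass."""
    s = sum(weights)
    return [w / s * total_mass for w in weights]

x = allocate([1.0, 2.0, 3.0], total_mass=6.0)
print(x)         # mass is allocated in proportion to the weights
print(sum(x))    # 6.0 -- mass is conserved
```

Any strictly concave utility would serve; the logarithmic form is chosen only because its optimum is solvable by hand, making the conservation constraint easy to verify.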
Related papers
KSCCN, 2026
This exhaustive analysis presents a mathematical and topological critique of the canonical New Keynesian Dynamic Stochastic General Equilibrium (DSGE) model, arguing its structural insufficiency in the context of the early 2026 macroeconomic metamorphosis driven by agentic Artificial Intelligence (AI), Distributed Ledger Technology (DLT), and radical monetary policy shifts. The DSGE model is rejected because its linear, aggregate equations and reliance on assumptions like the "representative agent" and "rational expectations" systematically obscure the micro-level distortions, specifically the intertemporal misallocation of capital (malinvestment), precipitated by fiat credit expansion. To provide a superior framework, the paper formalizes Austrian economics using advanced mathematics: category theory models subjective action and unemployment as a categorical pullback; differential geometry models the economy as a non-Euclidean Riemannian manifold where central bank intervention injects geometric curvature, causing capital malinvestment via the geodesic deviation equation; and fractal geometry quantifies market volatility and the expansion of wealth inequality via the Cantillon effect. Empirical events, such as the synchronized cross-asset deleveraging during the "Warsh Shock" and the systemic risks introduced by algorithmic monocultures (the "Altruist Event"), validate the Austrian framework and demonstrate the curative nature of the ensuing market bust (modeled as Ricci flow). The paper concludes that sound money and non-interventionist policy are a strict mathematical imperative to maintain economic stability, preserve decentralized entrepreneurial discovery, and resist the centralized surveillance inherent in Central Bank Digital Currencies (CBDCs), which represent a technological reinforcement of the fatally flawed DSGE central planning apparatus.
RePEc: Research Papers in Economics, 2010
The ASSRU logo depicts a counting table (a woodcut, probably from Strasbourg). The spaces between the lines function as the wires on an abacus. The place value is marked at the end. Forthcoming in: The Elgar Companion to Recent Economic Methodology, edited by John Davis & Wade Hands, Edward Elgar Publishing, Cheltenham, Glos., & Northampton, MA (2011). We are greatly indebted to the Editors for the kind invitation to contribute and for the immense patience with which they tolerated the various ways in which we transcended generous deadlines. The title has metamorphosed into the ultra-simple final form it has taken, having begun its life as Computational Economics, become the Computational Paradigm in Economics, then Computational Economics, Computable General Equilibrium Theory & Computable Economics and, finally, Classical Behavioural Economics, Computable General Equilibrium Theory, Computable Economics and Agent-Based Computational Economics. Each of the transitional titles seemed, at least to the authors, to emphasize particular ways in which the notion of machine computation, and its underpinning theory, was implemented in a variety of economic theories. To avoid any such connotation it seemed best to choose as neutral a title as possible, without losing focus on the main theme, which is, of course, the foundations of the methodology of computing in economics. We are deeply indebted to our two graduate students, Selda Kao and V. Ragupathy, for invaluable logistical and intellectual help. Alas, they refuse to take any blame for the remaining infelicities.
2013
Catastrophe theory and deterministic chaos constitute basic elements of the science of complexity. Elementary catastrophes were the first form of nonlinear, topological complexity to be seriously studied in economics; deterministic chaos and other types of complexity succeeded catastrophe theory. In general, chaos means the seemingly random behavior of a deterministic system, which stems from high sensitivity to its initial conditions. Nonlinear dynamical systems theory, which unites various manifestations of complexity into one integrated system, runs contrary to the assumption that markets and economies spontaneously strive for a state of equilibrium. On the contrary, their complexity seems to grow under the influence of classic economic laws. In my paper, I indicate that, with time, model economic systems strive for a state we call "the edge of chaos". I consider two cases. The first case concerns an economy based on a two-stage accelerator, where the economic cycle ...
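The abstract's definition of chaos — seemingly random behavior of a deterministic system arising from sensitivity to initial conditions — can be demonstrated with the textbook logistic map x → r·x·(1−x) at r = 4, its fully chaotic regime. This is a standard illustration, not a model from the paper itself; the starting values and step count are arbitrary choices.

```python
# Sensitive dependence on initial conditions: two trajectories of the
# logistic map x -> r*x*(1-x) at r = 4, started a mere 1e-7 apart,
# diverge to order 1 within a few dozen iterations.
def logistic_traj(x, r=4.0, steps=40):
    traj = []
    for _ in range(steps):
        x = r * x * (1.0 - x)
        traj.append(x)
    return traj

a = logistic_traj(0.2)
b = logistic_traj(0.2000001)   # perturb the seventh decimal place
gaps = [abs(p - q) for p, q in zip(a, b)]
print(gaps[0])     # still tiny: the perturbation has barely grown
print(max(gaps))   # order 1: the trajectories have fully decorrelated
```

The perturbation roughly doubles each step (the map's Lyapunov exponent at r = 4 is ln 2), so a 1e-7 error saturates to the size of the attractor after about 25 iterations — the mechanism behind the "seemingly random" behavior the abstract describes.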
2015
Introduction Soon after the Nobel Prize was awarded to the economist Amartya Sen in 1998, many journal articles and other periodicals helped spread his fame as a ‘sociologist of poverty’ or ‘economist of poverty’; a leftist periodical insinuated that the prize was owed to two North Americans (R. C. Merton and M. S. Scholes), awarded a year earlier, whose studies used differential equations to model stock-market operations, yet whose own enterprise (and many others) went bankrupt when the crisis erupted in Asia some months later. That is, they developed something they themselves could not use; it lacked even private utility. In any case, economic analysis based on utility alone (one of the two aspects composing the commodity) is extremely pernicious. Celso Furtado, a Brazilian economist and contemporary of Amartya Sen in the post-graduate course at Cambridge, comments that Sen – On Ethics & Economics – “intends to fly very high bec...
2016
This paper critically examines the contributions of Cournot, Jevons and Walras as the founders of classical mathematical economics from a methodological standpoint. Advances in different economic schools and doctrines in the 19th century produced an environment of multi-dimensionality in economic analysis, which the pioneers of classical mathematical economics regarded as a chaotic state. We have demonstrated that the formation of this new discipline, known equivalently as pure or scientific economics, was a response to this so-called chaotic state. We have also shown that the erroneous logic of abstraction — in the sense of reducing a multi-dimensional economic system to a one-dimensional mechanical framework — as the methodological basis of classical mathematical economics has been the origin of serious shortcomings in the mathematical treatment of economics. Based on the writings of Jevons and Walras, we have provided evidence to support the claim that advances in Marxian economic...
2013
The role of mathematics in modern economics has been a topic of periodic dispute, which took on new life with accusations concerning the limitations of mathematical models after the global crash of 2008. This article adds a historical dimension by considering some key past debates over the use of mathematics, including important statements by Alfred Marshall and John Maynard Keynes. It is proposed that the complexities of economic systems and of human motivation do not themselves constitute arguments against the use of mathematics, but they should affect the kinds of mathematical approaches employed and the purposes to which they are oriented. Given this complexity, mathematics is less useful as a predictive tool and more useful for heuristic purposes. Economists should also pay attention to the guiding metaphors and analogies that shape the uses of particular kinds of mathematics.
International Journal of Unconventional Computing, 2012
2003
In human societies, diverse people act purposively with powerful but limited cognitive processes, interacting directly with one another through technologically facilitated and physically mediated social networks. Agent-based computational modeling takes these features of humanity (behavioral heterogeneity, bounded rationality, network interactions) at face value, using modern object-oriented programming techniques to create agent populations that have a high degree of verisimilitude with actual populations. This contrasts with mathematical social science, where fantastic assumptions render models so cartoon-like as to beg credibility: stipulations like identical agents (or a single 'representative' agent), omniscient agents (who accurately speculate about other agents), Nash equilibrium (macro-equilibrium arising from agent-level equilibrium), and even the denial of direct agent-agent interaction (as in general equilibrium theory, where individuals interact only with a metaphorical auctioneer). There is a close connection between agent computing in the positive social sciences and distributed computation in computer science, in which individual processors have heterogeneous information that they compute with and then communicate to other processors. Successful distributed computation yields coherent computation across processors. When such distributed computations are executed by distinct software objects instead of physical processors, we have distributed artificial intelligence. When the actions of each object can be interpreted as in its 'self-interest', we have multi-agent systems, an emerging sub-field of computer science. Viewing human society as a large-scale distributed system for the production of individual welfare leads naturally to agent computing. Indeed, it is argued that agents are the only way for social scientists to effectively harness exponential growth in computational capabilities.
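The object-oriented, agent-level approach the abstract describes can be sketched in a few lines. This is a generic toy (a random-exchange economy), not the author's model: the `Agent` class, the flat transfer rule, and the population size are all illustrative assumptions. Agents are heterogeneous in outcome, boundedly rational (no optimization, just a fixed rule), and interact pairwise rather than through any central auctioneer.

```python
# Hedged sketch of agent-based modeling: heterogeneous agents interact
# directly in pairs under a simple behavioral rule, with no central
# auctioneer and no equilibrium assumption.  Aggregate regularities
# (here, conservation of total wealth while its distribution spreads)
# emerge from the bottom up.
import random

random.seed(0)  # reproducible run

class Agent:
    def __init__(self, wealth):
        self.wealth = wealth

    def give_to(self, other):
        # Bounded rationality: transfer one unit if possible, no foresight.
        if self.wealth > 0:
            self.wealth -= 1
            other.wealth += 1

agents = [Agent(wealth=10) for _ in range(100)]
for _ in range(10_000):
    giver, taker = random.sample(agents, 2)  # random pairwise interaction
    giver.give_to(taker)

total = sum(a.wealth for a in agents)
print(total)   # 1000 -- wealth is conserved even as its distribution disperses
```

The point of the exercise is methodological: macro-level facts are computed by the agent population itself, mirroring the abstract's analogy between agent computing and distributed computation across processors.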
A new concept of equilibrium that attempts to supersede general equilibrium within a new, unified (or general) framework of economic theory.
Computations, Approximations and Simulations are the three pillars on which an epistemology of economic theory should be founded. To these must be added methodologically rigorous mathematical notions of constructivity (Brouwer), solvability (Turing) and decidability (Gödel), to make economics meaningfully applicable. It is not by chance that we never entered Cantor's Paradise, despite temptations to do so. Finally, from a philosophical point of view, we feel that anchorings in intuitionism (Kant), phenomenology (Husserl) and madhyamakism (Nagarjuna) would help return economics to the moral sciences fold, which it never should have left.
