Mathematics is considered the mother of all sciences because it provides the tools with which problems in every other science are solved. Subjects such as biology, chemistry, and physics all rest on its quantitative methods: the processes taking place inside our bodies and in the world around us are described and predicted mathematically.


FUNDAMENTALS OF MATHEMATICS

Saman Siadati

MATHEMATICS

The word mathematics comes from Ancient Greek μάθημα (máthēma), meaning "that which is learnt", "what one gets to know", hence also "study" and "science".

The word for "mathematics" came to have the narrower and more technical meaning "mathematical study" even in Classical times.

Its adjective is μαθηματικός (mathēmatikós), meaning "related to learning" or "studious", which likewise further came to mean "mathematical". In particular, μαθηματικὴ τέχνη (mathēmatikḗ tékhnē), Latin: ars mathematica, meant "the mathematical art".

FIELDS OF MATHEMATICS

Mathematics can, broadly speaking, be subdivided into the study of quantity,

structure, space, and change (i.e. arithmetic, algebra, geometry, and analysis).

In addition to these main concerns, there are also subdivisions dedicated to

exploring links from the heart of mathematics to other fields: to logic, to set theory

(foundations), to the empirical mathematics of the various sciences (applied

mathematics), and more recently to the rigorous study of uncertainty.

While some areas might seem unrelated, the Langlands program has found

connections between areas previously thought unconnected, such as Galois groups,

Riemann surfaces and number theory.

Discrete mathematics conventionally groups together the fields of mathematics which

study mathematical structures that are fundamentally discrete rather than continuous.

FOUNDATIONS

In order to clarify the foundations of mathematics, the fields of mathematical logic

and set theory were developed.

Mathematical logic includes the mathematical study of logic and the applications of

formal logic to other areas of mathematics; set theory is the branch of mathematics

that studies sets or collections of objects.

QUANTITY

The study of quantity starts with numbers, first the familiar natural numbers and

integers ("whole numbers") and arithmetical operations on them, which are

characterized in arithmetic.

The deeper properties of integers are studied in number theory, from which come

such popular results as Fermat's Last Theorem. The twin prime conjecture and

Goldbach's conjecture are two unsolved problems in number theory.
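Both conjectures are easy to state and to probe numerically. As an illustrative sketch (a check of small cases, not a proof of anything), a few lines of Python can verify Goldbach's conjecture for small even numbers:

```python
def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_pair(n):
    """Return one pair of primes summing to the even number n, or None."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# Every even number from 4 to 100 has at least one prime decomposition.
assert all(goldbach_pair(n) is not None for n in range(4, 101, 2))
```

No counterexample has ever been found, but the general statement remains unproved.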

STRUCTURE

Number theory studies properties of the set of integers that can be expressed in terms of

arithmetic operations. Moreover, it frequently happens that different such structured sets (or

structures) exhibit similar properties, which makes it possible, by a further step of abstraction,

to state axioms for a class of structures, and then study at once the whole class of structures

satisfying these axioms.

Thus one can study groups, rings, fields and other abstract systems; together such studies (for

structures defined by algebraic operations) constitute the domain of abstract algebra.
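As a concrete illustration of "stating axioms for a class of structures", the group axioms can be checked mechanically on one small structure, the integers modulo 5 under addition (a minimal Python sketch):

```python
# Integers modulo 5 under addition: checking the group axioms
# (closure, associativity, identity, inverses) on one concrete structure.
n = 5
elements = range(n)
add = lambda a, b: (a + b) % n

assert all(add(a, b) in elements for a in elements for b in elements)   # closure
assert all(add(add(a, b), c) == add(a, add(b, c))
           for a in elements for b in elements for c in elements)       # associativity
assert all(add(a, 0) == a for a in elements)                            # 0 is the identity
assert all(any(add(a, b) == 0 for b in elements) for a in elements)     # every element has an inverse
```

Abstract algebra then studies, at once, every structure satisfying these axioms rather than this single example.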

SPACE

The study of space originates with geometry, in particular Euclidean geometry, which combines space and

numbers, and encompasses the well-known Pythagorean theorem. Trigonometry is the branch of mathematics

that deals with relationships between the sides and the angles of triangles and with the trigonometric functions.
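The Pythagorean theorem and the basic trigonometric functions can be checked on a concrete triangle; the short Python sketch below uses the familiar 3-4-5 right triangle:

```python
import math

# Pythagorean theorem for a 3-4-5 right triangle, and the identity
# sin^2 + cos^2 = 1 that encodes it on the unit circle.
a, b = 3.0, 4.0
c = math.hypot(a, b)       # sqrt(a^2 + b^2)
assert c == 5.0

theta = math.atan2(b, a)   # angle opposite side b
assert abs(math.sin(theta) ** 2 + math.cos(theta) ** 2 - 1.0) < 1e-12
assert abs(math.sin(theta) - b / c) < 1e-12   # sine = opposite / hypotenuse
```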

The modern study of space generalizes these ideas to include higher-dimensional geometry, non-Euclidean

geometries (which play a central role in general relativity) and topology.

Quantity and space both play a role in analytic geometry, differential geometry, and algebraic geometry.

Convex and discrete geometry were developed to solve problems in number theory and functional analysis but

now are pursued with an eye on applications in optimization and computer science.

CHANGE

Understanding and describing change is a common theme in the natural sciences, and calculus

was developed as a tool to investigate it. Functions arise here, as a central concept describing a

changing quantity. The rigorous study of real numbers and functions of a real variable is known as

real analysis, with complex analysis the equivalent field for the complex numbers.

Functional analysis focuses attention on (typically infinite-dimensional) spaces of functions. One of

many applications of functional analysis is quantum mechanics. Many problems lead naturally to

relationships between a quantity and its rate of change, and these are studied as differential

equations. Many phenomena in nature can be described by dynamical systems; chaos theory makes

precise the ways in which many of these systems exhibit unpredictable yet still deterministic

behavior.
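A minimal sketch of this deterministic-yet-unpredictable behavior is the logistic map (a standard textbook illustration, not a system discussed above): the rule is a one-line formula, yet two orbits started a hair apart soon separate completely.

```python
def logistic_step(x, r=4.0):
    """One step of the logistic map x -> r*x*(1-x), chaotic at r = 4."""
    return r * x * (1 - x)

def orbit(x0, steps=40):
    """Iterate the map, recording the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic_step(xs[-1]))
    return xs

# Two deterministic orbits starting 1e-7 apart...
a, b = orbit(0.2), orbit(0.2 + 1e-7)
gaps = [abs(x - y) for x, y in zip(a, b)]

# ...begin indistinguishable, yet diverge to order one within a few dozen steps.
assert gaps[0] < 1e-6
assert max(gaps) > 0.5
```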

APPLIED MATHEMATICS

Applied mathematics concerns itself with mathematical methods that are typically used in science,

engineering, business, and industry. Thus, "applied mathematics" is a mathematical science with

specialized knowledge.

The term applied mathematics also describes the professional specialty in which mathematicians

work on practical problems; as a profession focused on practical problems, applied mathematics

focuses on the "formulation, study, and use of mathematical models" in science, engineering, and

other areas of mathematical practice.

CALCULUS PIONEERS: ALHAZEN

Alhazen (the Latinized form of Ibn al-Haytham) (c. 965 - c. 1040 CE) was one of the greatest mathematicians and physicists of his era. He derived a formula for the sum of fourth powers. He used the result to carry out what would now be called an integration, where the formulae for the sums of integral squares and fourth powers allowed him to calculate the volume of a paraboloid.
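The closed form Alhazen derived, 1^4 + 2^4 + ... + n^4 = n(n+1)(2n+1)(3n^2 + 3n - 1)/30, can be checked against the direct sum in a few lines of Python:

```python
def sum_fourth_powers(n):
    """Closed form for 1^4 + 2^4 + ... + n^4 (the identity Alhazen derived)."""
    return n * (n + 1) * (2 * n + 1) * (3 * n * n + 3 * n - 1) // 30

# Verify the closed form against the direct sum for many values of n.
for n in range(1, 50):
    assert sum_fourth_powers(n) == sum(i ** 4 for i in range(1, n + 1))
```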

He was a skilled and innovative master of arithmetic, geometry, trigonometry, algebra, and more. His description of the principles of the dark room (camera obscura) and his invention of the magnifying glass are among his outstanding works, and they eventually led to the creation of the camera.

According to some researchers, Alhazen was the first scientist in the world to calculate the speed of sound. He also estimated the speed of light and the circumference of the earth in terms of the standard measure of length of his time. He studied the properties of light 700 years before Newton, and he is known as the first scientist to use experiment-based methods in his work.

Alhazen was born in Basra, a region that was then part of the Buyid emirate of Greater Iran and is now in Iraq.

NEWTON AND LEIBNIZ

Modern calculus was developed in 17th-century Europe by Isaac

Newton and Gottfried Wilhelm Leibniz (independently of each other, first publishing

around the same time) but elements of it appeared in ancient Greece, then in China

and in Persia (the old name of Iran), and still later again in medieval Europe and in

India.

LIMITS AND INFINITESIMALS

Calculus is usually developed by working with very small quantities. Historically, the

first method of doing so was by infinitesimals.

These are objects which can be treated like real numbers but which are, in some

sense, "infinitely small".

For example, an infinitesimal number could be greater than 0, but less than any

number in the sequence 1, 1/2 , 1/3, ... and thus less than any positive real number.

From this point of view, calculus is a collection of techniques for manipulating

infinitesimals. The symbols dx and dy were taken to be infinitesimal, and the

derivative dy /dx was simply their ratio.

LIMITS AND INFINITESIMALS


The infinitesimal approach fell out of favor in the 19th century because it was

difficult to make the notion of an infinitesimal precise.

However, the concept was revived in the 20th century with the introduction of non-

standard analysis and smooth infinitesimal analysis, which provided solid foundations

for the manipulation of infinitesimals.

In the late 19th century, infinitesimals were replaced within academia by the

epsilon-delta approach to limits.

LIMITS AND INFINITESIMALS

Limits describe the value of a function at a certain input in terms of its values at

nearby inputs.

They capture small-scale behavior in the context of the real number system.

In this treatment, calculus is a collection of techniques for manipulating certain limits.

Infinitesimals get replaced by very small numbers, and the infinitely small behavior

of the function is found by taking the limiting behavior for smaller and smaller

numbers.

Limits were thought to provide a more rigorous foundation for calculus, and for this

reason they became the standard approach during the twentieth century.
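The "smaller and smaller numbers" idea can be made concrete: evaluating sin(h)/h for shrinking h shows the values settling toward the limit 1 (a standard first example, used here only as a sketch):

```python
import math

# Approach the limit of sin(h)/h as h -> 0 by taking smaller and smaller
# inputs, mirroring how limits replace infinitesimals.
values = [math.sin(h) / h for h in (0.1, 0.01, 0.001, 0.0001)]

# Each value is closer to the limit 1 than the last.
errors = [abs(v - 1.0) for v in values]
assert errors == sorted(errors, reverse=True)
assert errors[-1] < 1e-8
```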

DIFFERENTIAL CALCULUS

Differential calculus is the study of the definition, properties, and applications of

the derivative of a function. The process of finding the derivative is called

differentiation.

Given a function and a point in the domain, the derivative at that point is a way of

encoding the small-scale behavior of the function near that point.

By finding the derivative of a function at every point in its domain, it is possible to

produce a new function, called the derivative function or just the derivative of the

original function.

In formal terms, the derivative is a linear operator which takes a function as its

input and produces a second function as its output.

DIFFERENTIAL CALCULUS

This is more abstract than many of the processes studied in elementary

algebra, where functions usually input a number and output another number.

For example, if the doubling function is given the input three, then it outputs

six, and if the squaring function is given the input three, then it outputs nine.

The derivative, however, can take the squaring function as an input.

This means that the derivative takes all the information of the squaring

function such as that two is sent to four, three is sent to nine, four is sent to

sixteen, and so on, and uses this information to produce another function.

The function produced by deriving the squaring function turns out to be the

doubling function.

DIFFERENTIAL CALCULUS

In more explicit terms the "doubling function" may be denoted by g(x) = 2x and the "squaring function" by f(x) = x².

The "derivative" now takes the function f(x), defined by the expression "x²", as an input, that is all the information, such as that two is sent to four, three is sent to nine, four is sent to sixteen, and so on, and uses this information to output another function, the function g(x) = 2x, as will turn out.

The most common symbol for a derivative is an apostrophe-like mark called prime. Thus, the derivative of a function called f is denoted by f′, pronounced "f prime". For instance, if f(x) = x² is the squaring function, then f′(x) = 2x is its derivative (the doubling function g from above). This notation is known as Lagrange's notation.

If the input of the function represents time, then the derivative represents change with

respect to time. For example, if f is a function that takes a time as input and gives the

position of a ball at that time as output, then the derivative of f is how the position is

changing in time, that is, it is the velocity of the ball.
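A numerical sketch of the same idea: a central-difference quotient applied to the squaring function recovers the doubling function at every sampled point (the step size h is an arbitrary small choice):

```python
def derivative(f, x, h=1e-6):
    """Central-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

square = lambda x: x * x

# The derivative "takes the squaring function as input" and behaves
# like the doubling function g(x) = 2x at every sampled point.
for x in [2.0, 3.0, 4.0]:
    assert abs(derivative(square, x) - 2 * x) < 1e-6
```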

MATHEMATICAL STATISTICS

Mathematical statistics is a key subset of the discipline of statistics. Statistical

theorists study and improve statistical procedures with mathematics, and statistical

research often raises mathematical questions. Statistical theory relies on probability

and decision theory.

Mathematicians and statisticians like Gauss, Laplace, and C. S. Peirce used decision

theory with probability distributions and loss functions (or utility functions).

The decision-theoretic approach to statistical inference was reinvigorated by

Abraham Wald and his successors, and makes extensive use of scientific computing,

analysis, and optimization; for the design of experiments, statisticians use algebra

and combinatorics.

MATHEMATICAL OPTIMIZATION

Mathematical optimization (alternatively spelt optimisation) or mathematical

programming is the selection of a best element (with regard to some criterion) from

some set of available alternatives.[1]

Optimization problems of sorts arise in all quantitative disciplines from computer

science and engineering to operations research and economics, and the development

of solution methods has been of interest in mathematics for centuries.[2]

In the simplest case, an optimization problem consists of maximizing or minimizing a

real function by systematically choosing input values from within an allowed set and

computing the value of the function. The generalization of optimization theory and

techniques to other formulations constitutes a large area of applied mathematics.

More generally, optimization includes finding "best available" values of some

objective function given a defined domain (or input), including a variety of different

types of objective functions and different types of domains.
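In the simplest case described above one can literally "systematically choose input values from within an allowed set": a naive grid search over an interval, shown here as a sketch rather than a practical method:

```python
def grid_minimize(f, lo, hi, steps=100000):
    """Minimize f on [lo, hi] by systematically evaluating a grid of inputs."""
    best_x, best_val = lo, f(lo)
    for i in range(1, steps + 1):
        x = lo + (hi - lo) * i / steps
        v = f(x)
        if v < best_val:
            best_x, best_val = x, v
    return best_x, best_val

# The minimum of (x - 2)^2 + 1 on [0, 5] is at x = 2, with value 1.
x, v = grid_minimize(lambda x: (x - 2) ** 2 + 1, 0.0, 5.0)
assert abs(x - 2.0) < 1e-3 and abs(v - 1.0) < 1e-6
```

Real solvers exploit structure (gradients, convexity) instead of brute force, which is what makes the general theory a large area of applied mathematics.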

NUMERICAL ANALYSIS

The overall goal of numerical analysis is the design and analysis of techniques to

give approximate but accurate solutions to hard problems, the variety of which is

suggested by the following:

• Advanced numerical methods are essential in making numerical weather prediction feasible.

• Computing the trajectory of a spacecraft requires the accurate numerical solution of a system of ordinary differential equations.

• Car companies can improve the crash safety of their vehicles by using computer simulations of car crashes. Such simulations essentially consist of solving partial differential equations numerically.

• Hedge funds (private investment funds) use tools from all fields of numerical analysis to attempt to calculate the value of stocks and derivatives more precisely than other market participants.

• Airlines use sophisticated optimization algorithms to decide ticket prices, airplane and crew assignments and fuel needs. Historically, such algorithms were developed within the overlapping field of operations research.

• Insurance companies use numerical programs for actuarial analysis.
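As a minimal sketch of solving an ordinary differential equation numerically, the explicit Euler method steps a solution forward in time; the test equation y′ = −y is an illustrative choice, not one from the list above:

```python
import math

def euler(f, y0, t0, t1, n):
    """Explicit Euler method for y' = f(t, y): the simplest numerical ODE solver."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)   # follow the tangent line for one small step
        t += h
    return y

# y' = -y with y(0) = 1 has the exact solution e^{-t};
# with many small steps, Euler's answer lands close to it.
approx = euler(lambda t, y: -y, 1.0, 0.0, 1.0, 10000)
assert abs(approx - math.exp(-1)) < 1e-4
```

Production codes use higher-order and adaptive methods, but the approximate-then-refine pattern is the same.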

GAME THEORY

Game theory is the study of mathematical models of strategic interaction among

rational decision-makers. It has applications in all fields of social science, as well as in

logic, systems science and computer science.

Originally, it addressed zero-sum games, in which each participant's gains or losses

are exactly balanced by those of the other participants. Today, game theory applies

to a wide range of behavioral relations, and is now an umbrella term for the science

of logical decision making in humans, animals, and computers.

Modern game theory began with the idea of mixed-strategy equilibria in two-

person zero-sum games and its proof by John von Neumann. Von Neumann's original

proof used the Brouwer fixed-point theorem on continuous mappings into compact

convex sets, which became a standard method in game theory and mathematical

economics.
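For a two-person zero-sum game such as matching pennies, the mixed-strategy value can be found by a simple scan over the row player's mixing probability (a brute-force sketch, not von Neumann's fixed-point argument):

```python
def row_value(A, p):
    """Worst-case expected payoff to the row player who mixes with
    probability p, assuming the column player best-responds."""
    col0 = p * A[0][0] + (1 - p) * A[1][0]
    col1 = p * A[0][1] + (1 - p) * A[1][1]
    return min(col0, col1)

# Matching pennies: row wins 1 on a match, loses 1 on a mismatch.
A = [[1, -1], [-1, 1]]

# Scan mixing probabilities; the maximin is attained at p = 1/2, value 0.
best_p = max((i / 1000 for i in range(1001)), key=lambda p: row_value(A, p))
assert abs(best_p - 0.5) < 1e-9
assert abs(row_value(A, best_p)) < 1e-9
```

Any pure strategy here is exploitable (its worst-case payoff is −1); only the even mix guarantees the game's value of zero.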

PROBABILITY THEORY

Probability theory is the branch of mathematics concerned with probability. Although there

are several different probability interpretations, probability theory treats the concept in a

rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms

formalise probability in terms of a probability space, which assigns a measure taking values

between 0 and 1, termed the probability measure, to a set of outcomes called the sample

space. Any specified subset of these outcomes is called an event.

Central subjects in probability theory include discrete and continuous random variables,

probability distributions, and stochastic processes, which provide mathematical abstractions of

non-deterministic or uncertain processes or measured quantities that may either be single

occurrences or evolve over time in a random fashion.

Although it is not possible to perfectly predict random events, much can be said about their

behavior. Two major results in probability theory describing such behaviour are the law of

large numbers and the central limit theorem.
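The law of large numbers can be watched in action by simulation; this Python sketch (with an arbitrary seed chosen for reproducibility) averages fair coin flips:

```python
import random

random.seed(0)  # arbitrary seed, for a reproducible run

# Law of large numbers: the sample mean of fair coin flips
# approaches the expected value 1/2 as the sample grows.
n = 100000
flips = [random.random() < 0.5 for _ in range(n)]
mean = sum(flips) / n
assert abs(mean - 0.5) < 0.01
```

No single flip is predictable, yet the aggregate behaves very regularly, which is exactly what the theorem makes precise.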

MATHEMATICAL FINANCE

Mathematical finance, also known as quantitative finance and financial mathematics, is a field of

applied mathematics, concerned with mathematical modeling of financial markets. Generally,

mathematical finance will derive and extend the mathematical or numerical models without

necessarily establishing a link to financial theory, taking observed market prices as input.

Mathematical consistency is required, not compatibility with economic theory. Thus, for example,

while a financial economist might study the structural reasons why a company may have a certain

share price, a financial mathematician may take the share price as a given, and attempt to use

stochastic calculus to obtain the corresponding value of derivatives of the stock.

The fundamental theorem of arbitrage-free pricing is one of the key theorems in mathematical

finance, while the Black-Scholes equation and formula are amongst the key results.

Mathematical finance also overlaps heavily with the fields of computational finance and financial

engineering. The latter focuses on applications and modeling, often by help of stochastic asset

models, while the former focuses, in addition to analysis, on building tools of implementation for the

models. In general, there exist two separate branches of finance that require advanced

quantitative techniques: derivatives pricing on the one hand, and risk- and portfolio management

on the other.
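The Black-Scholes formula for a European call is a standard closed form; the sketch below implements it in Python with illustrative parameter values, using the error function for the normal CDF:

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call: spot S, strike K,
    risk-free rate r, volatility sigma, time to maturity T (years)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# An at-the-money call one year out, with illustrative inputs.
price = black_scholes_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0)
assert 10.0 < price < 11.0
```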

MATHEMATICAL AND THEORETICAL BIOLOGY

Mathematical and theoretical biology is a branch of biology which employs theoretical

analysis, mathematical models and abstractions of the living organisms to investigate the

principles that govern the structure, development and behavior of the systems, as opposed to

experimental biology which deals with the conduction of experiments to prove and validate

the scientific theories. The field is sometimes called mathematical biology or biomathematics to

stress the mathematical side, or theoretical biology to stress the biological side.

Theoretical biology focuses more on the development of theoretical principles for biology

while mathematical biology focuses on the use of mathematical tools to study biological

systems, even though the two terms are sometimes interchanged.

Mathematical biology aims at the mathematical representation and modeling of biological

processes, using techniques and tools of applied mathematics and it can be useful in both

theoretical and practical research. Describing systems in a quantitative manner means their

behavior can be better simulated, and hence properties can be predicted that might not be

evident to the experimenter. This requires precise mathematical models.
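A precise mathematical model of the kind described can be very small. A discrete logistic growth model (with illustrative parameter values) predicts a population levelling off at its carrying capacity:

```python
def logistic_growth(p0, r, capacity, steps):
    """Discrete logistic model: growth slows as the population nears capacity."""
    p = p0
    history = [p]
    for _ in range(steps):
        p = p + r * p * (1 - p / capacity)
        history.append(p)
    return history

# A small population grows toward, and levels off near, the carrying capacity.
traj = logistic_growth(p0=10.0, r=0.5, capacity=1000.0, steps=50)
assert traj[-1] <= 1000.0          # never overshoots the capacity
assert abs(traj[-1] - 1000.0) < 1.0  # settles just below it
```

Simulating the model predicts the S-shaped growth curve before any experiment is run, which is the practical payoff described above.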

MATHEMATICAL ECONOMICS

Mathematical economics is the application of mathematical methods to represent

theories and analyze problems in economics. By convention, these applied methods

are beyond simple geometry, such as differential and integral calculus, difference

and differential equations, matrix algebra, mathematical programming, and other

computational methods. Proponents of this approach claim that it allows the

formulation of theoretical relationships with rigor, generality, and simplicity.

Mathematics allows economists to form meaningful, testable propositions about

wide-ranging and complex subjects which could less easily be expressed informally.

Further, the language of mathematics allows economists to make specific, positive

claims about controversial or contentious subjects that would be impossible without

mathematics. Much of economic theory is currently presented in terms of mathematical

economic models, a set of stylized and simplified mathematical relationships asserted

to clarify assumptions and implications.

DISCRETE MATHEMATICS

The history of discrete mathematics has involved a number of challenging problems

which have focused attention within areas of the field. In graph theory, much research

was motivated by attempts to prove the four color theorem, first stated in 1852, but

not proved until 1976 (by Kenneth Appel and Wolfgang Haken, using substantial

computer assistance).

In logic, the second problem on David Hilbert's list of open problems presented in

1900 was to prove that the axioms of arithmetic are consistent. Gödel's second

incompleteness theorem, proved in 1931, showed that this was not possible, at least

not within arithmetic itself. Hilbert's tenth problem was to determine whether a given

polynomial Diophantine equation with integer coefficients has an integer solution. In

1970, Yuri Matiyasevich proved that this could not be done.

DISCRETE MATHEMATICS

The need to break German codes in World War II led to advances in cryptography and

theoretical computer science, with the first programmable digital electronic computer being

developed at England's Bletchley Park with the guidance of Alan Turing and his seminal work,

On Computable Numbers.

At the same time, military requirements motivated advances in operations research. The Cold

War meant that cryptography remained important, with fundamental advances such as public-

key cryptography being developed in the following decades.

Operations research remained important as a tool in business and project management, with

the critical path method being developed in the 1950s. The telecommunication industry has

also motivated advances in discrete mathematics, particularly in graph theory and information

theory. Formal verification of statements in logic has been necessary for software development

of safety-critical systems, and advances in automated theorem proving have been driven by

this need.

THEORETICAL COMPUTER SCIENCE

Theoretical computer science includes areas of discrete mathematics relevant to

computing. It draws heavily on graph theory and mathematical logic. Included within

theoretical computer science is the study of algorithms and data structures.

Computability studies what can be computed in principle, and has close ties to logic,

while complexity studies the time, space, and other resources taken by computations.

Automata theory and formal language theory are closely related to computability.

Petri nets and process algebras are used to model computer systems, and methods

from discrete mathematics are used in analyzing VLSI electronic circuits.

Computational geometry applies algorithms to geometrical problems, while computer

image analysis applies them to representations of images. Theoretical computer

science also includes the study of various continuous computational topics.
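A minimal instance of automata and formal-language theory is a deterministic finite automaton; the two-state machine below (a textbook example, not one named above) accepts exactly the binary strings containing an even number of 1s:

```python
def accepts_even_ones(s):
    """DFA with two states: 0 = even number of 1s seen so far, 1 = odd.
    Accept iff the machine ends in state 0."""
    state = 0
    for ch in s:
        if ch == "1":
            state = 1 - state   # each 1 flips the parity state
    return state == 0

assert accepts_even_ones("1100")      # two 1s: accepted
assert not accepts_even_ones("111")   # three 1s: rejected
```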

GAME THEORY, DECISION THEORY, UTILITY THEORY, SOCIAL CHOICE THEORY

Decision theory is concerned with identifying the values, uncertainties and other

issues relevant in a given decision, its rationality, and the resulting optimal decision.

Utility theory is about measures of the relative economic satisfaction from, or

desirability of, consumption of various goods and services.

Social choice theory is about voting. A more puzzle-based approach to voting is

ballot theory.

Game theory deals with situations where success depends on the choices of others,

which makes choosing the best course of action more complex. There are even

continuous games, see differential game. Topics include auction theory and fair

division.

DISCRETE ANALOGUES OF CONTINUOUS MATHEMATICS

There are many concepts in continuous mathematics which have discrete versions, such as

discrete calculus, discrete probability distributions, discrete Fourier transforms, discrete

geometry, discrete logarithms, discrete differential geometry, discrete exterior calculus,

discrete Morse theory, difference equations, discrete dynamical systems, and discrete vector

measures.

In applied mathematics, discrete modelling is the discrete analogue of continuous modelling.

In discrete modelling, discrete formulae are fit to data. A common method in this form of

modelling is to use recurrence relations.
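A recurrence relation in this sense is just a rule computing each term from earlier terms; the Fibonacci recurrence F(k) = F(k-1) + F(k-2) is the classic sketch:

```python
def fibonacci(n):
    """Iterate the recurrence F(k) = F(k-1) + F(k-2) with F(0)=0, F(1)=1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# The first few values produced by the recurrence.
assert [fibonacci(k) for k in range(8)] == [0, 1, 1, 2, 3, 5, 8, 13]
```

Fitting such a formula to observed data is what discrete modelling means in practice.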

In algebraic geometry, the concept of a curve can be extended to discrete geometries by

taking the spectra of polynomial rings over finite fields to be models of the affine spaces over

that field, and letting subvarieties or spectra of other rings provide the curves that lie in that

space. Although the space in which the curves appear has a finite number of points, the curves

are not so much sets of points as analogues of curves in continuous settings.

REFERENCES

Dun, Liu; Fan, Dainian; Cohen, Robert Sonné (1966). A Comparison of Archimedes' and Liu Hui's Studies of Circles. Chinese Studies in the History and Philosophy of Science and Technology. 130. Springer. ISBN 978-0-7923-3463-7, pp. 279ff.

André Weil: Number Theory: An Approach through History from Hammurapi to Legendre. Boston: Birkhäuser Boston, 1984, ISBN 0-8176-4565-9, p. 28.

Wilson, Robin (2002). Four Colors Suffice. London: Penguin Books. ISBN 978-0-691-11533-7.

Hodges, Andrew (1992). Alan Turing: The Enigma. Random House.

Du, D. Z.; Pardalos, P. M.; Wu, W. (2008). "History of Optimization". In Floudas, C.; Pardalos, P. (eds.). Encyclopedia of Optimization. Boston: Springer. pp. 1538-1542.

Myerson, Roger B. (1991). Game Theory: Analysis of Conflict. Harvard University Press, p. 1.

Johnson, Tim (September 2009). "What is financial mathematics?". +Plus Magazine.

Chiang, Alpha C.; Kevin Wainwright (2005). Fundamental Methods of Mathematical Economics. McGraw-Hill Irwin. pp. 3-4. ISBN 978-0-07-010910-0.

REFERENCES

Debreu, Gérard ([1987] 2008). "Mathematical economics", section II, The New Palgrave Dictionary of Economics, 2nd Edition. Republished with revisions from 1986, "Theoretic Models: Mathematical Form and Economic Content", Econometrica, 54 (6), pp. 1259-1270.

Varian, Hal (1997). "What Use Is Economic Theory?" in A. D'Autume and J. Cartelier, eds., Is Economics Becoming a Hard Science?, Edward Elgar.
