Topic outline

  • About This Course

  • Assignments

    • Test Quiz
      Restricted: Available until 8 November 2023, 1:15 PM
  • Just to Warm Up

    Just to recall:

    Reference: https://www.aesanetwork.org/research-onion-a-systematic-approach-to-designing-research-methodology/

    Blog 132-Research Onion: A Systematic Approach to Designing Research Methodology.

    August 27, 2020


    More Science Philosophies that inform research and data analysis

    Major philosophies of science

    Rationalism

    Rationalism holds that knowledge derives from reason, modelled especially on the Euclidean geometric system, and is deduced from abstract innate ideas or prior knowledge (a priori), independent of sensory experience (O’Hear, 1989; Popper, 1974; The American Heritage Dictionary of the English Language, 2000). Rationalist applications in science include the use of equations or models to predict phenomena.

     Many theorists, such as Dede, Salzman, Loftin, & Ash (1997), Stratford (1997: 4), Sanders (2002), and Sanders & Khanyane (2002), have noted that models form the basis of teaching science, and I think that models can translate into mental schemata or abstractions if learners understand the elements from which a model is constructed. Modelling the world for learners in this way, and challenging them to extrapolate such models, is arguably characteristic of rationalism.

    One of the problems with rationalism is that the results of an experiment might not fit the prediction.


    Empiricism

    In contrast to rationalist innate knowledge, empiricism assumes that brains are blank (tabula rasa) until they are exposed to experiences (a posteriori) (Medawar, 1969: 27; The American Heritage Dictionary of the English Language, 2000). Empiricism recommends experiments and the testing of hypotheses (Popper, cited in Kuhn, 1974: 800), and the extrapolation of data into laws or principles of science (O’Hear, 1989), such as Ohm’s Law. Deriving, verifying, or proving empirical relationships in school experiments is assumed to be doing science.
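    Ohm's Law illustrates this kind of extrapolation well. As a minimal sketch (the current/voltage readings and the `fit_resistance` helper below are invented for illustration, not drawn from any of the sources cited above), measured data can be extrapolated into the empirical law V = IR by estimating R from the observations:

    ```python
    # Illustrative sketch: extracting an empirical law (Ohm's Law, V = I * R)
    # from measured data, in the spirit of empiricist extrapolation.
    # All readings below are hypothetical.

    def fit_resistance(currents, voltages):
        """Least-squares estimate of R in V = I * R (a line through the origin)."""
        numerator = sum(i * v for i, v in zip(currents, voltages))
        denominator = sum(i * i for i in currents)
        return numerator / denominator

    currents = [0.1, 0.2, 0.3, 0.4, 0.5]        # amperes (hypothetical readings)
    voltages = [0.52, 0.99, 1.51, 2.02, 2.48]   # volts (hypothetical readings)

    R = fit_resistance(currents, voltages)
    print(f"Estimated resistance: {R:.2f} ohms")
    ```

    With these hypothetical readings the estimate comes out close to 5 ohms; the point is that the "law" is extracted from observations, which is exactly what empiricism recommends.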

     The problem with empiricism is establishing the certainty of a phenomenon, or the accuracy of observations and measurements. Empiricists do not account for fallibility; that is, for how the senses lead us to error (Medawar, 1969: 31).


    Positivism

    A seeming extension of empiricism, positivism (or logical empiricism) relies on precise, certain, and objective measurements, and rejects subjectivity and human ideology, history, intervention, and intuition (Stockman, 1983: 30; Trochim, 2002). According to Comte (the founder of positivism), constructively valuable knowledge is based on facts, which arise from useful, certain, and precise data (Stockman, 1983: 30). Such precision is achievable if we assume that objects exist independently of any subject, such that the goal of science is simply to describe the objects through experience and/or observation of phenomena (Trochim, 2002). That is, knowledge that cannot be observed is not scientific. Positivism is therefore hostile towards religion and metaphysics (because these are immeasurable). Hence, positivists emphasise the externality of the laws of nature to the observer: knowledge is out there and is perceived the same way by every careful observer (O’Hear, 1989: 14-19). We achieve objectivity by continued observations, bearing in mind that these may yield wrong data (Scheffler, 1967: 1).


    Objectivism in positivism

    Objectivism assumes that consensus on findings and meanings can be reached through observations. It relies on data that cannot be doubted (Stockman, 1983: 30, 38), obtained through triangulation, verification, control, impartiality, sampling a reasonable proportion of a population, and statistical wizardry (Scheffler, 1967: 2; O’Hear, 1989: 6). Objectivism relies on spatial conceptual frameworks (Schlick, 1925: 530-531). Thus, the essential properties of objects are knowable and relatively unchanging, the world is real, and it can be modelled for the learner (hence the physical and mathematical models for science concepts).


    Other positivisms practiced

    Positivisms are not clearly demarcated from each other. I deal with those that seem to influence South African science classrooms.

     a. Logical positivism

    Logical positivism is attractive insofar as it dilutes the importance of objectivism. Logical positivism draws rationalism (logic, instinct, etc.) and empiricism together (Rosenblueth, 1970: 4; Medawar, 1969: 28), and earns science the description ‘hypothetico-deductive’ (Wellington, 1994: 24). Unlike radical positivists, logical positivists recognise human limitations: the scope and acquisition of knowledge is limited by scientists, rather than by physical reality, and by the previous knowledge and rules used to interpret experiences (Cobern, 1996: 302-303). Moreover, findings are never final and are always subject to question and doubt (Einstein, 1940: 253), such that one part of a theory is logical while the other is empirical (Spector, 1965: 44-45).

     b. Post-positivism

    There is an apparent similarity between logical positivism and post-positivism: both uphold hypotheses and deductions as important parts of science. However, post-positivists believe that human fallibility can be minimised through repeated experiments (Trochim, 2002); in other words, objectivity is increased by triangulation (O’Hear, 1989: 61-62; Feigl, 1949: 11-12). According to Trochim, one of the most common forms of post-positivism is critical realism, which takes observation as fallible and theory as revisable. A critical realist is critical of the means by which logical positivists obtain knowledge, and so scientists must persistently try to get to the truth. Thus, critical realists try to be objective (Bodner, 1986: 874). Even though that goal can never be achieved (Trochim, 2002), the truth or real world exists regardless of our perceptions (Bodner, 1986: 874). Feigl likens the survival of scientific theories through triangulation to the survival of a species in evolution. Thus, objectivity is not the characteristic of an individual; it is inherently a social phenomenon, or rather a social construction. Science advances by consensus and revision (Linnerman, Lynch, Kurup, & Bantwini, 2002: 205-210).


    Counterarguments against 'normal' philosophies of science

    Arguments against objectivism

    Arguments against objectivism are abundant. They include that no number of experiments or verifications can positively establish a reality, and that multiple observations of the same phenomenon might differ (however slightly), such that some elusive and subjective dimension of reality exists (Cobern, 1996: 302; Pajares, 1998; Geelan, 2000).

     Some theorists believe that objectivism survives through 'scientific' models or spatial frameworks and statistics, by which we attempt to model nature (Schlick, 1925: 530-531; Scheffler, 1967: 2). Thus, modelling is an important strategy in science teaching. Penner (as cited in Jonassen, Howland, Moore & Marra, 2003: 190) mentions two basic forms of models: physical models, which are visible or concrete, and conceptual models, which are not visible but are thoughts. In this thesis, a model is considered to be a construct that imitates the real concept or natural system and its interactions (Stratford, 1997: 3-4).

    Popper and daring hypotheses

    Popper observed that a verifiable statement (or law) has to be written in absolute terms (Stockman, 1983: 24; Scheffler, 1967: 5). Popper starts his argument by pointing out that an absolutely and irrevocably true statement forbids particular occurrences (Popper, 1974: 962-963); that is, it leaves no room for doubt.

    Popper sees no science in doing something that will surely happen (i.e., something that can be verified with certainty). Thus, one can argue that Popper does not support the verification or proving of science 'facts' or 'truths' as presented in classrooms. Verifying truths requires one to follow prescribed methods without question, since a modification of the methods can lead to another truth. In classrooms, prescribed methods are imposed by means of worksheets.

     Instead, Popper advises scientists to try to falsify those 'truths'. That is, Popper demarcates scientific knowledge from other forms of knowledge by the falsifiability of a hypothesis (Popper, 1974; O’Hear, 1989: 56; Pajares, 1998; Geelan, 2000). An attempt to falsify laws of science opens science to scrutiny and to new ways of doing science. One way of falsifying a theory is to state hypotheses that are daring in light of what is considered to be the truth (Popper, 1974: 978-984). Popper points out that great scientists stated refutable and falsifiable hypotheses (and many were rejected, e.g., Charles Darwin's). In other words, teachers should help learners look for data that falsifies theories rather than data that proves them. Alternatively, teachers should relax the rules they give to learners during practical work so that learners come up with their own truths and their own ways of establishing truths. In this way, Popper's recommendations would fit constructivist approaches.


    Kuhn and revolutions of paradigms

    I think that Popper and Kuhn were in agreement in criticising 'normal' science. Popper advised against clinging to truths or verifying laws, which implies following prescribed methods, while Kuhn believed that such prescriptions, presented in the form of paradigms, can never lead to new knowledge unless they are violently changed.

    However, Kuhn states that he disagreed with Popper in that Popper substituted verification with falsification (Kuhn, 1974: 799). Verification and falsification are complementary because, according to Kuhn (1974: 813), while verification confirms a theory, falsification serves to show the incompleteness of that theory. Hence, Kuhn suggests that advances in science arise from paradigm shifts rather than from falsifications.


    Different authors define a paradigm variously:

    … achievements that for a time provide model problems and solutions to a community of practitioners (Shepere, 1984: 37); … a collection of beliefs shared by scientists, or a set of agreements about how problems are to be solved (Ross, 1999); … a framework within which scientists do their day-to-day work (O’Hear, 1989: 65).

    All these definitions imply that scientists follow certain agreed-upon rules. That is, paradigms arise because scientists form their own social system (Gardner, 1975: xiv), which is governed by a set of rules (O’Hear, 1989: 65). Thus, objectivity is not the characteristic of an individual; it is inherently a social phenomenon, or rather a social construction. Normal science advances by consensus and revision (Linnerman et al., 2002: 205-210). The scientific community rejects unconventional methods of obtaining knowledge, and insists that findings build upon established truths. Note, for example, how learners who fail to get the 'correct' data are punished with failure and eventually expelled from science classrooms, instead of anyone finding out how they claim to know what they write on paper.

     However, extreme adherence to paradigms restricts falsification and pollutes minds and senses with preconceptions, because paradigms determine acceptable scientific techniques, data analysis, and explanations, and therefore determine the data itself (Medawar, 1969: 29). As with verifiable statements, paradigms tend to suppress fundamental novelties (Shepere, 1984: 38), because paradigms condition the theoretical world-view and observations (O’Hear, 1989: 69). Medawar (1969: 25) concludes that innocent, unbiased observation is a myth, and that the intellectual processes during investigations are themselves the grounds that justify knowledge. Suppression creates a false peace and order in the scientific community, and no new developments happen during such peaceful, rule-bound periods (Shepere, 1984: 37).

     For new, ground-breaking knowledge, the peaceful interludes are punctuated intellectually by scientific wars or revolutions (Ross, 1977). The revolutions challenge the orderliness imposed by paradigms, especially those that embrace objectivism and verification (Scheffler, 1967: 3). Kuhn described these wars as "the shattering complements to the tradition-bound activity of normal science" (Pajares, 1998). Thus, Kuhnian science does not develop cumulatively, because revolutions replace one conceptual worldview with another (Geelan, 2000).


    Intuition

    As mentioned above, positivism does not accept knowledge that cannot be proven by scientific methods. One dimension of science neglected as a result is intuition.

     Popper's and Kuhn's arguments arise from the claim that scientists tend to adhere to truths and paradigms. However, Ross (1977) observes that scientists obtain knowledge without strict adherence. This is possible because the boundaries of paradigms are hazy (Shepere, 1984: 40). Among the processes that do not fit in any paradigm is intuition. Creativity is often intuitive and subjective, but science classrooms and assessments do not encourage it.

     Intuition, together with adventure and imagination, is responsible for the development of ideas, new cognitive structures and concepts (Ross, 1977; Adey, 1987: 19; Wellington, 1994: 24), and accounts for achievements that cannot be explained logically or precisely (Ross, 1977; O’Hear, 1989: 10). Furthermore, in contrast to O’Hear's (1989: 10) criticism of Aristotle’s notion of intuition, Medawar (1969: 46 - 55) points out that imaginative or inspirational processes enter into all scientific reasoning at every level, such that inventors acknowledge the role of these processes. The development of the Watson-Crick model of nucleic acids is a good example of intuition (Wade, 2003). However, we teach the Watson-Crick model as if Watson & Crick developed it from logic and prescribed methods alone. Problems can arise when such intuitively developed concepts are taught purely objectively and logically.

    Medawar (1969: 55-57) mentions four forms of intuition that can have implications for analysing a computer game:

    • Deductive intuition: perceiving logical implications instantly, and seeing what follows from holding certain views

    • Inductive intuition: creativity or discovery, which is thinking up or hitting on a hypothesis

    • Instant apprehension of analogy: real or apparent structural similarity between two or more schemes of ideas (wit)

    • Experimental flair or insight



    Process of Data Analysis

    [Image: What is data analysis? Methods, techniques, types & how-to]

    [Image: 4 Types of Data Analytics Every Analyst Should Know: Descriptive, Diagnostic, Predictive, Prescriptive (Co-Learner, Co-Learning Lounge, Medium)]




    Data analysis is the process of investigating, cleaning, transforming, and modelling data with the aim of finding useful information, recommending conclusions, and supporting decision-making.
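    As a minimal sketch of those steps (the figures and the `clean` helper below are invented for illustration, not part of any prescribed method), investigating, cleaning, and transforming a small dataset might look like this:

    ```python
    # Hypothetical sketch of the cleaning and transforming steps described above:
    # drop missing (NA) values, remove outliers with a crude z-score rule,
    # then transform the remainder into a summary statistic.
    from statistics import mean, stdev

    def clean(values, z_cutoff=2.0):
        """Drop None/NA entries, then drop points more than z_cutoff
        standard deviations from the mean (a deliberately simple rule)."""
        present = [v for v in values if v is not None]
        m, s = mean(present), stdev(present)
        return [v for v in present if abs(v - m) <= z_cutoff * s]

    raw = [12.1, None, 11.8, 12.4, 250.0, 12.0, None, 11.9]  # 250.0 is an outlier

    cleaned = clean(raw)
    print(cleaned)                   # NAs and the outlier removed
    print(round(mean(cleaned), 2))   # transformed into a summary statistic
    ```

    Real analyses would use more robust outlier rules (e.g. interquartile range) and proper tooling, but the shape of the procedure - inspect, clean, transform, summarise - is the same.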

    Data analytics is the use of data, machine learning, statistical analysis, and computer-based models to gain better insight and make better decisions from data. Analytics is defined as “a process of transforming data into actions through analysis and insight in the context of organisational decision making and problem-solving.”

    Comparison of Data Analysis and Data Analytics

    Key Differences between Data Analysis and Data Analytics

    1. Data analytics is the broader, conventional form of analytics, used in many sectors (health, business, telecommunications, insurance) to make decisions from data and act on them. Data analysis is a specialised form of data analytics used in businesses and other domains to analyse data and draw useful insights from it.
    2. Data analytics consists of data collection and general inspection of the data, which may serve one or more purposes, whereas data analysis consists of defining a question, investigating, cleaning the data (removing NA values and any outliers), and transforming it to produce a meaningful outcome.
    3. To perform data analytics, one has to learn many tools in order to act on data: knowledge of R, Python, SAS, Tableau Public, Apache Spark, Excel, and many more. For data analysis, one needs hands-on experience with tools such as OpenRefine, KNIME, RapidMiner, Google Fusion Tables, Tableau Public, NodeXL, and Wolfram Alpha.
    4. The data analytics life cycle consists of Business Case Evaluation, Data Identification, Data Acquisition & Filtering, Data Extraction, Data Validation & Cleansing, Data Aggregation & Representation, Data Analysis, Data Visualization, and Utilization of Analysis Results. Since data analysis is a sub-component of data analytics, its life cycle sits within the analytics life cycle: it consists of gathering data, scrubbing it, analysing it, and interpreting it precisely, so that you can understand what your data are saying.
    5. Whenever we want to know what will happen next, we use data analytics, because data analytics helps to predict future values. In data analysis, by contrast, the analysis is performed on past datasets to understand what has happened so far. Both are necessary to understand data: one estimates future demand, while the other examines the past.
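    The contrast in point 5 can be sketched in a few lines of Python (the monthly figures are invented, and the straight-line trend is an assumed, deliberately simple model, not a recommended forecasting method):

    ```python
    # Hypothetical sketch: data analysis summarises past observations,
    # while data analytics extrapolates them to estimate the next value.
    from statistics import mean

    monthly_sales = [100, 108, 115, 121, 130]  # past data (invented)

    # Data analysis: describe what happened.
    print("average so far:", mean(monthly_sales))

    # Data analytics: fit a least-squares line through (month_index, sales),
    # then predict the next month.
    n = len(monthly_sales)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(monthly_sales)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, monthly_sales)) \
            / sum((x - x_bar) ** 2 for x in xs)
    intercept = y_bar - slope * x_bar
    print("predicted next month:", round(intercept + slope * n, 1))
    ```

    The first print looks backwards (analysis); the second projects forwards (analytics). Real predictive analytics would of course use richer models and validation.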


    Advanced Data Analysis (ADA) is a class in statistical methodology: its aim is to get students to understand something of the range of modern methods of data analysis, and of the considerations which go into choosing the right method for the job at hand (rather than distorting the problem to fit the methods you happen to know). 

    In this course, statistical theory is kept to a minimum, and largely introduced as needed. Since ADA is also a class in data analysis, there are a lot of assignments in which large, real data sets are analyzed with the new methods.

    Of course, there is no way to cover every important topic for data analysis in just a semester. Much of what is not here, such as sampling theory and survey methods, experimental design, advanced multivariate methods, hierarchical models, the intricacies of categorical data, graphics, data mining, and spatial and spatio-temporal statistics, should have been covered by other Bachelor's and Master's courses.

    Other important areas, like networks, inverse problems, advanced model selection or robust estimation, are included.

    Books (both are online and free)
  • Lecturer: Prof. Muwanga-Zake
    Mobile/ WhatsApp +256 788485749