Research design and philosophy

Punch (1998: 66) defines ‘research design’ as situating the researcher in the empirical world and connecting the research questions to data. Undertaking a social evaluation of such magnitude requires thorough consideration of the available research paradigms, as well as of ontology and epistemology, which frame our understanding of the social world and shape perceptions of and assumptions about reality, thereby affecting the way this social research is undertaken. Moreover, precisely these aspects of truth perception are crucial for understanding and minimising the researcher’s biases and for keeping the chosen approaches congruent. James & Vinnicombe (2002) point out that everybody has inherent preferences which are likely to shape their research designs. At the same time, Blaikie (2000) characterises this perception of reality as a series of choices made by the researcher which should be connected to the research question; inconsistency with this requirement might lead to the use of an inappropriate research methodology and thus to a lack of coherence.

Additional exploration of the above elements provides new perspectives on research design and philosophy. Whilst Blaikie (1993) argues that they introduce a component of free will which adds intricacies beyond those found in the natural sciences, Hatch & Cunliffe (2006: 51) emphasise their ability to ‘encourage researchers to study phenomena in different ways’. Denzin & Lincoln (2003) and Kvale (1996) go even further, using these aspects to explain the considerable tension amongst academics.

The present research therefore considers perceptions of reality and truth, as well as the beliefs and assumptions surrounding them, in more detail, aiming to explain their nature and to choose an appropriate methodological approach. By demonstrating this awareness and understanding, the study creates a solid foundation for the full research design.

Ontology

Ontology refers to ‘the science or study of being’, encompassing ‘claims about what exists, what it looks like, what units make it up and how these units interact with each other’ (Blaikie 1993: 3). In simple terms, ontology describes one’s worldview and assumptions about the nature of reality, which may be treated as objective or subjective. Hatch & Cunliffe (2006) illustrate the basic idea with two examples: a workplace report (an everyday example) and an example from social science. While the first may describe either what is actually happening or an employee’s opinion about what is happening, the second explores abstract phenomena such as culture, power or control and discusses the degree to which these are perceived subjectively or objectively.

The above ontological assumptions inevitably influence the researcher’s viewpoint and perception of reality. Therefore, for any type of research it is important to identify and consider these assumptions in order to remain objective, not to take any phenomenon for granted, and to evaluate and discuss phenomena critically where necessary.

However, ontology raises a further set of important questions. How is reality measured? What constitutes knowledge of reality? How does one know what reality is? The answers to these questions are provided by epistemology.

Epistemology

Epistemology accompanies ontology in its attempt to define reality. Easterby-Smith, Thorpe & Jackson (2008) assert that epistemology considers the most appropriate methods of enquiring into our natural world, while Eriksson & Kovalainen (2008: 37) hold that it answers the question ‘what is knowledge and what are the sources and limits of knowledge’, discussing how it defines the ways of producing and arguing for knowledge. In effect, epistemological questions bear directly on the methodology applied in the present social research study.

Epistemology is the branch of philosophy concerned with how knowledge of reality is gained, how one knows what exists in fact and what does not, what can be identified as known, and what criteria must be satisfied for such identification (Blaikie 1993). The most concise definition of epistemology is given by Hatch & Cunliffe (2006: 5), who refer to ‘knowing how you can know’. The authors examine the ways knowledge is generated, represented and described, as well as criteria for controlling its quality. Chia (2002) discusses the importance of methods and standards of reliable knowledge production, and the interdependence between epistemology and ontology is also highlighted (Hatch & Cunliffe 2006).

Taking the above into consideration, the position of the researcher requires clear definition, since his or her ontological position (objective or subjective) and assumptions may influence the epistemological choices (likewise objective or subjective) and the conclusions drawn.

Objective epistemology regards the world as external and theory-neutral, whereas subjective epistemology denies access to an external world beyond our subjective assumptions, observations and interpretations (Eriksson & Kovalainen 2008). Saunders, Lewis & Thornhill (2007) take the discussion further and emphasise the importance of the sources from which data are collected: information from objects existing apart from the researcher (an external reality) is more objective, which implies that social phenomena should be studied in a statistical rather than narrative form to avoid possible biases. Because so many choices are available, however, the researcher’s values and preferences inevitably influence the process and complicate the achievement of true objectivity (Blaikie 1993).

Research Paradigms

The above discussion leads logically to the notion of a ‘research paradigm’ (Blaikie 2000) or ‘research philosophy’ (Saunders, Lewis & Thornhill 2007), formed from basic ontological and related epistemological positions. These terms serve to classify different research approaches effectively and have developed in both classical and contemporary forms. A research paradigm is ‘an interpretive framework’ (Denzin & Lincoln 2003) and a ‘basic set of beliefs that guides action’ (Guba & Lincoln 1982).

The three basic paradigms prevalent in management research are the positivist, the interpretivist / constructivist, and the (contemporary) realist. Though often known by different names, having developed in parallel across different branches of the social sciences, these three approaches effectively form the ‘poles’ from which other paradigms are developed or derived.

Positivist

Derived from the natural sciences, the positivist paradigm separates reality from knowledge of it (i.e. subject from object) and posits an objective reality against which researchers can compare their claims and ascertain truth. In other words, the positivist approach tests hypotheses developed from existing theory through the measurement of observable social realities and presumes that the social world exists objectively and externally. Only observations of this external reality are valid, it claims, because general laws and theories exist that can explain phenomena and support prediction.

Based on values of reason, truth and validity, positivism focuses exclusively on facts and on ensuring that these are gathered and measured properly, using empirical quantitative methods such as surveys, experiments and statistical analysis (Blaikie 1993; Saunders, Lewis & Thornhill 2007; Eriksson & Kovalainen 2008; Easterby-Smith, Thorpe & Jackson 2008; Hatch & Cunliffe 2006). In an organisational context, the positivist viewpoint assumes that proper research can be conducted only through categorisation and scientific measurement of the behaviour of people and systems, expressed in a truly representative language (Hatch & Cunliffe 2006).

Interpretivist / Constructivist

This paradigm is referred to either as anti-positivist (Hatch & Cunliffe 2006) or post-positivist (Blaikie 1993). Interpretivists believe in multiple realities (Denzin & Lincoln 2003), as each individual (or group) interprets situations on the basis of individual experience and expectations. Meaning is thus constructed and repeatedly re-constructed, leading to an endless number of possible interpretations which together create a complex, multi-level social reality in which people act.

The interpretivist / constructivist paradigm is based on the assumption that all knowledge is relative (to the individual, including the researcher), so it is important to discover and consider different meanings and the factors that influence and determine them. The approach is inductive at its core, as interpretivists move from specific observations to broader generalisations and theories in the course of their work (Hatch & Cunliffe 2006).

Saunders, Lewis & Thornhill (2007) note that the obvious advantage of the interpretivist / constructivist perspective is that it suits the peculiarities of understanding the social world, since the meanings and interpretations of its ‘actors’ are highly contextual and rarely generalisable. Easterby-Smith, Thorpe & Jackson (2008) highlight a further benefit: the subjective nature of the approach permits analysis of subjective categories such as thinking, feeling and language, including both verbal and non-verbal communication. Eriksson & Kovalainen (2008) also stress the opportunity to study language and the availability of qualitative approaches to data gathering.

However, given the subjective nature of the approach, there is a risk of misinterpretation and bias, so self-reflection is advised as a necessary step to avoid these and to remain as objective as possible.

Approach for the Present Research

For the present research, the interpretivist approach is chosen because its primary purpose is to study individual perspectives while investigating diversity management as a way to increase productivity within an organisation. This paradigm allows inquiry into the thoughts and feelings that exist across organisations regarding different management, leadership and motivation tools and techniques. The interpretivist perspective lends itself to interpreting communication and language, and the study will examine the related findings in the context of the academic literature review, particularly since the research does not set out to test any pre-existing theory. Relying upon mixed methods, the chosen approach will allow organisational actors to discover and understand the individual and shared sense of meaning attached to the interventions made.

Naturally, the choice of research approach is justified by the nature of the research questions (which are individual-focused) and their epistemology. The study aims to analyse the factors influencing the different interpretations gathered and therefore will not focus on explaining underlying mechanisms or causal effects, for which the interpretivist approach offers limited scope. Inductive rather than deductive, the research will build theory from the observations made and will be highly contextual and limited in its generalisability.

The following figure visualises key aspects of this study (ADD FIGURE PLEASE).

Methodology and Procedure

Mixed Methods

The mixed methods (or dual research) design chosen for the present research is defined as a procedure for collecting data using a ‘mixture’ of quantitative and qualitative data-gathering techniques within a single study in order to understand a research problem more completely (Tashakkori & Teddlie 2003; Creswell 2002). When neither technique alone is sufficient to analyse a complicated issue such as the subject of the present research, a complementary combination of the two can be used for a more complete investigation (Green, Caracelli & Graham 1989). Punch (1998) stresses the need to use both qualitative and quantitative approaches in order to arrive at ‘an objective, measurable understanding of the phenomenon’ and to achieve ‘a humanistic comprehension of its socio-environmental dimensions’. The mixed methods approach is usually applied in more natural (i.e. less controlled) research settings.

In the social sciences, an investigator employing quantitative research relies on numerical data (Charles & Mertler 2002) and so examines quantitative properties, phenomena and their relationships exclusively. Providing a connection between empirical observation and the mathematical expression of quantitative relationships, such an investigator uses a post-positivist approach to developing knowledge, characterised by cause-and-effect thinking, theory testing and the use of measurement. By carefully choosing which variables to investigate and how, quantitative methods isolate these variables to verify hypothesised relationships and obtain highly reliable results.

Qualitative research, by contrast, refers to ‘an inquiry process of understanding’ during which the researcher develops a ‘complex, holistic picture, analyses words, reports detailed views of informants, and conducts the study in a natural setting’ (Creswell 1998: 15). This approach accommodates both constructivist (Guba & Lincoln 1982) and advocacy/participatory (Mertens 2003) perspectives.

Qualitative data gathering is possible even in everyday settings, as it draws on the values informants themselves perceive and produces ‘an understanding of the problem based on multiple contextual factors’ (Miller 2000).

The mixed methods approach rests on pragmatic grounds (Maxcy 2003) and asserts that truth cannot be purely calculated but is rather ‘what works’ in reality (Howe 1988). Logically, such an approach utilises the most appropriate tools for finding answers to the research question (Tashakkori & Teddlie 1998). Since pragmatism asserts the compatibility of quantitative and qualitative methods, numerical and textual data provide input of equal importance and together allow a better understanding of the issue examined. The priority, implementation and integration of data collection and analysis nevertheless require careful consideration (Creswell et al. 2003). The priority decision concerns whether the quantitative or the qualitative method will be given priority; the implementation choice determines whether quantitative and qualitative data collection and analysis will be performed in sequence or simultaneously; and the integration decision identifies the phase of the research process during which the two methods will be integrated (i.e. mixed).

For the present research, the management of diversity in an organisation, including employee perceptions of the interventions made, will be analysed qualitatively, while certain variables (e.g. diversity statistics and productivity ratios) will be analysed quantitatively as a strategy for uncovering the factors influencing the organisation. In other words, both qualitative and quantitative data analysis approaches will be used in order to arrive at a more holistic understanding of the phenomenon. Among the types of mixed methods design, the most popular, the sequential explanatory design, is chosen, consisting of two distinct stages (Creswell 2002; Creswell et al. 2003). During the first stage, numeric data will be gathered and analysed with the aim of selecting informants for the second stage; a discriminant function analysis will be performed to identify the potential predictive power of selected variables on the employees of the chosen organisations. During the second stage, individual semi-structured interviews will be conducted to collect textual data, with documents and elicitation materials serving as accompanying data-gathering techniques. The external and internal factors obtained during the first stage will then be checked for their power to predict the effectiveness of managing diversity to increase productivity within an organisation.
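
To make the planned first-stage analysis more concrete, the minimal sketch below (Python, using scikit-learn) illustrates a discriminant function analysis of the kind described above. The variable names, grouping label and values are hypothetical placeholders and are not drawn from the study.

```python
# Illustrative sketch only: a first-stage discriminant function analysis,
# using hypothetical variables and made-up placeholder data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical stage-one variables per employee: diversity exposure score,
# team productivity ratio, tenure in years (placeholder values).
X = np.array([
    [0.62, 1.10, 3.0],
    [0.45, 0.95, 7.5],
    [0.80, 1.25, 2.0],
    [0.30, 0.90, 10.0],
    [0.71, 1.18, 4.5],
    [0.38, 0.88, 8.0],
])
# Hypothetical grouping label: 1 = high perceived effectiveness of the
# diversity interventions, 0 = low (an assumed coding, not from the source).
y = np.array([1, 0, 1, 0, 1, 0])

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

# The coefficients hint at each variable's contribution to group separation,
# i.e. the 'predictive power of selected variables' referred to above.
print("discriminant coefficients:", lda.coef_)
print("classification accuracy on this toy sample:", lda.score(X, y))
```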

The rationale for this two-stage approach is that the quantitative analysis will provide a general picture of the issue under study, while the qualitative results will refine and explain that picture in more detail.

The figure below presents a visual model of the procedures for the sequential explanatory mixed methods design of the present research (ADD FIGURE 2 PLEASE).

The research philosophy and design prioritise the qualitative method because it yields richer data and supports a deeper, more detailed analysis. Furthermore, the qualitative analysis builds on the quantitative results by exploring maximal-variation cases and attempting to reveal the predictive power of the selected external and internal factors for effective diversity management aimed at increasing productivity within an organisation. The two methods are integrated at the beginning of the qualitative stage, during the selection of participants for case study analysis and the development of interview questions based on the results of the statistical tests. The results of the two stages will also be integrated during the discussion of the outcomes of the research as a whole.
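
As a compact record of the three design decisions described above (priority, implementation and integration), the hypothetical sketch below stores them in a simple configuration structure; the field names are illustrative only and do not follow any established mixed-methods notation.

```python
# Illustrative sketch only: the mixed-methods design decisions for this study,
# captured as a plain configuration dictionary (structure assumed, labels from the text).
mixed_methods_design = {
    "priority": "qualitative",                       # the qualitative strand is dominant
    "implementation": "sequential (QUAN -> QUAL)",   # quantitative stage first, then qualitative
    "integration_points": [
        "selection of participants for the qualitative case studies",
        "development of interview questions from the statistical results",
        "discussion of the combined outcomes of both stages",
    ],
}
print(mixed_methods_design)
```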

Interviews

Because it is best suited to the case study method, the interview was chosen over the survey as the primary data-gathering technique. In spite of the wide usage of interviewing as a data collection method, scholars (Gerson & Horwitz 2002; Mouton 2001; Patton 2002; Saunders et al. 2000; Stake 1995; Struwig & Stead 2001; Tellis 1997; Welman & Kruger 1999) have reached no consensus on its definition and application.

The interview as a method of qualitative research (Denzin & Lincoln 2005) seeks to cover both a factual and a meaning level, although it is usually more difficult to interview at the meaning level (Kvale 1996). The qualitative research interview seeks to explore and interpret relevant themes in the world of the subjects, the main task being to understand the meaning of what the interviewees say.

In simple terms, interviewing is about asking questions and receiving answers. The most common type of interviewing is the individual, face-to-face verbal interchange, but it can also take the form of face-to-face group interviewing, mailed or self-administered questionnaires, and telephone surveys (Punch 1998: 175). Greenfield (1996: 75), in pointing out the strengths of face-to-face interviewing as a method of data collection, includes the following:

face-to-face encounter with informant,

obtains large amounts of expansive and contextual data quickly,

useful for discovering complex interconnections in social relationships,

data are collected in natural setting,

good for obtaining data on non-verbal behaviour and communication,

great utility for uncovering the subjective side, the native’s perspective of organisational processes.

The present research will use three different types of interview, each of which has its own advantages and disadvantages:

structured interview,

semi-structured interview, and

unstructured interview.

The structured interview uses closed questions with fixed response options, while the semi-structured interview allows free responses and the unstructured interview allows free expression without any restrictions. The more structured the interview, the easier and quicker the analysis, and vice versa. Patton (1990) similarly distinguishes three main types of interview: the informal conversational interview, the general interview guide approach and the standardised open-ended interview.

The following table summarises how the interviews will be conducted for the present research.

Purpose of the interviews: The purpose of the interview (the research question) and the roles of the participants will be briefly explained before each interview.

Length of the interviews: Structured interviews are planned to last 40-70 minutes; semi-structured and unstructured interviews are planned to last 30-120 minutes.

Size of interview groups: One-to-one interviews are planned to prevail, but multiple-participant interviews are planned as well.

Telephone vs. face-to-face interviews: Face-to-face interviews are planned as the default; telephone interviews are planned for situations where a face-to-face meeting is impossible.

Data recording and tracking: The interviews will be captured on a digital voice recorder, with hand-written notes made where appropriate. The interview data will be tracked by keeping a log of who participated, when and where.

Feedback of interview data: The data gathered during the interviews will be fed back to the participants by individual email and at suitable meetings, for the purpose of gaining agreement, receiving additional comments and encouraging further participation.

Data Collection

The case study approach allows various data collection methods to be used, including questionnaires, interviews (structured, semi-structured and unstructured), observation, and secondary data gathering. Gillham (2000), Saunders et al. (2000) and Jankowicz (2000) suggest a multi-method approach to data collection, i.e. utilising more than one method for effective analysis. Powell (1997: 89) particularly emphasises questionnaires, interviews and observation, as these are ‘data collection techniques or instruments, not research methodologies, and can be used with more than one methodology’.

The following data collection methods will be used for the present research:

secondary data, including such artefacts as corporate materials, annual reports, company regulations, minutes of meetings, etc.;

face-to-face and telephone (key informant) interviews; and

data gathering through participant observation (e.g. meetings of various types, etc.).

These methods match well with Yin’s (1994: 80) work, which identifies six major sources of evidence in case studies: documents, archival records, interviews, direct observation, participant observation, and physical artefacts. Each of the data collection methods used for the present research is considered part of the overall research design and philosophy and aims to improve its validity. In addition, a triangulation approach will be used to increase the quality of the investigation.

A number of scholars comment on the use of triangulation (Darke et al. 1998; Easterby-Smith et al. 1991; Gillham 2000; Myers 1997; Patton 2002; Stake 1995; Yin 1994) and advocate its application as a means of validating research and avoiding bias across all possible data sources. Yin (1994: 91) states that ‘a major strength of case study data collection is the opportunity to use many different sources of evidence’, and Stake (1995: 114) distinguishes between data (source) triangulation, investigator (observer) triangulation, and methodological (using multiple sample types and sources) triangulation.

The present research will use triangulation as a part of the empirical data collection and utilise its following types:

data triangulation, and

investigator triangulation.

Data triangulation will be applied using multiple sources, including published material from the case study organisations, face-to-face and telephone interviews, meetings and observations. Investigator triangulation will be the researcher’s responsibility, ensuring the integrity of the quantitative and qualitative data-gathering activities.

Data Analysis

The mixed methods application foresees simultaneous data collection and analysis (Merriam 1998). NVivo 9 software will be used to code and analyse the data obtained during the qualitative stage, which will follow the steps below (an illustrative sketch of the coding and theme-building steps follows the list):

preliminary exploration of the data by reading through the transcripts and writing memos;

coding the data by segmenting and labelling the text;

using codes to develop themes by aggregating similar codes together;

connecting and interrelating themes; and

constructing a narrative (Creswell 2002: 185).
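
The minimal sketch below (Python, with entirely hypothetical codes and excerpts) illustrates steps 2-4 of the list above, i.e. how coded segments might be aggregated into themes. The actual coding will be carried out in NVivo 9 as stated; the sketch does not use or represent NVivo’s interface.

```python
# Illustrative sketch only: aggregating coded interview segments into themes.
# Codes, themes and excerpts are invented placeholders, not study data.
from collections import defaultdict

# Hypothetical coded segments: (code, interview excerpt).
coded_segments = [
    ("flexible_scheduling", "The new rota lets me plan around childcare."),
    ("mentoring", "My mentor helped me see a path to promotion."),
    ("flexible_scheduling", "Shift swaps are much easier now."),
    ("language_support", "The translated handbook made onboarding simpler."),
]

# Hypothetical mapping of codes to broader themes (step 3: aggregating similar codes).
code_to_theme = {
    "flexible_scheduling": "work-life balance interventions",
    "mentoring": "career development interventions",
    "language_support": "inclusion and communication interventions",
}

themes = defaultdict(list)
for code, excerpt in coded_segments:
    themes[code_to_theme[code]].append((code, excerpt))

# Steps 4-5: interrelate the themes and build the narrative from the grouped evidence.
for theme, segments in themes.items():
    print(theme, "->", len(segments), "supporting segment(s)")
```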

Further discussion will be encouraged using visual data displays that present the resulting conceptual framework of the factors and relationships in the data (Miles & Huberman 1994).

Each organisational case study will include a detailed description (narration) followed by the data analysis. The case context and setting will be considered when analysing the specific activities and situations involved (Creswell & Maitta 2002; Merriam 1998). The researcher will either elaborate on the incidents and their chronology, or present the major events followed by an up-close description.

As is logical for a multiple case study, the analysis will be performed at two levels: (1) within each case and (2) across the cases (Stake 1995), either as a holistic examination or as an embedded analysis of specific factors and/or aspects (Yin 1994). For the present research, each organisational case study will first be analysed for themes; the themes will then be compared for similarities and differences. Further analysis will show the extent to which the identified internal and external factors have similar or different effects on the study participants in relation to their organisations. In the final phase, the researcher will interpret the meaning of the cases and compose a cross-case report on the ‘lessons learned’ (Lincoln & Guba 1985).
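
As a toy illustration of the within-case and cross-case comparison logic, the hypothetical sketch below contrasts the themes of several invented cases; it stands in for the reasoning only and contains no actual findings.

```python
# Illustrative sketch only: comparing themes within and across hypothetical cases.
# Case names and themes are invented placeholders, not study findings.
case_themes = {
    "Organisation A": {"mentoring", "flexible scheduling", "inclusive recruitment"},
    "Organisation B": {"mentoring", "language support", "inclusive recruitment"},
    "Organisation C": {"mentoring", "flexible scheduling"},
}

# Themes shared by all cases (similarities) and themes unique to each case (differences).
shared = set.intersection(*case_themes.values())
for case, themes in case_themes.items():
    others = set.union(*(t for c, t in case_themes.items() if c != case))
    print(case, "| shared:", sorted(shared), "| unique to this case:", sorted(themes - others))
```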

Research Quality

Research quality comprises the standards for evaluating the methodological rigour of the chosen research approach. Applying mixed methods to study diversity management in an organisation helps to enrich the research findings for a more comprehensive understanding and to enhance the study’s scientific rigour. To ensure the rigour of mixed methods studies on managing diversity to increase productivity in organisations, researchers should justify the selection of the approach on the basis of the research purpose and questions, integrate quantitative and qualitative methods meaningfully and effectively, and maintain methodological congruence.

As qualitative and quantitative research methods differ in nature, the criteria for judging each differ correspondingly. In the qualitative approach, the investigator seeks believability based on coherence, insight and instrumental utility (Eisner 1991), as well as trustworthiness (Lincoln & Guba 1985) through a process of verification, whilst the quantitative approach is usually checked using traditional validity and reliability measures. Another peculiarity of qualitative design is that the study cannot be exactly replicated in another context. At the same time, making the researcher’s central assumptions, the selection of informants, and the researcher’s biases and values explicit enhances the study’s chances of being replicated in another setting (Creswell 2003).

A combination of qualitative and quantitative designs requires its own form of judgement. Clark & Creswell (2008: 270) critically assess the ways of evaluating mixed methods research and list the following evaluation criteria:

the planning quality of the research (feasibility, transparency),

design quality (detailed description, suitability, strength and rigour),

data quality,

interpretive rigour (relationship of findings to methods, credibility),

inference transferability,

reporting quality,

synthesisability, and

utility.

Acknowledging that this list is long, the authors emphasise the use of a limited number of tools. For the present research, these are grouped into credibility, reliability and validity as the most relevant and widespread factors.

Credibility

To determine the credibility of the research findings, i.e. whether they match reality (Merriam 1988), four primary strategies will be used during the second research stage:

triangulation – converging different sources of information;

member checking – getting the feedback from the participants on the accuracy of the information obtained;

proper description to convey the findings; and

external audit – a thorough review of the report by an outside expert (Creswell 2003; Creswell & Miller 2002).

Reliability

Reliability refers to the accuracy and precision of a measurement procedure (Thorndike 1997). For the present research, checks of stability reliability and internal consistency reliability will be applied: the former through pilot testing of the instrument and repeated administration, the latter by checking that the items correctly reflect the attributes being measured. The results of the actual survey will then be compared and correlated with the initial pilot-study results, expressed as a Pearson r coefficient (Instrument Reliability 2001). Unreliable scale items will be reworded or removed.
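
For illustration, the short sketch below (Python, SciPy) shows how pilot and repeat-administration scores for a single survey item could be correlated to obtain the Pearson r described above; the Likert-style responses are invented placeholders.

```python
# Illustrative sketch only: Pearson r between pilot-test and repeat-administration
# scores for one survey item (placeholder data).
from scipy.stats import pearsonr

pilot_scores = [4, 3, 5, 2, 4, 3, 5, 4]   # hypothetical Likert responses, pilot run
retest_scores = [4, 3, 4, 2, 5, 3, 5, 4]  # hypothetical responses, repeat administration

r, p_value = pearsonr(pilot_scores, retest_scores)
print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")
# Items with a low r would be candidates for rewording or removal, as described above.
```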

Validity

Thorndike (1997) defines validity as ‘the degree to which a study accurately reflects or assesses the specific concept or construct’ that the analyst is trying to measure. The present research will apply content, criterion-related, and construct validity checks.

Content validity will show the extent to which the survey items, and the scores derived from them, are representative of all the possible questions about ways of managing diversity to increase productivity within an organisational environment. This will allow an assessment of whether the survey questions are relevant to the subject, whether the data collection approach is reasonable, and whether the overall survey is well designed for the research purpose.

Criterion-related validity will be used to assess the accuracy of the procedures applied by comparing them against procedures that have already been demonstrated to be valid (Overview: Reliability and Validity 2001). For the present research, the self-designed survey questionnaire will be weighed against existing instruments measuring the same construct, and newly available instruments will be sought and studied on an ongoing basis.

To achieve construct validity, which seeks agreement between the theoretical concept and the procedures and analysis tools applied, a factor analysis of the survey items will be performed during the second stage of the research. Factor loadings will indicate the correlation between each item and the overall factor (Tabachnick & Fidell 2000). Preferably, the analysis will result in a simple structure with the following characteristics (a brief illustrative sketch follows the list):

each factor should have several variables with strong loadings,

each variable should have a strong loading for only one factor, and

each variable should have a large communality, i.e. degree of shared variance (Kim & Mueller 1978).
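
The brief sketch below (Python, scikit-learn) illustrates how factor loadings and communalities of the kind listed above could be inspected; the items and data are simulated placeholders and do not represent the study’s survey.

```python
# Illustrative sketch only: factor analysis of simulated survey items to inspect
# loadings and communalities. Item structure and data are hypothetical.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents = 200

# Simulate two latent factors driving six hypothetical survey items.
latent = rng.normal(size=(n_respondents, 2))
true_loadings = np.array([
    [0.8, 0.1], [0.7, 0.0], [0.9, 0.2],   # items expected to load on factor 1
    [0.1, 0.8], [0.0, 0.7], [0.2, 0.9],   # items expected to load on factor 2
])
items = latent @ true_loadings.T + rng.normal(scale=0.3, size=(n_respondents, 6))

fa = FactorAnalysis(n_components=2)
fa.fit(items)

loadings = fa.components_.T                    # rows: items, columns: factors
communalities = (loadings ** 2).sum(axis=1)    # shared variance per item

print("estimated loadings:\n", np.round(loadings, 2))
print("communalities:", np.round(communalities, 2))
```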

Construct validity also concerns the possibility of ‘the results produced by one’s measuring instrument being able to correlate with other related constructs in the expected manner’ (Carmines & Zeller 1991: 121). The results of the present research will therefore be correlated with results obtained from similar studies measuring related constructs (e.g. studies identifying internal and external factors contributing to diversity management aimed at increasing productivity in organisations).

Position of the Researcher

Though personally involved in the research subject (the researcher conducts the interviews himself) and interested in a positive outcome (establishing a clear link between managing diversity and increasing productivity in an organisation), the researcher attempts to stay neutral during information gathering and its subsequent analysis. The researcher also takes all necessary steps concerning the ethical issues of the study and informs the participants of the position from which the research is being conducted and explained.

