1.
After reviewing the readings in this module, I have noted three topics that I found very significant. First was the Ryan and Bernard (2003) article on techniques to identify themes. This article provided a clear summary of the different approaches to coding qualitative data and an excellent decision tree for choosing among them. After reviewing the techniques, I see the importance of aligning the themes to the research questions identified in the research protocol. Finally, looking back at how to triangulate, I can see the power of bringing together all the qualitative points via coding to identify themes that address the research question.
The second topic is the notion of collecting and interpreting data. The Gagnon (2010) readings were very detailed and insightful. Interestingly, case study analysis is viewed as more of an art than a science. Looking for contradictory assessments of the data is logical, and being self-aware of counter-evidence is essential, given the diversity of data that can be acquired from the study. Given the amount of data that can be collected in a case study, relying on software as an analytical aid is critical. I cannot imagine how data from case studies were analyzed before computer software. I see the benefits, but additional time is needed to review and learn the different software options.
The third topic I thought was significant from the readings was the videos. I found the videos interesting, as they gave specific examples of how to code, which made the idea more vivid for me. In addition, I thought the different perspectives considered when coding, along with Dr. Adu's decision tree for choosing the best coding approach, were excellent. After completing the readings, it is clear there are many coding approaches, and the researcher must select the one best suited to the specific study.
Regarding areas that need further attention, I found the readings on the analytical techniques to code and identify themes insightful but hard to follow. The Ryan and Bernard (2003) article provided the most depth, as did Dr. Adu's video, but I feel some additional readings would help me better understand the techniques and how best to apply them. Coding is an exciting idea, but I think the readings could better show examples of the process.
Triangulation makes more sense to me now after reviewing how the various sources of information can be consolidated via the coding process. The Farquhar (Farquhar, 2012) article had an interesting visual on triangulation. The visual deserves more attention via an example, but it has helped me understand how to connect the data throughout the process.
Lastly, I sometimes found the Johnson et al. (2006) article confusing. This article was another example of philosophical views that I found hard to follow. Some supplementary readings with further examples would help close this gap.
References
Farquhar, J. D. (2012). Case study research for business. SAGE Publications.
Gagnon, Y.-C. (2010). The case study as research method: A practical handbook. Presses de l'Université du Québec.
Johnson, P., Buehring, A., Cassell, C., & Symon, G. (2006). Evaluating qualitative management research: Towards a contingent criteriology. International Journal of Management Reviews, 8(3), 131–156. https://doi.org/10.1111/j.1468-2370.2006.00124.x
Ryan, G. W., & Bernard, H. R. (2003). Techniques to identify themes. Field Methods, 15(1), 85–109.
2.
The three key ideas I found most significant in this module's readings are coding, themes, and conceptual framework. Coding is the process of recognizing a phrase in the text, then researching and discovering the concepts and the relationships among them. This qualitative data analysis strategy allows researchers to identify related content across all their data, from which themes emerge. How a researcher decides to code should be driven by their methodology. Depending on the methodology of the research, the coding scheme can be deductive, where codes derived from previous research are applied to the data. When codes are developed entirely from the data, setting aside previous knowledge of the studied topic, the inductive method is used. Many coding schemes fall between the two approaches. How the coding is reported also aligns with the chosen methodology. Some methodologies call for consistent application of coding so that counts, and the reliability of how often codes appear in the data, can be reported. Others use codes to create a description of an experience while bypassing any indication of how precisely and how often the code was applied.
For business research, themes are defined by the extent to which the identified codes relate to each other. Thematic analysis emphasizes identifying, analyzing, and interpreting patterns and meanings within the qualitative data collected. This method emphasizes both the description and organization of the data set and the interpretation of its meaning. Thematic analysis looks further into the text, beyond counting words and phrases, to explore implicit and explicit connotations. The flexibility of this method is on display with regard to research questions, research design, and framing theory. It can be used to explore questions about the subjects' perspectives, experiences, practices, and behaviors that influence the particular study.
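As a hedged sketch of the "codes relate to each other" idea (with invented data, and covering only the mechanical step, since the interpretive work remains the researcher's), codes that frequently co-occur on the same segments can be surfaced as candidate themes for inspection:

```python
# Toy theme candidate generation: pairs of codes that repeatedly appear
# on the same segments are grouped as candidate themes to inspect.
from itertools import combinations
from collections import Counter

# Invented example: codes already applied to each interview segment.
coded_segments = [
    {"workload", "stress"},
    {"workload", "stress", "turnover"},
    {"support", "retention"},
    {"workload", "stress"},
    {"support", "retention", "turnover"},
]

def cooccurrence(segments):
    """Count how often each unordered pair of codes shares a segment."""
    pairs = Counter()
    for codes in segments:
        pairs.update(frozenset(p) for p in combinations(sorted(codes), 2))
    return pairs

def candidate_themes(segments, min_count=2):
    """Pairs co-occurring at least min_count times become candidates."""
    return [set(pair) for pair, n in cooccurrence(segments).items()
            if n >= min_count]

# {'workload','stress'} and {'support','retention'} each co-occur twice
# or more, so both surface as candidate themes.
print(candidate_themes(coded_segments))
```

Real thematic analysis goes well beyond such counting, but the sketch shows why software can help surface which codes cluster together before the researcher interprets them.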
A conceptual framework is essentially a plan for putting solutions into action. The framework takes the conclusions of the study and forms them into actionable plans based on the researcher's knowledge and their observations of the subjects throughout the research. It will include one or more theories in relation to other concepts and findings from the literature. The framework is essential in demonstrating relationships among the ideas presented and how they take shape within the research study. Conceptual frameworks are most common in qualitative research, since one theory alone is generally unable to fully address the problem under review.
The two analytic tools that I would like to explore further are Dedoose and ATLAS.ti. These tools can be utilized to test hypotheses in research studies. Dedoose is a web-based program that helps you organize research data in a wide variety of formats, including qualitative data such as text, audio, images, or video, and quantitative data such as spreadsheets, surveys, test scores, ratings, or demographics. With the ability to analyze data for mixed methods projects with complex coding, the software looks like a good choice for distributed groups with complicated research projects.
ATLAS.ti is software for facilitating data analysis in qualitative, quantitative, and mixed methods research. It can be used for locating, coding, and recognizing features in unstructured data, and it can showcase the output through its visualization functions. This program seems useful for coding and analyzing transcripts and notes, building literature reviews, creating diagrams, theory building, and visualizing data. A large benefit for student researchers is that both Dedoose and ATLAS.ti offer free trials, allowing a student to determine whether they can effectively apply the tool to their research work.
One particular concept that was difficult to understand in relation to case study analysis is the examination of themes. The whole notion of themes is largely tied to how phrases and context within a work are coded. It is not an easy task to determine which themes are best suited to the research, and a researcher needs constant guidance to make sure the project stays on a good trajectory. Unfortunately for researchers, explicit descriptions of themes are not usually provided in reports and articles. When deducing a work's themes, researchers are at risk of missing nuances in the data. The subjective nature of thematic analysis relies on the judgment of the researcher, so they must carefully reflect on their own interpretations and choices. It is not easy to pay close enough attention to the data to ensure that they are not obscuring data or picking up on themes that are not actually present. Researchers may have to split up, combine, discard, or develop new themes if they encounter problems.
References
Yin, R. K. (2017). Case study research and applications: Design and methods (6th ed.). Sage Publications.
Gagnon, Y. (2010). The case study as research method: A practical handbook. Les Presses de l'Université du Québec.
Farquhar, J. D. (2012). Case study research for business. SAGE Publications Ltd.
3.
The three key ideas most significant from the readings are analyzing, interpreting, and coding data. In Stage 6: Analyzing Data, Gagnon (2010) states that the qualitative data used to build the database should be processed during the data collection stage by going back and forth between three concurrent activities: purging, coding, and analyzing the data.
- Purging the data means ensuring the evidence collected is relevant to the study. For any tools used, the data must be compatible with said tools.
- Coding is the tagging or labeling of words, phrases, sentences, and paragraphs that describe or relate to categories or concepts in the phenomena of interest. There are two types of coding: the top-down approach, known as deductive, and the bottom-up approach, known as inductive. The deductive approach is used mainly in education and cognitive psychology, where concepts are gleaned and categories are established from pre-existing notions (Boje, 1991).
- Content analysis is performed in three stages: the first is coding the data, the second is classifying the content of the texts, and the third is analysis. The analysis builds on the codes assigned in the coding stage, systematically examining the data by organizing and integrating the information through quantitative calculations. This analysis can also reveal contradictory conclusions (Gagnon, 2010).
The two analytical techniques that should be further explored and discussed are generating proposed explanations and comparing the suggested answers that pass the evidence test with the existing literature. Both techniques are part of Stage 7: Interpreting Data (Gagnon, 2010).
- Possible explanations can be generated by revisiting the research question from the preparation stage and the original explanatory hypotheses. The other approach is the application of creativity and intuition. This method requires the researcher to search the evidence for new meanings, establish causal relationships, and measure the qualitative data (Yin, 2003).
- The purpose of comparing the proposed explanations with the existing literature is to contribute to theory by identifying and analyzing differences between the proposed and existing ideas. There must also be no alternative explanations for the phenomenon. The data should be tested, and if it passes, it should enhance the proposed theoretical framework (Eisenhardt, 1989).
The concept that was initially difficult was the data coding platform Dedoose. My first assumption was that its coding would be as complex as IT software coding. However, after additional research, I found that it supports a range of analytical and qualitative data analysis (QDA) techniques. The platform may simply be time-consuming to learn, even though the aim is to give additional meaning to the qualitative data.
Eisenhardt, K. (1989). Politics of strategic decision making in high-velocity environments: Toward a mid-range theory. Academy of Management Journal, 31, 737–750.
Gagnon, Y. (2010). The case study as research method: A practical handbook. Les Presses de l'Université du Québec.
Yin, R. K. (2017). Case study research and applications: Design and methods (6th ed.). Sage Publications.
4.
There were three key ideas present in the readings: data analysis, data interpretation, and research quality. Data analysis is one of the most significant steps in the research, as it makes use of all the data that was painstakingly gathered. However, it also presents an area of great risk in the research. There are continuous steps in the data analysis process: ongoing adjustment of the data under review is needed, as some data must be purged because of things learned from other components of the analysis, and the same holds true for changing the coding as the process evolves (Gagnon, 2010). It is also important to fully analyze all the data rather than excluding some of it because of the overwhelming amount available; doing so also aids data triangulation (Farquhar, 2012). The analysis phase can lead to great findings, but it must be done thoroughly, and it also relies on the quality of the research up to that point.
The next main item was the interpretation of the data. This is the final step, in which great findings or mistakes can be made. After the data has been analyzed and patterns determined, what those patterns mean is the question of data interpretation. The proposed explanations for why the data is the way it is are critical; they are more art than science and require creativity and instinct (Gagnon, 2010). After a proposed explanation is developed, the data needs to be fully checked against it to ensure it is supported, along with a comparison against the current literature and understanding (Gagnon, 2010). Exploring all of the data and potential explanations and sharing details with the audience is also important in interpreting the data, to demonstrate that it is not being cherry-picked (Beverland & Lindgreen, 2010).
The final key item is that of quality control. A study must be performed well from start to finish for it to be useful to anyone. A well-defined protocol from the start is needed to ensure that all steps are done well (Beverland & Lindgreen, 2010). Documentation during all phases and proper organization also ensure that nothing is missed during the process (Gagnon, 2010). It is also important to have the utmost credibility, which can be provided by following established research methodologies and giving thorough details on existing knowledge, data sources, and other elements; this will allow the audience to feel comfortable with the research (Farquhar, 2012).
As far as analytical techniques to explore further, Dedoose would be at the top of the list. Software can be extremely helpful in analysis and interpretation activities, but it can also be a hindrance or cause unexpected and improper results if used incorrectly. The second item is the relationship between coding and theory building. There is plenty of opportunity to tie the two together without letting either get in the way of the other (Gagnon, 2010). Mixing manual coding with software-assisted coding is also helpful in that relationship.
An item that was more difficult to understand was the argument presented by Easton (2010) regarding the need to justify case studies and their relationship with critical realism. Many of the points regarding detractors of case studies make sense, as does the fact that critical realism provides a certain lens for understanding the complexities of organizations and entities. Reviewing that reading required taking a few steps back at certain times and approaching the material with a more open mind, without leaning on other learned case study information. Ultimately, the article was helpful and makes many good points.
Beverland, M., & Lindgreen, A. (2010). What makes a good case study? A positivist review of qualitative case research published in Industrial Marketing Management, 1971–2006. Industrial Marketing Management, 39(1), 56–63. https://doi.org/10.1016/j.indmarman.2008.09.005
Easton, G. (2010). Critical realism in case study research. Industrial Marketing Management, 39(1), 118–128. https://doi.org/10.1016/j.indmarman.2008.06.004
Farquhar, J. D. (2012). Case study research for business. SAGE.
https://doi.org/10.4135/9781446287910
Gagnon, Y. (2010). The case study as research method: A practical handbook. Les Presses de l'Université du Québec.