
Sunday, November 14, 2010

Conclusions and Further Research

Image retrieved from http://www.shutterstock.com/

General Conclusion
Overall, using intensive reading to decode information allows learners to develop conceptual knowledge. This process is mediated by language and by the strategies learners and teachers choose to approach a text and scaffold the learning process, respectively. The CLIL framework provides the content teacher with key guiding principles that favor a) student-centered learning environments, b) the use of language strategies such as intensive reading, eliciting key vocabulary, and scanning and skimming, and c) the development of higher- and lower-order thinking skills. Finally, the students improve their language ability as a result of using the language in meaningful contexts.

Specific Conclusions
1) Reading needs to be approached as an interactive process that is embedded in meaningful activities designed to challenge thinking.

2) Measuring reading implies a comprehensive view not only of reading but also of measuring. Different types of tests should be combined to guarantee that judgments about students' reading comprehension are valid and reliable.

3) Students must be trained in new strategies to organize information, understand a paragraph, or summarize key information if they are to be successful.

4) Students are better able to use the language when both language goals and content goals are taken into account in lesson planning. The strategies used for the purpose of scaffolding the students' language process were eliciting key vocabulary, reading with a purpose in mind, taking notes, and using graphic organizers to account for science concepts.

Further Research
  • What are the effects of implementing language strategies to foster the development of speaking in a content-based classroom?
  • To what extent does the use of graphic organizers to decode math information favor the development of students' mathematical thinking?

Sunday, November 7, 2010

Putting Everything Together



As a result of the data analysis, common trends were organized into categories, forming a skeleton that gives life to a story about fostering reading comprehension in the science classroom. The data were questioned, compared against one another, confronted with the theoretical constructs of the research, and examined through a magnifying glass to recognize the key elements that affect students' understanding when reading is implemented.

As a result of this process, the following flow chart emerged to explain the effects of intensive reading to decode information in a science content-based class.


There is enough evidence to say that reading was like a spring that pushed kids to ask, to inquire, to get to know more about what they were reading. Reading was a powerful input which, according to Gagné's (1985) Information Processing Theory, needs to be decoded to make sense of it. Hence, developing students' conceptualization corresponds to an eclectic process: reading should be combined with hands-on activities, videos, pre-reading activities, and post-reading activities; the science classroom needs to reflect an interactive learning experience. Reading is a dynamic process and a socially situated practice, and it should be measured by means of a comprehensive view. This process is more meaningful to students when they are able to share what they read. Regarding how to measure reading, as many authors have pointed out, reading is an invisible act that is quite challenging to measure. However, using two types of tests to analyze to what extent students are developing reading comprehension skills may be useful, since teachers' judgments can become biased if just one type of test is used over and over.



Without a doubt, intensive reading favors cognition and content development. The quantitative analysis shows that this process is mediated by language, as there is a positive correlation between the cloze and CARI tests, meaning that students with low linguistic ability on one test will likely show low overall reading comprehension. However, this dependence between language and content development weakens by the end of the implementation, revealing that variables other than language can explain students' reading comprehension results. The qualitative analysis shows that this process is mediated by teachers' and learners' strategies.


When cognition and content are developed, students move from factual knowledge to conceptual knowledge. Three main points should be kept in mind: a) the role of vocabulary, b) grounding concepts, and c) the use of questions.


In conclusion, the strategy implemented was successful as students were able to develop their reading comprehension ability and at the same time increase their knowledge about states of matter, kinetic and potential energy, and types of waves. It is evident that through CLIL practices students are able to improve their language, thinking skills, and understanding of science concepts.

Reference
Gagné, E. D. (1985). La Psicología Cognitiva del Aprendizaje Escolar. Traducción de Paloma Linares. Madrid: Visor Distribuciones, S.A.

Sunday, October 24, 2010

Analyzing Students' Interviews

As stated in the data collection plan, a semi-structured interview (Burns, 1999) was chosen in order to get to know students' perceptions about the project, find out whether or not students were able to recall one of the concepts learned during the implementation, identify the reading strategies students would be more likely to pursue when reading a text, and listen to students' recommendations and suggestions for further projects.


  • 1) What was the reading project about?
  • 2) What did you like about it? What didn't you like about it?
  • 3) Could you explain a concept you have learned?
  • 4) What reading strategies did you develop?
  • 5) If you had to read a text, how would you read it, what would you do, why?
  • 6) What recommendations do you have for further projects?



Taking into account that the results of the cloze tests allowed me to place students in three different reading levels (independent, instructional, and frustration), the information gathered from the students' interviews was organized according to those levels, with the purpose of clearly establishing, from a qualitative point of view, the differences among the three proposed reading levels.
Click here to have access to the interview chart.







In terms of content
Independent Reading Level: students are able to accurately describe a concept
Instructional Reading Level: students try to explain a concept, but lack accuracy
Frustration Reading Level: students are unable to explain a concept or just do not remember it

In terms of reading strategies
Independent Reading Level: students have an ample range of reading strategies

Instructional Reading Level: students have a limited number of strategies and seem to repeat ineffective ones
Frustration Reading Level: students do not recognize a strategy or repeat ineffective ones

In terms of Thinking Skills
Independent Reading Level: students are able to identify what the project was about and relate both language and content

Instructional Reading Level: students identify one element of the project, whether language or content
Frustration Reading Level: students associate the project to topics studied in class


The aforementioned aspects highlight how progress in terms of content development and cognition is achieved as students move from one level to another. This information is valuable to the teacher, as s/he can plan activities addressed to strengthen the weaknesses of students placed at each of these levels.

Reference
Burns, A. (1999). Collaborative action research for English language teachers. Cambridge: Cambridge University Press.

Sunday, October 17, 2010

Correlation between CLOZE and CARI Tests



Defining what Correlation means
According to the dictionary, correlation refers to "the degree to which two or more attributes or measurements on the same group of elements show a tendency to vary together". When a correlation between two variables is established, the researcher can assert that there is a relationship between them: if one variable changes, then the other will tend to change as well.

The correlation coefficient corresponds to a statistical value, ranging from negative one (-1) to positive one (1), which describes such a relationship. The closer this coefficient gets to 1, the stronger the relationship between the two variables. It is important to warn the reader that correlation does not mean causation. In other words, even if the correlation coefficient shows a strong relationship between two variables, the researcher cannot argue that one variable causes the other or vice versa. In addition to finding out the correlation coefficient, the researcher needs to verify the significance of that correlation through a test. For the purposes of this study, the program SPSS was used to calculate both the correlation coefficient and its significance.
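SPSS reports this coefficient automatically, but the underlying calculation is straightforward. The sketch below, in plain Python and using made-up scores rather than the study's actual data, shows how Pearson's r is obtained as the covariance of two score lists divided by the product of their standard deviations:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    # Numerator: covariance of the two variables
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    # Denominator: product of the two standard deviations
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical cloze and CARI scores for five students
cloze = [40, 55, 60, 35, 70]
cari = [45, 50, 65, 40, 72]
print(round(pearson_r(cloze, cari), 3))
```

Squaring this value gives the proportion of variance the two tests share, which is the interpretation used in the bullets below.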

Testing the Relationship between the Students' scores for CARI Test and the Cloze Test
■ Pre-tests


According to the results from SPSS, there is a strong, significant relationship between these two types of tests: the Pearson coefficient equals 0.733, and the two-tailed p-value (reported as 0.000) is less than the alpha significance level (0.05).

■ Post-tests


According to the results from SPSS, there is a positive, significant relationship between these two types of tests: the Pearson coefficient equals 0.539, and the two-tailed p-value (0.008) is less than the alpha significance level (0.05).


Interpreting the Results
  • There is a positive relationship between the CLOZE test results and the CARI test results, which means that if a person does well on one test, he/she will on average do well on the other test.
  • Learners whose reading level according to the CLOZE test corresponds to frustration (below 43% of correct responses) during the pre-stage also achieve low results in the CARI test.
  • The correlation between these two types of tests weakens from the pre-stage to the post-stage, indicating that there is much more variability among the results at the end of the project. In other words, learners whose reading level according to the CLOZE test corresponds to frustration (below 43% of correct responses) during the post-stage might still achieve competent results on the CARI test. Considering that the CLOZE test measures, in my opinion, a linguistic ability, this result might indicate that under the CLIL framework learners with low linguistic ability might achieve competent reading comprehension levels, if that comprehension is measured through open questions and the reader has a set of reading strategies.
  • If the Pearson coefficient (r) is squared and written as a percentage (pre-stage = 54%, post-stage = 29%), it can be concluded that 54% of the variability observed in the CARI test can be explained by the results on the CLOZE test during the pre-stage. On the other hand, only 29% of the variability observed in the CARI test can be explained by the results on the CLOZE test during the post-stage. This information might indicate that during the post-stage there were other variables besides linguistic ability contributing to students' reading comprehension results in a content area, such as motivation, awareness of different reading strategies, and learners' training.
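The squaring step just described is simple arithmetic; assuming the coefficients reported above, a two-line check reproduces the percentages:

```python
def shared_variance(r):
    """Coefficient of determination (r squared) expressed as a whole percentage."""
    return round(r * r * 100)

print(shared_variance(0.733))  # pre-stage: prints 54
print(shared_variance(0.539))  # post-stage: prints 29
```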


Sunday, October 10, 2010

Analyzing Students' CLOZE and CARI Test


Defining Basic Concepts
■ Cloze Test
A passage of 100 words is selected and then every fifth word is deleted. The first and last sentences of the whole text are left in their entirety (Sejnost, 2007).
Please click here to see a sample.
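As an illustration of Sejnost's procedure, here is a minimal Python sketch of the deletion step. It assumes the first and last sentences have already been set aside to be kept intact, and it simply blanks every fifth word of the remaining passage:

```python
def make_cloze(passage, nth=5):
    """Blank out every nth word; return the gapped text and the answer key."""
    words = passage.split()
    gapped, answers = [], []
    for i, word in enumerate(words, start=1):
        if i % nth == 0:
            answers.append(word)   # keep the deleted word for scoring
            gapped.append("_____")
        else:
            gapped.append(word)
    return " ".join(gapped), answers

text = "Energy is the capacity to do work and it exists in many forms"
gapped, key = make_cloze(text)
print(gapped)
print(key)
```

Scoring is then a matter of comparing each student response against the answer key and reporting the percentage of correct responses.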

■ CARI Test
Test divided into three sections: a) knowledge about reading aids offered by the textbook, b) specific content vocabulary, and c) explicit and implicit information (Sejnost, 2007).
Please click here to see a sample.


Analyzing Results
■ Cloze Test
According to Chatel (2001) Cloze tests allow the researcher to identify three reading levels  depending on the percentage of correct responses. If you want to have more information about what cloze tests show, please click here.

1) Independent Reading Level (58% to 100%)
2) Instructional Reading Level (44% to 57%)
3) Frustration Reading Level (0% to 43%)
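These bands translate directly into a scoring rule. A small sketch, assuming scores are expressed as a percentage of correct responses:

```python
def reading_level(percent_correct):
    """Classify a cloze score using Chatel's (2001) bands."""
    if percent_correct >= 58:
        return "independent"
    if percent_correct >= 44:
        return "instructional"
    return "frustration"

for score in (75, 50, 30):
    print(score, reading_level(score))
```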

The graph below shows the percentage of students within each level on the pre-test and post-test. It is evident that, as a result of implementing different activities to strengthen students' reading comprehension, the results are positive: the number of students at the frustration level decreased by 50%, while the numbers at the instructional and independent levels increased by 150% and 30% respectively.

In addition to using descriptive statistics, inferential statistics help the researcher know whether the difference between two means is statistically significant. To do so, the researcher must establish a hypothesis, run a t-test, and then decide whether to reject or retain that hypothesis. Inferential statistics add validity to the study, since clear differences between two sets of data are established. I used the program SPSS to run a t-test on the data coming from the pre- and post-test.

The decision after running the t-test was to reject the null hypothesis (that any difference between the two means is due to sampling error), since p < 0.05. In conclusion, the average post-score (63) significantly exceeds the average pre-score (47).
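The study used SPSS, but the statistic itself can be sketched in a few lines. The fragment below computes a paired-samples t statistic on illustrative (not actual) pre/post scores; obtaining the p-value would still require t-distribution tables or a statistics package:

```python
import math
import statistics

def paired_t(pre, post):
    """Paired-samples t statistic: mean difference over its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)           # sample standard deviation
    return mean_d / (sd_d / math.sqrt(n))    # compare against t critical, df = n - 1

# Hypothetical pre/post cloze scores for five students
pre = [40, 45, 50, 38, 55]
post = [55, 60, 68, 50, 70]
print(round(paired_t(pre, post), 2))
```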

■ CARI Test
According to Sejnost (2007), a Content Area Reading Inventory (CARI) test helps teachers understand to what extent students are successful at learning content area topics when reading. Taking into account that this type of test measures vocabulary and the understanding of implicit and explicit information, the results obtained can help the teacher assess what type of skills need to be further developed.

Considering the above points, the information was analyzed to determine to what extent each of these three skills was developed. The graph below summarizes the results. As can be seen, students improved in understanding vocabulary as well as explicit and implicit information. The greatest change is observed in the vocabulary skill, which rose from an average of 52% to 88%.

In addition, inferential statistics were used to analyze whether the results of the pre-test and post-test, taking all the elements together (questions addressing vocabulary and implicit & explicit information), are statistically different.

The decision after running the t-test was to reject the null hypothesis (that any difference between the two means is due to sampling error), since p < 0.05. In conclusion, the average post-score (69) significantly exceeds the average pre-score (47).


References

Chatel, R. (2001). Diagnostic and Instructional Uses of the Cloze Procedure. The NERA Journal, 37(1), 3-6.

Sejnost, R. (2007). Reading and Writing across Content Areas. Thousand Oaks, CA: Corwin Press.

Friday, October 1, 2010

Analyzing Skype Logs


Besides my journal, another instrument that provided me with qualitative information was the set of Skype logs I have collected since the implementation process began.

As Saxton (2008) argues, the new emerging technologies offer people a variety of tools, such as blogs and Skype, to add to our professional development. This author clarifies that “the new Web is participatory, with information flowing in all directions rather than simply from author to reader.” In this regard, these web tools give us the possibility to interact with each other, learn from each other, and construct knowledge through that interaction.

In action research, I see great potential in using Skype as a data collection tool and as evidence of the ongoing reflection process that characterizes action research. As Burns (1999) claims, action research is “evaluative and reflective as it aims to bring about change and improvement in practice, it is participatory as it provides for collaborative investigation by teams of colleagues, practitioners, and researchers” (p. 30).


Taking into account the above premises, I used Skype during this research to maintain an ongoing dialogue with my research director, fostering in this way the reflection component of the action research cycle, and as a data collection tool to gather data about my views on the whole process; these views were validated, confirmed, or enlightened by the research director's views.

Please click here to have access to the coding of the conversations I held with my thesis director throughout the implementation process.


References:
Saxton, E. (2008). Information Tools – Using Blogs, RSS, and Wikis as Professional Resources. Young Adult Library Services 024-294. Chicago: Young Adult Library Services.

Burns, A. (1999). Collaborative action research for English language teachers. Cambridge: Cambridge University Press.

Tuesday, September 14, 2010

Analyzing the Teacher's Journal Information


Analyzing is a challenging endeavor. As a novice researcher, I have found it quite hard to capture the essence of the data. Everything seems important, since every little piece of information opens a window to start reflecting upon your teaching practice. As Auerbach (2003) points out, your research concerns should be kept visible so that the researcher focuses his/her attention on relevant text. Following this author's advice, I used a mapping strategy to organize relevant text and start grouping it according to emerging patterns. However, sometimes you get so immersed in the data that you start walking away from your research purposes. Burns (1999) talks about dialogic validity as a criterion that supports the study's validity and, in addition, offers the researcher the possibility to test his/her hypotheses, find new approaches to the data, and argue his/her points of view. She refers to "peer review" as a powerful tool that encompasses a reflective dialogue with a "critical friend". In my case, my "critical friend" is my research director.


After showing him the first version of the map and establishing a reflective dialogue, I refined it and came up with a more polished map that shows the key concepts around which the analysis will continue.

Please click here to have access to the second version of the map that groups emerging patterns and ideas taken from the teachers' journal.


References
Auerbach, C. F. (2003). Qualitative Data: An Introduction to Coding and Analysis. New York: NYU Press.


Burns, A. (1999). Collaborative action research for English language teachers. Cambridge: Cambridge University Press.

Wednesday, September 8, 2010

Analyzing Qualitative Data

Dear reader, I invite you to watch this short video that explains the methodology proposed by Auerbach in relation to categorizing qualitative data and developing a theoretical narrative.



After going through steps 1 to 4, I designed this map to visually see how repeating ideas are connected and start naming these related events.

Reference
Auerbach, C. F. (2003). Qualitative Data: An Introduction to Coding and Analysis. New York: NYU Press.

Monday, September 6, 2010

How to analyze data?


The researcher needs to determine whether the information s/he has is qualitative or quantitative. Afterwards, s/he needs to go back to the data collection instruments and decide upon the analysis method that best suits the data. Burns (2010) mentions that the main tools for analyzing and synthesizing qualitative data are a) categorizing: sorting objects into logical groups and then developing a well-structured theoretical framework that explains the phenomena under study, and b) analyzing talk: examining the spoken interaction.

In addition, this author refers to numerical scales and descriptive statistics as the main tools for analyzing quantitative data. However, some researchers might also opt for inferential statistics to take into account correlations, differences between means, and the analysis of variance.


The table below provides an overview of the data collection instruments used, the type of data collected, the nature of the data, and the method that will be used to analyze the information.








Please click here to have access to the rubric used to assess the students' production resulting from completing the graphic organizers.


Reference
Burns, A. (2010). Doing Action Research in English Language Teaching. New York: Routledge.

Sunday, September 5, 2010

A new phase begins...



After finishing the data collection process, the analysis begins. As Burns (2001) points out, this phase of action research implies moving towards the research component, where the researcher digs deep into the data by using data analysis procedures as well as analytical tools to explain what happened and why.

Corbin and Strauss (2008) define analysis as looking at a substance from different angles and discovering its dimensions and properties. These two terms allow the researcher to properly define an object, event, or action. The researchers break apart the data and later put all the gained knowledge back together to draw conclusions about the topic being analyzed. Indeed, the knowledge the researcher gains is marked by his/her own beliefs and assumptions. In the words of these authors, “analysis is an interpretative act” (p. 47); however, that does not mean that the whole process lacks validity. What action research looks for is an understanding of the teaching practices and of the learners themselves in order to increase professional knowledge. The researcher's view will always be “grounded” in the data.

The level of analysis will depend on the researcher's expertise, background, and willingness to go beyond, as well as on the instructional design of the implementation and the data collection process. Corbin and Strauss (2008) identify the following three levels: description, conceptual ordering, and theorizing. The last is the most difficult to achieve because it implies having a well-developed hierarchy of categories, systematically interrelated, forming a theoretical framework that explains the phenomena under study.

References

Burns, A. (2001). Collaborative Action Research for English Language Teachers. Cambridge: Cambridge University Press.

Corbin, J. & Strauss, A. (2008). Basics of Qualitative Research: Grounded Theory Procedures and Techniques (3rd ed.). Sage Publications.

Thursday, June 3, 2010

Implementation Week 7

A lesson on Electromagnetic Waves

Accessing the data collection tool
Please click here to take a look at the proforma I did for the seventh session.

Image taken from http://www.skybooksusa.com/time-travel/physics/images/spectrum.jpg

Thursday, May 27, 2010

Implementation Week 6

A lesson on Waves

Accessing the data collection tool
Please click here to take a look at the proforma I did for the sixth session.



Image taken from http://i.dailymail.co.uk/i/pix/2009/02/26/article-1156222-03ACD714000005DC-557_634x425.jpg

Thursday, May 20, 2010

Implementation Week 5

A lesson on Sound Waves





Accessing the data collection tool
Please click here to take a look at the proforma I did for the fifth session.







Image taken from http://sound-waves-music.blogspot.com/2009_04_01_archive.html

Thursday, May 13, 2010

Implementation Week 4


A lesson on Potential Energy

Accessing the data collection tool
Please click here to take a look at the proforma I did for the fourth session.



Image taken from http://beancounters.blogs.com/daydreams/dragster_roller_coaster_track.jpg

Thursday, May 6, 2010

Implementation Week 3


A lesson on Kinetic and Potential Energy

Accessing the data collection tool
Please click here to take a look at the proforma I did for the third session.






Reference
Image taken from http://www.enwin.com/kids/images/pic.roller_coaster.jpg

Wednesday, April 28, 2010

Implementation Week 1 & 2


A lesson on Energy

Accessing the data collection tool
Please click here to take a look at the proforma I did for the first and second session.



Defining a Proforma
According to Burns (1999), a proforma is a grid that separates the "descriptive aspects of one's observations from the reflective aspects", allowing the "analysis and interpretation to become more focused".

As the reader may see, the research question appears at the top of the chart, as well as the constructs: reading, decoding information, and the content-based element. The objective of having the research question written in the data collection instrument is to guide the observation process. When a researcher is taking notes, many variables might appear that are not relevant to the object of study; that is why the researcher should always keep in mind the phenomenon s/he is interested in investigating. In the second row of the data collection instrument, the date and type of activity are written. Then, two columns are displayed: the left column corresponds to the description of what happened in the classroom during that activity, and the right column corresponds to the comments I made as I was re-writing what I had observed in class. These comments were categorized according to the constructs they were describing.


Every time I met with the fifth graders and we developed one of the activities within the implementation stage, I took a white piece of paper with the research question and the constructs and, as the class was going on, took notes using short sentences or sometimes key words about the events that were related to the constructs. Afterwards, at the end of the day, these observations were re-written on the computer, and comments regarding the meaning of those observations, or inquiries, were also noted down.

Reference
Burns, A. (1999). Collaborative action research for English language teachers. Cambridge: Cambridge University Press.

Image taken from http://www.alistaircraven.com/java/basic/energy4.jpg

Saturday, April 10, 2010

Collecting Data

According to Sagor (2005), action research, whether descriptive or quasi-experimental, “views data collection through the anthropologists’ lens”; this is to say that you as a teacher-researcher “observe, document, and try to understand” what happens within the classroom in the light of your research question and objectives. Therefore, the researcher should understand the following:
  • “Why the target was hit or missed,
  • How various elements of the theory of action contributed to success or failure, &
  • What could be learned from this undertaking that might help illuminate other related aspects”
In addition to understanding the nature of action research, the researcher needs to take into account how s/he will achieve validity in his/her project. According to Pelto & Pelto, as cited by Mills (1999), validity “refers to the degree to which scientific observations actually measure or record what they purport to measure”.


Triangulation then becomes the “best known way of checking for validity. The aim of triangulation is to gather multiple perspectives on the situation being studied” (Burns, 1999).

Keeping the above points in mind, below you will find a table containing the data collection techniques to be used during this research project.


References:

Burns, A. (1999). Collaborative action research for English language teachers. Cambridge: Cambridge University Press.

Mills, G. (1999). Action Research: A guide for the Teacher Researcher. Prentice Hall.

Sagor, R. (2005). The Action Research Guidebook. USA: Corwin Press.

 
Image taken from
http://teachers.olatheschools.com/acereadingchallengeupdate/files/2009/12/reading-clipart.jpg

Thursday, April 8, 2010

Planning what to do


Wordle: Planning_AR

This stage of the process entails designing the steps you will go through during your research. You need to think of your data collection instruments, the materials you will use, the timeline of your project, and, most importantly, how every single “action” will potentially “answer” your research question.

Please click here to have access to the basic data of my action research project.

Wednesday, April 7, 2010

Meeting my Research Director

I want to start by first writing two main features of action research, pointed out by Burns (1999):

• “It is evaluative and reflective as it aims to bring about change and improvement in practice”

• “It is participatory as it provides for collaborative investigation by teams of colleagues, practitioners and researchers.”


This means that research is maximized when it is shared with a colleague. I found meeting my director a very enriching experience, since another perspective was brought to my research process. In addition, research is not always a smooth and easy-going process, and you need someone who supports you and encourages you to go further. Finally, it is when you share your ideas with others that you refine and sharpen your insights and start constructing knowledge. We are, after all, a culture that values “social interaction” and places it at the core of the learning experience.

Finally, it is important to think from the beginning of the project about how valid it is. Besides triangulation, one of the most common ways of validation, Burns (1999) mentions “peer examinations”; this process entails finding a significant person who can debrief the data with the researcher in order to find commonalities in the analysis and interpretations. When the researcher’s findings are acknowledged by others, validity is increased.


Reference:

Burns, A. (1999). Collaborative action research for English language teachers. Cambridge: Cambridge University Press.

Saturday, April 3, 2010

My research question

When writing a research question, you need to consider the following issues:

  • Does your question include the strategy you are about to implement?
  • Does your question mention the specific context for your research?
  • Does your question state the situation/process/skill you want to improve?
  • Would you be able to observe and measure what you want to investigate?
  • Are you focusing on just one issue?
  • Is your topic relevant to you and your school circumstances?
  • Does your question include researchable concepts?

My question:

What are the effects of implementing intensive reading to decode information in a science content-based class?

Analyzing my question:

References:

Burns, A. (1999). Collaborative action research for English language teachers. Cambridge: Cambridge University Press.

Peckover, R. (n.d.). Defining Action Research. Retrieved April 2nd, 2010 from http://www.squidoo.com/actionresearch

Finding a Focus

As I wrote before, I have always had this idea about reading going around my head. As part of the physics curriculum, students have to read a variety of science texts: about the Earth’s surface (i.e. minerals, rocks), matter and energy (i.e. states and properties of matter), waves (i.e. types of waves), and space (i.e. the solar system and the Universe).

Therefore, reading is at the core of the course, and success at reading will undoubtedly mean success at the end of the school term. However, fifth grade students (the focus group of this research) struggle to construct meaning; they usually copy or say the exact words appearing in the textbook.

Several factors, such as students’ language level, lack of knowledge regarding how to read science texts, the alignment between learners’ needs and the goals of the self-access materials used, and the instructional design of the lessons, can explain why students are not able to understand a science text, as well as their results on reading comprehension exams.

The need described above is the one I want to work on; I am interested in helping kids develop their reading strategies to increase their understanding of science texts. In addition, I want to include the language component in my classes, as current studies (De Graaff, R. et al., 2007) have shown that the relationship between language and content can lead to more successful teaching practices. This issue is challenging both content-based and language teachers to start implementing strategies that enhance their learners’ learning process, considering both content and language.

This is a "wordle" I created (http://www.wordle.net/) considering what I have mentioned here:

Wordle: Finding a Focus

To see it bigger, please go to this link

http://www.wordle.net/show/wrdl/1863683/Finding_a_Focus


Reference:

De Graaff, R. et al. (2007). An Observation Tool for Effective L2 Pedagogy in Content and Language Integrated Learning (CLIL). International Journal of Bilingual Education and Bilingualism 10/5, 603-624. Retrieved September 27, 2009 from http://igitur-archive.library.uu.nl/ivlos/2008-0103-200946/UUindex.html


Friday, April 2, 2010

Defining Action Research

Well, knowing what action research is and what it entails is the next step on this journey as theory and practice are always interwoven.
Enjoy this short-video clip I did about action research.
Please, turn on your speakers.

Getting Started

Well, as a result of the Master’s program I am taking and my own interest in becoming a better teacher, a year ago I started to reflect upon my teaching practices. The following questions came to mind:
  • Am I being effective?
  • Why do some students not achieve the learning expectations by the due date?
  • Do my students understand the reading texts of their textbook?
  • As a content-based teacher do I have to worry about language?
  • How do I approach the language component?



Then, I began reading and guided by my teachers and of course the principles and philosophy of the Master’s program I started to unwrap the mysteries of action research.


Reference:
Image retrieved on April 2nd from http://www.flickr.com/photos/ozyman/443545349/