For the 2021-2022 school year, the Ministry of Education – Directorate-General for School Regulations, Evaluation, and the Internationalization of the National Education System – is promoting the informatics project "Olimpiadi di Problem Solving" (Problem Solving Olympiad), which fosters key competences for solving problems through computational models, methods, and tools. The project is aimed at pupils in primary school, lower secondary school, and the first two years of upper secondary school, in Italian state and state-recognized (paritarie) schools in Italy and abroad. The competitions are divided into "classic" problem-solving contests, coding contests, programming contests, and maker contests.
Problem-solving is a mental process that involves discovering, analyzing, and solving problems. The ultimate goal of problem-solving is to overcome obstacles and find a solution that best resolves the issue.
The best strategy for solving a problem depends largely on the unique situation. In some cases, people are better off learning everything they can about the issue and then using factual knowledge to come up with a solution. In other instances, creativity and insight are the best options.
It is not necessary to follow problem-solving steps sequentially; it is common to skip steps or even revisit them multiple times until the desired solution is reached.
In order to correctly solve a problem, it is often important to follow a series of steps. Researchers sometimes refer to this as the problem-solving cycle. While this cycle is portrayed sequentially, people rarely follow a rigid series of steps to find a solution.
The following steps include developing strategies and organizing knowledge.
While it may seem like an obvious step, identifying the problem is not always as simple as it sounds. In some cases, people might mistakenly identify the wrong source of a problem, which will make attempts to solve it inefficient or even useless.
Some strategies that you might use to figure out the source of a problem include:
After the problem has been identified, it is important to fully define it so that it can be solved. You can define a problem by operationally defining each of its aspects and setting goals for which aspects you will address.
At this point, you should focus on figuring out which aspects of the problems are facts and which are opinions. State the problem clearly and identify the scope of the solution.
After the problem has been identified, it is time to start brainstorming potential solutions. This step usually involves generating as many ideas as possible without judging their quality. Once several possibilities have been generated, they can be evaluated and narrowed down.
The next step is to develop a strategy to solve the problem. The approach used will vary depending upon the situation and the individual's unique preferences. Common problem-solving strategies include heuristics and algorithms.
Heuristics are often best used when time is of the essence, while algorithms are a better choice when a decision needs to be as accurate as possible.
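To make the distinction concrete, here is a minimal, hypothetical sketch in Python (the route-planning scenario, data, and function names are invented for illustration): the exhaustive algorithm checks every ordering of stops and is guaranteed optimal, while the nearest-neighbor heuristic is faster but can settle for a worse answer.

```python
# Illustrative comparison: an algorithm exhaustively checks every option and is
# guaranteed optimal; a heuristic applies a quick rule of thumb, trading
# accuracy for speed. All data are made up.
from itertools import permutations

# Symmetric travel times between four locations (invented numbers).
distances = {
    ("A", "B"): 1, ("A", "C"): 2, ("A", "D"): 3,
    ("B", "C"): 1, ("B", "D"): 4, ("C", "D"): 10,
}

def dist(a, b):
    return distances.get((a, b), distances.get((b, a)))

def route_length(route):
    return sum(dist(a, b) for a, b in zip(route, route[1:]))

def algorithmic_route(stops, start="A"):
    """Exhaustive search: always finds the shortest route, but cost grows factorially."""
    rest = [s for s in stops if s != start]
    return min(((start,) + p for p in permutations(rest)), key=route_length)

def heuristic_route(stops, start="A"):
    """Nearest-neighbor rule: fast and usually good, but not guaranteed optimal."""
    remaining, route = set(stops) - {start}, [start]
    while remaining:
        nearest = min(remaining, key=lambda s: dist(route[-1], s))
        route.append(nearest)
        remaining.remove(nearest)
    return tuple(route)

stops = ["A", "B", "C", "D"]
best = algorithmic_route(stops)
quick = heuristic_route(stops)
print("algorithm:", best, "length", route_length(best))
print("heuristic:", quick, "length", route_length(quick))
```

On this toy input the heuristic returns a route of length 12 while the exhaustive search finds one of length 7, which is exactly the accuracy-versus-speed trade-off described above.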
Before coming up with a solution, you need to first organize the available information. What do you know about the problem? What do you not know? The more information that is available the better prepared you will be to come up with an accurate solution.
When approaching a problem, it is important to make sure that you have all the data you need. Making a decision without adequate information can lead to biased or inaccurate results.
Of course, we don't always have unlimited money, time, and other resources to solve a problem. Before you begin to solve a problem, you need to determine how high priority it is.
If it is an important problem, it is probably worth allocating more resources to solving it. If, however, it is a fairly unimportant problem, then you do not want to spend too much of your available resources on coming up with a solution.
At this stage, it is important to consider all of the factors that might affect the problem at hand. This includes looking at the available resources, deadlines that need to be met, and any possible risks involved in each solution. After careful evaluation, a decision can be made about which solution to pursue.
After selecting a problem-solving strategy, it is time to put the plan into action and see if it works. This step might involve trying out different solutions to see which one is the most effective.
It is also important to monitor the situation after implementing a solution to ensure that the problem has been solved and that no new problems have arisen as a result of the proposed solution.
Effective problem-solvers tend to monitor their progress as they work towards a solution. If they are not making good progress toward reaching their goal, they will reevaluate their approach or look for new strategies.
After a solution has been reached, it is important to evaluate the results to determine if it is the best possible solution to the problem. This evaluation might be immediate, such as checking the results of a math problem to ensure the answer is correct, or it can be delayed, such as evaluating the success of a therapy program after several months of treatment.
Once a problem has been solved, it is important to take some time to reflect on the process that was used and evaluate the results. This will help you to improve your problem-solving skills and become more efficient at solving future problems.
It is important to remember that there are many different problem-solving processes with different steps, and this is just one example. Problem-solving in real-world situations requires a great deal of resourcefulness, flexibility, resilience, and continuous interaction with the environment.
You can become a better problem solver by:
It's important to communicate openly and honestly with your partner about what's going on. Try to see things from their perspective as well as your own. Work together to find a resolution that works for both of you. Be willing to compromise and accept that there may not be a perfect solution.
Take breaks if things are getting too heated, and come back to the problem when you feel calm and collected. Don't try to fix every problem on your own—consider asking a therapist or counselor for help and insight.
If you've tried everything and there doesn't seem to be a way to fix the problem, you may have to learn to accept it. This can be difficult, but try to focus on the positive aspects of your life and remember that every situation is temporary. Don't dwell on what's going wrong—instead, think about what's going right. Find support by talking to friends or family. Seek professional help if you're having trouble coping.
By Kendra Cherry, MSEd. Kendra Cherry is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."
In this episode of the McKinsey Podcast, Simon London speaks with Charles Conn, CEO of venture-capital firm Oxford Sciences Innovation, and McKinsey senior partner Hugo Sarrazin about the complexities of different problem-solving strategies.
Podcast transcript
Simon London: Hello, and welcome to this episode of the McKinsey Podcast, with me, Simon London. What’s the number-one skill you need to succeed professionally? Salesmanship, perhaps? Or a facility with statistics? Or maybe the ability to communicate crisply and clearly? Many would argue that at the very top of the list comes problem solving: that is, the ability to think through and come up with an optimal course of action to address any complex challenge—in business, in public policy, or indeed in life.
Looked at this way, it’s no surprise that McKinsey takes problem solving very seriously, testing for it during the recruiting process and then honing it, in McKinsey consultants, through immersion in a structured seven-step method. To discuss the art of problem solving, I sat down in California with McKinsey senior partner Hugo Sarrazin and also with Charles Conn. Charles is a former McKinsey partner, entrepreneur, executive, and coauthor of the book Bulletproof Problem Solving: The One Skill That Changes Everything [John Wiley & Sons, 2018].
Charles and Hugo, welcome to the podcast. Thank you for being here.
Hugo Sarrazin: Our pleasure.
Charles Conn: It’s terrific to be here.
Simon London: Problem solving is a really interesting piece of terminology. It could mean so many different things. I have a son who’s a teenage climber. They talk about solving problems. Climbing is problem solving. Charles, when you talk about problem solving, what are you talking about?
Charles Conn: For me, problem solving is the answer to the question “What should I do?” It’s interesting when there’s uncertainty and complexity, and when it’s meaningful because there are consequences. Your son’s climbing is a perfect example. There are consequences, and it’s complicated, and there’s uncertainty—can he make that grab? I think we can apply that same frame almost at any level. You can think about questions like “What town would I like to live in?” or “Should I put solar panels on my roof?”
You might think that’s a funny thing to apply problem solving to, but in my mind it’s not fundamentally different from business problem solving, which answers the question “What should my strategy be?” Or problem solving at the policy level: “How do we combat climate change?” “Should I support the local school bond?” I think these are all part and parcel of the same type of question, “What should I do?”
I’m a big fan of structured problem solving. By following steps, we can more clearly understand what problem it is we’re solving, what are the components of the problem that we’re solving, which components are the most important ones for us to pay attention to, which analytic techniques we should apply to those, and how we can synthesize what we’ve learned back into a compelling story. That’s all it is, at its heart.
I think sometimes when people think about seven steps, they assume that there’s a rigidity to this. That’s not it at all. It’s actually to give you the scope for creativity, which often doesn’t exist when your problem solving is muddled.
Simon London: You were just talking about the seven-step process. That’s what’s written down in the book, but it’s a very McKinsey process as well. Without getting too deep into the weeds, let’s go through the steps, one by one. You were just talking about problem definition as being a particularly important thing to get right first. That’s the first step. Hugo, tell us about that.
Hugo Sarrazin: It is surprising how often people jump past this step and make a bunch of assumptions. The most powerful thing is to step back and ask the basic questions—“What are we trying to solve? What are the constraints that exist? What are the dependencies?” Let’s make those explicit and really push the thinking and defining. At McKinsey, we spend an enormous amount of time in writing that little statement, and the statement, if you’re a logic purist, is great. You debate. “Is it an ‘or’? Is it an ‘and’? What’s the action verb?” Because all these specific words help you get to the heart of what matters.
Simon London: So this is a concise problem statement.
Hugo Sarrazin: Yeah. It’s not like “Can we grow in Japan?” That’s interesting, but it is “What, specifically, are we trying to uncover in the growth of a product in Japan? Or a segment in Japan? Or a channel in Japan?” When you spend an enormous amount of time, in the first meeting of the different stakeholders, debating this and having different people put forward what they think the problem definition is, you realize that people have completely different views of why they’re here. That, to me, is the most important step.
Charles Conn: I would agree with that. For me, the problem context is critical. When we understand “What are the forces acting upon your decision maker? How quickly is the answer needed? With what precision is the answer needed? Are there areas that are off limits or areas where we would particularly like to find our solution? Is the decision maker open to exploring other areas?” then you not only become more efficient, and move toward what we call the critical path in problem solving, but you also make it so much more likely that you’re not going to waste your time or your decision maker’s time.
How often do especially bright young people run off with half of the idea about what the problem is and start collecting data and start building models—only to discover that they’ve really gone off half-cocked.
Hugo Sarrazin: Yeah.
Charles Conn: And in the wrong direction.
Simon London: OK. So step one—and there is a real art and a structure to it—is define the problem. Step two, Charles?
Charles Conn: My favorite step is step two, which is to use logic trees to disaggregate the problem. Every problem we’re solving has some complexity and some uncertainty in it. The only way that we can really get our team working on the problem is to take the problem apart into logical pieces.
What we find, of course, is that the way to disaggregate the problem often gives you an insight into the answer to the problem quite quickly. I love to do two or three different cuts at it, each one giving a bit of a different insight into what might be going wrong. By doing sensible disaggregations, using logic trees, we can figure out which parts of the problem we should be looking at, and we can assign those different parts to team members.
Simon London: What’s a good example of a logic tree on a sort of relatable problem?
Charles Conn: Maybe the easiest one is the classic profit tree. Almost in every business that I would take a look at, I would start with a profit or return-on-assets tree. In its simplest form, you have the components of revenue, which are price and quantity, and the components of cost, which are unit cost and quantity. Each of those can be broken out. Cost can be broken into variable cost and fixed cost. The components of price can be broken into what your pricing scheme is. That simple tree often provides insight into what’s going on in a business or what the difference is between that business and the competitors.
If we add the leg, which is “What’s the asset base or investment element?”—so profit divided by assets—then we can ask the question “Is the business using its investments sensibly?” whether that’s in stores or in manufacturing or in transportation assets. I hope we can see just how simple this is, even though we’re describing it in words.
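As a rough illustration of the disaggregation described above, the profit and return-on-assets tree can be written as a few lines of Python. All figures and variable names below are invented for the example, not taken from the podcast.

```python
# Toy profit / return-on-assets tree, following the decomposition described above.
# All figures are made-up illustrative values.
price_per_unit = 25.0           # revenue branch: price x quantity
units_sold = 10_000
variable_cost_per_unit = 14.0   # cost branch: variable cost x quantity + fixed cost
fixed_cost = 60_000.0
asset_base = 400_000.0          # extra leg: profit / assets

revenue = price_per_unit * units_sold
cost = variable_cost_per_unit * units_sold + fixed_cost
profit = revenue - cost
return_on_assets = profit / asset_base

print(f"revenue={revenue:,.0f}  cost={cost:,.0f}  profit={profit:,.0f}")
print(f"return on assets = {return_on_assets:.1%}")
```

Each variable here is a branch that could itself be disaggregated further (for example, splitting units sold by channel or region), which is how the tree guides where to look next.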
When I went to work with Gordon Moore at the Moore Foundation, the problem that he asked us to look at was “How can we save Pacific salmon?” Now, that sounds like an impossible question, but it was amenable to precisely the same type of disaggregation and allowed us to organize what became a 15-year effort to improve the likelihood of good outcomes for Pacific salmon.
Simon London: Now, is there a danger that your logic tree can be impossibly large? This, I think, brings us onto the third step in the process, which is that you have to prioritize.
Charles Conn: Absolutely. The third step, which we also emphasize, along with good problem definition, is rigorous prioritization—we ask the questions “How important is this lever or this branch of the tree in the overall outcome that we seek to achieve? How much can I move that lever?” Obviously, we try and focus our efforts on ones that have a big impact on the problem and the ones that we have the ability to change. With salmon, ocean conditions turned out to be a big lever, but not one that we could adjust. We focused our attention on fish habitats and fish-harvesting practices, which were big levers that we could affect.
People spend a lot of time arguing about branches that are either not important or that none of us can change. We see it in the public square. When we deal with questions at the policy level—“Should you support the death penalty?” “How do we affect climate change?” “How can we uncover the causes and address homelessness?”—it’s even more important that we’re focusing on levers that are big and movable.
Simon London: Let’s move swiftly on to step four. You’ve defined your problem, you disaggregate it, you prioritize where you want to analyze—what you want to really look at hard. Then you got to the work plan. Now, what does that mean in practice?
Hugo Sarrazin: Depending on what you’ve prioritized, there are many things you could do. It could be breaking the work among the team members so that people have a clear piece of the work to do. It could be defining the specific analyses that need to get done and executed, and being clear on time lines. There’s always a level-one answer, there’s a level-two answer, there’s a level-three answer. Without being too flippant, I can solve any problem during a good dinner with wine. It won’t have a whole lot of backing.
Simon London: Not going to have a lot of depth to it.
Hugo Sarrazin: No, but it may be useful as a starting point. If the stakes are not that high, that could be OK. If it’s really high stakes, you may need level three and have the whole model validated in three different ways. You need to find a work plan that reflects the level of precision, the time frame you have, and the stakeholders you need to bring along in the exercise.
Charles Conn: I love the way you’ve described that, because, again, some people think of problem solving as a linear thing, but of course what’s critical is that it’s iterative. As you say, you can solve the problem in one day or even one hour.
Charles Conn: We encourage our teams everywhere to do that. We call it the one-day answer or the one-hour answer. In work planning, we’re always iterating. Every time you see a 50-page work plan that stretches out to three months, you know it’s wrong. It will be outmoded very quickly by that learning process that you described. Iterative problem solving is a critical part of this. Sometimes, people think work planning sounds dull, but it isn’t. It’s how we know what’s expected of us and when we need to deliver it and how we’re progressing toward the answer. It’s also the place where we can deal with biases. Bias is a feature of every human decision-making process. If we design our team interactions intelligently, we can avoid the worst sort of biases.
Simon London: Here we’re talking about cognitive biases primarily, right? It’s not that I’m biased against you because of your accent or something. These are the cognitive biases that behavioral sciences have shown we all carry around, things like anchoring, overoptimism—these kinds of things.
Both: Yeah.
Charles Conn: Availability bias is the one that I’m always alert to. You think you’ve seen the problem before, and therefore what’s available is your previous conception of it—and we have to be most careful about that. In any human setting, we also have to be careful about biases that are based on hierarchies, sometimes called sunflower bias. I’m sure, Hugo, with your teams, you make sure that the youngest team members speak first. Not the oldest team members, because it’s easy for people to look at who’s senior and alter their own creative approaches.
Hugo Sarrazin: It’s helpful, at that moment—if someone is asserting a point of view—to ask the question “This was true in what context?” You’re trying to apply something that worked in one context to a different one. That can be deadly if the context has changed, and that’s why organizations struggle to change. You promote all these people because they did something that worked well in the past, and then there’s a disruption in the industry, and they keep doing what got them promoted even though the context has changed.
Simon London: Right. Right.
Hugo Sarrazin: So it’s the same thing in problem solving.
Charles Conn: And it’s why diversity in our teams is so important. It’s one of the best things about the world that we’re in now. We’re likely to have people from different socioeconomic, ethnic, and national backgrounds, each of whom sees problems from a slightly different perspective. It is therefore much more likely that the team will uncover a truly creative and clever approach to problem solving.
Simon London: Let’s move on to step five. You’ve done your work plan. Now you’ve actually got to do the analysis. The thing that strikes me here is that the range of tools that we have at our disposal now, of course, is just huge, particularly with advances in computation, advanced analytics. There’s so many things that you can apply here. Just talk about the analysis stage. How do you pick the right tools?
Charles Conn: For me, the most important thing is that we start with simple heuristics and explanatory statistics before we go off and use the big-gun tools. We need to understand the shape and scope of our problem before we start applying these massive and complex analytical approaches.
Simon London: Would you agree with that?
Hugo Sarrazin: I agree. I think there are so many wonderful heuristics. You need to start there before you go deep into the modeling exercise. There’s an interesting dynamic that’s happening, though. In some cases, for some types of problems, it is even better to set yourself up to maximize your learning. Your problem-solving methodology is test and learn, test and learn, test and learn, and iterate. That is a heuristic in itself, the A/B testing that is used in many parts of the world. So that’s a problem-solving methodology. It’s nothing different. It just uses technology and feedback loops in a fast way. The other one is exploratory data analysis. When you’re dealing with a large-scale problem, and there’s so much data, I can get to the heuristics that Charles was talking about through very clever visualization of data.
You test with your data. You need to set up an environment to do so, but don’t get caught up in neural-network modeling immediately. You’re testing, you’re checking—“Is the data right? Is it sound? Does it make sense?”—before you launch too far.
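As a hypothetical sketch of the test-and-learn (A/B testing) approach described above, a simple comparison of two variants can be reduced to a two-proportion z-test. The conversion numbers and the 0.05 threshold are illustrative assumptions, not anything from the podcast.

```python
# Minimal A/B "test and learn" check: compare conversion rates of two variants
# with a two-proportion z-test (illustrative numbers only).
from math import sqrt, erf

def two_proportion_z(conversions_a, visitors_a, conversions_b, visitors_b):
    p_a, p_b = conversions_a / visitors_a, conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(conversions_a=120, visitors_a=2400,   # variant A: 5.0%
                        conversions_b=156, visitors_b=2400)   # variant B: 6.5%
print(f"z = {z:.2f}, p = {p:.4f} -> {'iterate on B' if p < 0.05 else 'keep testing'}")
```

The loop is then simply: run the variant, evaluate, adjust, and test again, which is the fast feedback cycle Sarrazin describes.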
Simon London: You do hear these ideas—that if you have a big enough data set and enough algorithms, they’re going to find things that you just wouldn’t have spotted, find solutions that maybe you wouldn’t have thought of. Does machine learning sort of revolutionize the problem-solving process? Or are these actually just other tools in the toolbox for structured problem solving?
Charles Conn: It can be revolutionary. There are some areas in which the pattern recognition of large data sets and good algorithms can help us see things that we otherwise couldn’t see. But I do think it’s terribly important we don’t think that this particular technique is a substitute for superb problem solving, starting with good problem definition. Many people use machine learning without understanding algorithms that themselves can have biases built into them. Just as 20 years ago, when we were doing statistical analysis, we knew that we needed good model definition, we still need a good understanding of our algorithms and really good problem definition before we launch off into big data sets and unknown algorithms.
Simon London: Step six. You’ve done your analysis.
Charles Conn: I take six and seven together, and this is the place where young problem solvers often make a mistake. They’ve got their analysis, and they assume that’s the answer, and of course it isn’t the answer. The ability to synthesize the pieces that came out of the analysis and begin to weave those into a story that helps people answer the question “What should I do?” This is back to where we started. If we can’t synthesize, and we can’t tell a story, then our decision maker can’t find the answer to “What should I do?”
Simon London: But, again, these final steps are about motivating people to action, right?
Charles Conn: Yeah.
Simon London: I am slightly torn about the nomenclature of problem solving because it’s on paper, right? Until you motivate people to action, you actually haven’t solved anything.
Charles Conn: I love this question because I think decision-making theory, without a bias to action, is a waste of time. Everything in how I approach this is to help people take action that makes the world better.
Simon London: Hence, these are absolutely critical steps. If you don’t do this well, you’ve just got a bunch of analysis.
Charles Conn: We end up in exactly the same place where we started, which is people speaking across each other, past each other in the public square, rather than actually working together, shoulder to shoulder, to crack these important problems.
Simon London: In the real world, we have a lot of uncertainty—arguably, increasing uncertainty. How do good problem solvers deal with that?
Hugo Sarrazin: At every step of the process. In the problem definition, when you’re defining the context, you need to understand those sources of uncertainty and whether they’re important or not important. It becomes important in the definition of the tree.
You need to think carefully about the branches of the tree that are more certain and less certain as you define them. They don’t have equal weight just because they’ve got equal space on the page. Then, when you’re prioritizing, your prioritization approach may put more emphasis on things that have low probability but huge impact—or, vice versa, may put a lot of priority on things that are very likely and, hopefully, have a reasonable impact. You can introduce that along the way. When you come back to the synthesis, you just need to be nuanced about what you’re understanding, the likelihood.
Often, people lack humility in the way they make their recommendations: “This is the answer.” They’re very precise, and I think we would all be well-served to say, “This is a likely answer under the following sets of conditions” and then make the level of uncertainty clearer, if that is appropriate. It doesn’t mean you’re always in the gray zone; it doesn’t mean you don’t have a point of view. It just means that you can be explicit about the certainty of your answer when you make that recommendation.
Simon London: So it sounds like there is an underlying principle: “Acknowledge and embrace the uncertainty. Don’t pretend that it isn’t there. Be very clear about what the uncertainties are up front, and then build that into every step of the process.”
Hugo Sarrazin: Every step of the process.
Simon London: Yeah. We have just walked through a particular structured methodology for problem solving. But, of course, this is not the only structured methodology for problem solving. One that is also very well-known is design thinking, which comes at things very differently. So, Hugo, I know you have worked with a lot of designers. Just give us a very quick summary. Design thinking—what is it, and how does it relate?
Hugo Sarrazin: It starts with an incredible amount of empathy for the user and uses that to define the problem. It does pause and go out in the wild and spend an enormous amount of time seeing how people interact with objects, seeing the experience they’re getting, seeing the pain points or joy—and uses that to infer and define the problem.
Simon London: Problem definition, but out in the world.
Hugo Sarrazin: With an enormous amount of empathy. There’s a huge emphasis on empathy. Traditional, more classic problem solving is you define the problem based on an understanding of the situation. This one almost presupposes that we don’t know the problem until we go see it. The second thing is you need to come up with multiple scenarios or answers or ideas or concepts, and there’s a lot of divergent thinking initially. That’s slightly different, versus the prioritization, but not for long. Eventually, you need to kind of say, “OK, I’m going to converge again.” Then you go and you bring things back to the customer and get feedback and iterate. Then you rinse and repeat, rinse and repeat. There’s a lot of tactile building, along the way, of prototypes and things like that. It’s very iterative.
Simon London: So, Charles, are these complements or are these alternatives?
Charles Conn: I think they’re entirely complementary, and I think Hugo’s description is perfect. When we do problem definition well in classic problem solving, we are demonstrating the kind of empathy, at the very beginning of our problem, that design thinking asks us to approach. When we ideate—and that’s very similar to the disaggregation, prioritization, and work-planning steps—we do precisely the same thing, and often we use contrasting teams, so that we do have divergent thinking. The best teams allow divergent thinking to bump them off whatever their initial biases in problem solving are. For me, design thinking gives us a constant reminder of creativity, empathy, and the tactile nature of problem solving, but it’s absolutely complementary, not alternative.
Simon London: I think, in a world of cross-functional teams, an interesting question is do people with design-thinking backgrounds really work well together with classical problem solvers? How do you make that chemistry happen?
Hugo Sarrazin: Yeah, it is not easy when people have spent an enormous amount of time steeped in design thinking or user-centric design, whichever word you want to use. If the person who’s applying classic problem-solving methodology is very rigid and mechanical in the way they’re doing it, there could be an enormous amount of tension. If there’s not clarity in the role and not clarity in the process, I think having the two together can be, sometimes, problematic.
The second thing that happens often is that the artifacts the two methodologies try to gravitate toward can be different. Classic problem solving often gravitates toward a model; design thinking migrates toward a prototype. Rather than writing a big deck with all my supporting evidence, they’ll bring an example, a thing, and that feels different. Then you spend your time differently to achieve those two end products, so that’s another source of friction.
Now, I still think it can be an incredibly powerful thing to have the two—if there are the right people with the right mind-set, if there is a team that is explicit about the roles, if we’re clear about the kind of outcomes we are attempting to bring forward. There’s an enormous amount of collaborativeness and respect.
Simon London: But they have to respect each other’s methodology and be prepared to flex, maybe, a little bit, in how this process is going to work.
Hugo Sarrazin: Absolutely.
Simon London: The other area where, it strikes me, there could be a little bit of a different sort of friction is this whole concept of the day-one answer, which is what we were just talking about in classical problem solving. Now, you know that this is probably not going to be your final answer, but that’s how you begin to structure the problem. Whereas I would imagine your design thinkers—no, they’re going off to do their ethnographic research and get out into the field, potentially for a long time, before they come back with at least an initial hypothesis.
Hugo Sarrazin: That is a great callout, and that’s another difference. Designers typically will like to soak into the situation and avoid converging too quickly. There’s optionality and exploring different options. There’s a strong belief that keeps the solution space wide enough that you can come up with more radical ideas. If there’s a large design team or many designers on the team, and you come on Friday and say, “What’s our week-one answer?” they’re going to struggle. They’re not going to be comfortable, naturally, to give that answer. It doesn’t mean they don’t have an answer; it’s just not where they are in their thinking process.
Simon London: I think we are, sadly, out of time for today. But Charles and Hugo, thank you so much.
Charles Conn: It was a pleasure to be here, Simon.
Hugo Sarrazin: It was a pleasure. Thank you.
Simon London: And thanks, as always, to you, our listeners, for tuning into this episode of the McKinsey Podcast. If you want to learn more about problem solving, you can find the book, Bulletproof Problem Solving: The One Skill That Changes Everything, online or order it through your local bookstore. To learn more about McKinsey, you can of course find us at McKinsey.com.
Charles Conn is CEO of Oxford Sciences Innovation and an alumnus of McKinsey’s Sydney office. Hugo Sarrazin is a senior partner in the Silicon Valley office, where Simon London, a member of McKinsey Publishing, is also based.
Humanities and Social Sciences Communications, volume 10, Article number: 16 (2023)
Collaborative problem-solving has been widely embraced in classroom instruction for critical thinking, which is regarded both as the core of competency-based curriculum reform in education and as a key competence for learners in the 21st century. However, the effectiveness of collaborative problem-solving in promoting students’ critical thinking remains uncertain. This study presents the major findings of a meta-analysis of 36 studies published in international educational journals during the 21st century, conducted to identify the effectiveness of collaborative problem-solving in promoting students’ critical thinking and to determine, based on evidence, whether and to what extent collaborative problem-solving raises or lowers critical thinking. The findings show that (1) collaborative problem-solving is an effective teaching approach for fostering students’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); (2) with respect to the dimensions of critical thinking, collaborative problem-solving significantly enhances students’ attitudinal tendencies (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas it improves students’ cognitive skills less markedly, with only an upper-middle impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]); and (3) teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all affect critical thinking and can be viewed as important moderating factors in its development. On the basis of these results, recommendations are made for further research and instruction to better support students’ critical thinking in the context of collaborative problem-solving.
Introduction.
Although critical thinking has a long history in research, the concept of critical thinking, which is regarded as an essential competence for learners in the 21st century, has recently attracted more attention from researchers and teaching practitioners (National Research Council, 2012 ). Critical thinking should be the core of curriculum reform based on key competencies in the field of education (Peng and Deng, 2017 ) because students with critical thinking can not only understand the meaning of knowledge but also effectively solve practical problems in real life even after knowledge is forgotten (Kek and Huijser, 2011 ). The definition of critical thinking is not universal (Ennis, 1989 ; Castle, 2009 ; Niu et al., 2013 ). In general, the definition of critical thinking is a self-aware and self-regulated thought process (Facione, 1990 ; Niu et al., 2013 ). It refers to the cognitive skills needed to interpret, analyze, synthesize, reason, and evaluate information as well as the attitudinal tendency to apply these abilities (Halpern, 2001 ). The view that critical thinking can be taught and learned through curriculum teaching has been widely supported by many researchers (e.g., Kuncel, 2011 ; Leng and Lu, 2020 ), leading to educators’ efforts to foster it among students. In the field of teaching practice, there are three types of courses for teaching critical thinking (Ennis, 1989 ). The first is an independent curriculum in which critical thinking is taught and cultivated without involving the knowledge of specific disciplines; the second is an integrated curriculum in which critical thinking is integrated into the teaching of other disciplines as a clear teaching goal; and the third is a mixed curriculum in which critical thinking is taught in parallel to the teaching of other disciplines for mixed teaching training. Furthermore, numerous measuring tools have been developed by researchers and educators to measure critical thinking in the context of teaching practice. These include standardized measurement tools, such as WGCTA, CCTST, CCTT, and CCTDI, which have been verified by repeated experiments and are considered effective and reliable by international scholars (Facione and Facione, 1992 ). In short, descriptions of critical thinking, including its two dimensions of attitudinal tendency and cognitive skills, different types of teaching courses, and standardized measurement tools provide a complex normative framework for understanding, teaching, and evaluating critical thinking.
Cultivating critical thinking in curriculum teaching can start with a problem, and one of the most popular critical thinking instructional approaches is problem-based learning (Liu et al., 2020 ). Duch et al. ( 2001 ) noted that problem-based learning in group collaboration is progressive active learning, which can improve students’ critical thinking and problem-solving skills. Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning, which takes learners as the center of the learning process and uses problems with poor structure in real-world situations as the starting point for the learning process (Liang et al., 2017 ). Students learn the knowledge needed to solve problems in a collaborative group, reach a consensus on problems in the field, and form solutions through social cooperation methods, such as dialogue, interpretation, questioning, debate, negotiation, and reflection, thus promoting the development of learners’ domain knowledge and critical thinking (Cindy, 2004 ; Liang et al., 2017 ).
Collaborative problem-solving has been widely used in the teaching practice of critical thinking, and several studies have attempted to conduct a systematic review and meta-analysis of the empirical literature on critical thinking from various perspectives. However, little attention has been paid to the impact of collaborative problem-solving on critical thinking. Therefore, the best approach for developing and enhancing critical thinking throughout collaborative problem-solving is to examine how to implement critical thinking instruction; however, this issue is still unexplored, which means that many teachers are incapable of better instructing critical thinking (Leng and Lu, 2020 ; Niu et al., 2013 ). For example, Huber ( 2016 ) provided the meta-analysis findings of 71 publications on gaining critical thinking over various time frames in college with the aim of determining whether critical thinking was truly teachable. These authors found that learners significantly improve their critical thinking while in college and that critical thinking differs with factors such as teaching strategies, intervention duration, subject area, and teaching type. The usefulness of collaborative problem-solving in fostering students’ critical thinking, however, was not determined by this study, nor did it reveal whether there existed significant variations among the different elements. A meta-analysis of 31 pieces of educational literature was conducted by Liu et al. ( 2020 ) to assess the impact of problem-solving on college students’ critical thinking. These authors found that problem-solving could promote the development of critical thinking among college students and proposed establishing a reasonable group structure for problem-solving in a follow-up study to improve students’ critical thinking. Additionally, previous empirical studies have reached inconclusive and even contradictory conclusions about whether and to what extent collaborative problem-solving increases or decreases critical thinking levels. As an illustration, Yang et al. ( 2008 ) carried out an experiment on the integrated curriculum teaching of college students based on a web bulletin board with the goal of fostering participants’ critical thinking in the context of collaborative problem-solving. These authors’ research revealed that through sharing, debating, examining, and reflecting on various experiences and ideas, collaborative problem-solving can considerably enhance students’ critical thinking in real-life problem situations. In contrast, collaborative problem-solving had a positive impact on learners’ interaction and could improve learning interest and motivation but could not significantly improve students’ critical thinking when compared to traditional classroom teaching, according to research by Naber and Wyatt ( 2014 ) and Sendag and Odabasi ( 2009 ) on undergraduate and high school students, respectively.
The above studies show that there is inconsistency regarding the effectiveness of collaborative problem-solving in promoting students’ critical thinking. Therefore, it is essential to conduct a thorough and trustworthy review to determine whether and to what degree collaborative problem-solving results in a rise or decrease in critical thinking. Meta-analysis is a quantitative approach used to examine data from separate studies that all focus on the same research topic. It characterizes the overall impact by pooling the effect sizes of numerous individual quantitative studies, reducing the uncertainty introduced by any single study and producing more conclusive findings (Lipsey and Wilson, 2001).
This paper used a meta-analytic approach and carried out a meta-analysis to examine the effectiveness of collaborative problem-solving in promoting students’ critical thinking in order to make a contribution to both research and practice. The following research questions were addressed by this meta-analysis:
What is the overall effect size of collaborative problem-solving in promoting students’ critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills)?
If the effects of the various experimental designs in the included studies are heterogeneous, how do different moderating variables account for the disparities between the study conclusions?
This research followed the strict procedures (e.g., database searching, identification, screening, eligibility, merging, duplicate removal, and analysis of included studies) of the meta-analysis approach proposed by Cooper (2010). The relevant empirical research published in international educational journals during the 21st century was analyzed using Rev-Man 5.4. The consistency of the data extracted independently by two researchers was tested with Cohen’s kappa coefficient, and a publication bias test and a heterogeneity test were run on the sample data to ascertain the quality of this meta-analysis.
There were three stages to the data collection process for this meta-analysis, as shown in Fig. 1 , which shows the number of articles included and eliminated during the selection process based on the statement and study eligibility criteria.
This flowchart shows the number of records identified, included and excluded in the article.
First, the databases used to systematically search for relevant articles were the journal papers of the Web of Science Core Collection and the Chinese Core source journal, as well as the Chinese Social Science Citation Index (CSSCI) source journal papers included in CNKI. These databases were selected because they are credible platforms that are sources of scholarly and peer-reviewed information with advanced search tools and contain literature relevant to the subject of our topic from reliable researchers and experts. The search string with the Boolean operator used in the Web of Science was “TS = (((“critical thinking” or “ct” and “pretest” or “posttest”) or (“critical thinking” or “ct” and “control group” or “quasi experiment” or “experiment”)) and (“collaboration” or “collaborative learning” or “CSCL”) and (“problem solving” or “problem-based learning” or “PBL”))”. The research area was “Education Educational Research”, and the search period was “January 1, 2000, to December 30, 2021”. A total of 412 papers were obtained. The search string with the Boolean operator used in the CNKI was “SU = (‘critical thinking’*‘collaboration’ + ‘critical thinking’*‘collaborative learning’ + ‘critical thinking’*‘CSCL’ + ‘critical thinking’*‘problem solving’ + ‘critical thinking’*‘problem-based learning’ + ‘critical thinking’*‘PBL’ + ‘critical thinking’*‘problem oriented’) AND FT = (‘experiment’ + ‘quasi experiment’ + ‘pretest’ + ‘posttest’ + ‘empirical study’)” (translated into Chinese when searching). A total of 56 studies were found throughout the search period of “January 2000 to December 2021”. From the databases, all duplicates and retractions were eliminated before exporting the references into Endnote, a program for managing bibliographic references. In all, 466 studies were found.
Second, the studies that matched the inclusion and exclusion criteria for the meta-analysis were chosen by two researchers after they had reviewed the abstracts and titles of the gathered articles, yielding a total of 126 studies.
Third, two researchers thoroughly reviewed each included article’s whole text in accordance with the inclusion and exclusion criteria. Meanwhile, a snowball search was performed using the references and citations of the included articles to ensure complete coverage of the articles. Ultimately, 36 articles were kept.
Two researchers worked together to carry out this entire process, and a consensus rate of almost 94.7% was reached after discussion and negotiation to clarify any emerging differences.
Since not all the retrieved studies matched the criteria for this meta-analysis, eligibility criteria for both inclusion and exclusion were developed as follows:
The publication language of the included studies was limited to English and Chinese, and the full text could be obtained. Articles that did not meet the publication language and articles not published between 2000 and 2021 were excluded.
The research design of the included studies must be empirical and quantitative studies that can assess the effect of collaborative problem-solving on the development of critical thinking. Articles that could not identify the causal mechanisms by which collaborative problem-solving affects critical thinking, such as review articles and theoretical articles, were excluded.
The research method of the included studies must feature a randomized control experiment or a quasi-experiment, or a natural experiment, which have a higher degree of internal validity with strong experimental designs and can all plausibly provide evidence that critical thinking and collaborative problem-solving are causally related. Articles with non-experimental research methods, such as purely correlational or observational studies, were excluded.
The participants of the included studies were only students in school, including K-12 students and college students. Articles in which the participants were non-school students, such as social workers or adult learners, were excluded.
The research results of the included studies must mention definite signs that may be utilized to gauge critical thinking’s impact (e.g., sample size, mean value, or standard deviation). Articles that lacked specific measurement indicators for critical thinking and could not calculate the effect size were excluded.
In order to perform a meta-analysis, it is necessary to collect the most important information from the articles, codify that information’s properties, and convert descriptive data into quantitative data. Therefore, this study designed a data coding template (see Table 1 ). Ultimately, 16 coding fields were retained.
The designed data-coding template consisted of three pieces of information. Basic information about the papers was included in the descriptive information: the publishing year, author, serial number, and title of the paper.
The variable information for the experimental design had three variables: the independent variable (instruction method), the dependent variable (critical thinking), and the moderating variable (learning stage, teaching type, intervention duration, learning scaffold, group size, measuring tool, and subject area). Depending on the topic of this study, the intervention strategy, as the independent variable, was coded into collaborative and non-collaborative problem-solving. The dependent variable, critical thinking, was coded as a cognitive skill and an attitudinal tendency. And seven moderating variables were created by grouping and combining the experimental design variables discovered within the 36 studies (see Table 1 ), where learning stages were encoded as higher education, high school, middle school, and primary school or lower; teaching types were encoded as mixed courses, integrated courses, and independent courses; intervention durations were encoded as 0–1 weeks, 1–4 weeks, 4–12 weeks, and more than 12 weeks; group sizes were encoded as 2–3 persons, 4–6 persons, 7–10 persons, and more than 10 persons; learning scaffolds were encoded as teacher-supported learning scaffold, technique-supported learning scaffold, and resource-supported learning scaffold; measuring tools were encoded as standardized measurement tools (e.g., WGCTA, CCTT, CCTST, and CCTDI) and self-adapting measurement tools (e.g., modified or made by researchers); and subject areas were encoded according to the specific subjects used in the 36 included studies.
The data information contained three metrics for measuring critical thinking: sample size, average value, and standard deviation. It is vital to remember that studies with different experimental designs frequently require different formulas to determine the effect size; this paper used the standardized mean difference (SMD) formula proposed by Morris (2008, p. 369; see Supplementary Table S3).
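For readers who want to see what such a computation looks like, below is a minimal sketch of an SMD for a pretest-posttest-control design in the spirit of Morris (2008). This is one common form of that family of formulas, and the numbers are invented, so it should be read as an illustration rather than the exact calculation used in this paper (see Supplementary Table S3 for the formula actually applied).

```python
# Sketch of a standardized mean difference (SMD) for a pretest-posttest-control
# design: the treatment group's pre-post change minus the control group's
# change, scaled by the pooled pretest SD and a small-sample correction.
# Numbers are illustrative only.
from math import sqrt

def smd_pre_post_control(m_pre_t, m_post_t, sd_pre_t, n_t,
                         m_pre_c, m_post_c, sd_pre_c, n_c):
    sd_pre_pooled = sqrt(((n_t - 1) * sd_pre_t**2 + (n_c - 1) * sd_pre_c**2)
                         / (n_t + n_c - 2))
    correction = 1 - 3 / (4 * (n_t + n_c - 2) - 1)   # small-sample bias correction
    raw = ((m_post_t - m_pre_t) - (m_post_c - m_pre_c)) / sd_pre_pooled
    return correction * raw

es = smd_pre_post_control(m_pre_t=62.0, m_post_t=74.0, sd_pre_t=10.0, n_t=40,
                          m_pre_c=61.5, m_post_c=66.0, sd_pre_c=11.0, n_c=38)
print(f"effect size (SMD) = {es:.2f}")
```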
According to the data coding template (see Table 1 ), the 36 papers’ information was retrieved by two researchers, who then entered them into Excel (see Supplementary Table S1 ). The results of each study were extracted separately in the data extraction procedure if an article contained numerous studies on critical thinking, or if a study assessed different critical thinking dimensions. For instance, Tiwari et al. ( 2010 ) used four time points, which were viewed as numerous different studies, to examine the outcomes of critical thinking, and Chen ( 2013 ) included the two outcome variables of attitudinal tendency and cognitive skills, which were regarded as two studies. After discussion and negotiation during data extraction, the two researchers’ consistency test coefficients were roughly 93.27%. Supplementary Table S2 details the key characteristics of the 36 included articles with 79 effect quantities, including descriptive information (e.g., the publishing year, author, serial number, and title of the paper), variable information (e.g., independent variables, dependent variables, and moderating variables), and data information (e.g., mean values, standard deviations, and sample size). Following that, testing for publication bias and heterogeneity was done on the sample data using the Rev-Man 5.4 software, and then the test results were used to conduct a meta-analysis.
When the sample of studies included in a meta-analysis does not accurately reflect the general status of research on the relevant subject, the research is said to exhibit publication bias, which may affect the reliability and accuracy of the meta-analysis. The meta-analysis therefore needs to check the sample data for publication bias (Stewart et al., 2006). A popular method for checking publication bias is the funnel plot; publication bias is unlikely when the data points are distributed evenly on either side of the average effect size and concentrated toward the top of the plot. The data are evenly dispersed within the upper portion of the funnel plot associated with this analysis (see Fig. 2), indicating that publication bias is unlikely in this situation.
This funnel plot shows the result of publication bias of 79 effect quantities across 36 studies.
To select the appropriate effect models for the meta-analysis, one might use the results of a heterogeneity test on the data effect sizes. In a meta-analysis, it is common practice to gauge the degree of data heterogeneity using the I² value, and I² ≥ 50% is typically understood to denote medium-high heterogeneity, which calls for the adoption of a random effect model; if not, a fixed effect model ought to be applied (Lipsey and Wilson, 2001). The findings of the heterogeneity test in this paper (see Table 2) revealed that I² was 86% and displayed significant heterogeneity (P < 0.01). To ensure accuracy and reliability, the overall effect size ought to be calculated utilizing the random effect model.
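As a rough illustration of the decision rule described above, the sketch below runs a textbook heterogeneity check (Cochran's Q and I²) and a DerSimonian-Laird random-effects pooling on invented effect sizes; it is not the paper's actual Rev-Man output, only a minimal example of the procedure.

```python
# Toy heterogeneity check and random-effects pooling (DerSimonian-Laird),
# illustrating the I^2 >= 50% decision rule; all data are invented.
effects   = [0.30, 0.95, 1.40, 0.55, 1.10]   # per-study effect sizes
variances = [0.04, 0.06, 0.09, 0.05, 0.07]   # per-study sampling variances

w_fixed = [1 / v for v in variances]
pooled_fixed = sum(w * e for w, e in zip(w_fixed, effects)) / sum(w_fixed)

# Cochran's Q and I^2
Q = sum(w * (e - pooled_fixed) ** 2 for w, e in zip(w_fixed, effects))
df = len(effects) - 1
I2 = max(0.0, (Q - df) / Q) * 100

# Between-study variance (tau^2) and random-effects weights
C = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (Q - df) / C)
w_random = [1 / (v + tau2) for v in variances]
pooled_random = sum(w * e for w, e in zip(w_random, effects)) / sum(w_random)

print(f"Q = {Q:.2f}, I^2 = {I2:.0f}%")
print("model:", "random effects" if I2 >= 50 else "fixed effect",
      f"-> pooled ES = {pooled_random if I2 >= 50 else pooled_fixed:.2f}")
```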
This meta-analysis utilized a random effect model to examine 79 effect quantities from 36 studies after accounting for heterogeneity. In accordance with Cohen’s criterion (Cohen, 1992), the analysis results shown in the forest plot of the overall effect (see Fig. 3) make it clear that the overall effect size of collaborative problem-solving is 0.82, which is statistically significant (z = 12.78, P < 0.01, 95% CI [0.69, 0.95]) and indicates that it can encourage learners to practice critical thinking.
This forest plot shows the overall effect size pooled across the 36 studies.
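The standard computation behind a random-effects pooled estimate like the one in Fig. 3 is the DerSimonian–Laird procedure. The sketch below is illustrative only (the paper itself used RevMan 5.4, and the inputs here are hypothetical placeholders); its purpose is simply to make the weighting explicit:

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling.
import numpy as np
from scipy import stats

effects = np.array([0.3, 0.9, 1.4, 0.6, 1.1])
variances = np.array([0.04, 0.06, 0.05, 0.03, 0.07])

w = 1.0 / variances
pooled_fixed = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - pooled_fixed) ** 2)
df = len(effects) - 1
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # between-study variance

w_re = 1.0 / (variances + tau2)                     # random-effects weights
pooled = np.sum(w_re * effects) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
z = pooled / se
p = 2 * (1 - stats.norm.cdf(abs(z)))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)

print(f"pooled ES = {pooled:.2f}, z = {z:.2f}, p = {p:.4f}, "
      f"95% CI = [{ci[0]:.2f}, {ci[1]:.2f}]")
```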
In addition, this study examined the two dimensions of critical thinking separately to better understand the precise contributions that collaborative problem-solving makes to its growth. The findings (see Table 3) indicate that collaborative problem-solving improves both cognitive skills (ES = 0.70) and attitudinal tendency (ES = 1.17), with significant intergroup differences (χ² = 7.95, P < 0.01). Although collaborative problem-solving improves both dimensions, the gains in students' attitudinal tendency are considerably more pronounced, with a large and significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas the gains in learners' cognitive skills are more modest and only slightly above average (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).
A two-tailed test on all 79 effect quantities in the forest plot revealed significant heterogeneity (I² = 86%, z = 12.78, P < 0.01), indicating differences between effect sizes that may have been influenced by moderating factors beyond sampling error. Subgroup analysis was therefore used to explore possible moderating factors that might produce this heterogeneity, namely the learning stage, learning scaffold, teaching type, group size, intervention duration, measuring tool, and subject area covered by the 36 experimental designs, in order to further identify the key factors that influence critical thinking. The findings (see Table 4) indicate that the various moderating factors all have beneficial effects on critical thinking. Specifically, the subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), learning scaffold (χ² = 9.03, P < 0.01), and teaching type (χ² = 7.20, P < 0.05) are all significant moderators that can be applied to support the cultivation of critical thinking. However, since the learning stage and the measuring tool did not show significant intergroup differences (χ² = 3.15, P = 0.21 > 0.05, and χ² = 0.08, P = 0.78 > 0.05), we are unable to explain why these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving. The precise outcomes are as follows (a computational sketch of the between-groups comparison appears after these results):
Various learning stages influenced critical thinking positively, without significant intergroup differences (χ² = 3.15, P = 0.21 > 0.05). High school had the largest effect size (ES = 1.36, P < 0.01), followed by higher education (ES = 0.78, P < 0.01) and middle school (ES = 0.73, P < 0.01). These results show that, despite the learning stage's beneficial influence on cultivating learners' critical thinking, we are unable to explain why it is essential for cultivating critical thinking in the context of collaborative problem-solving.
Different teaching types had varying degrees of positive impact on critical thinking, with significant intergroup differences (χ² = 7.20, P < 0.05). The effect sizes ranked as follows: mixed courses (ES = 1.34, P < 0.01), integrated courses (ES = 0.81, P < 0.01), and independent courses (ES = 0.27, P < 0.01). These results indicate that mixed courses are the most effective teaching type for cultivating critical thinking through collaborative problem-solving.
Various intervention durations significantly improved critical thinking, with significant intergroup differences (χ² = 12.18, P < 0.01). The effect sizes tended to increase with longer intervention durations, and the improvement in critical thinking reached a significant level (ES = 0.85, P < 0.01) after more than 12 weeks of training. These findings indicate that intervention duration and the impact on critical thinking are positively correlated: the longer the intervention, the greater the effect.
Different learning scaffolds influenced critical thinking positively, with significant intergroup differences (χ² = 9.03, P < 0.01). The resource-supported learning scaffold (ES = 0.69, P < 0.01) and the technique-supported learning scaffold (ES = 0.63, P < 0.01) both attained a medium-to-high level of impact, while the teacher-supported learning scaffold (ES = 0.92, P < 0.01) displayed a high and significant impact. These results show that the teacher-supported learning scaffold has the greatest effect on cultivating critical thinking.
Various group sizes influenced critical thinking positively, and the intergroup differences were statistically significant (χ² = 8.77, P < 0.05). The effect on critical thinking showed a general declining trend as group size increased: the overall effect size for groups of 2–3 people was the largest (ES = 0.99, P < 0.01), and when the group size exceeded 7 people, the improvement in critical thinking was only at the lower-middle level (ES < 0.5, P < 0.01). These results show that the impact on critical thinking is negatively associated with group size; as groups grow larger, the overall impact declines.
Various measuring tools influenced critical thinking positively, but without significant intergroup differences (χ² = 0.08, P = 0.78 > 0.05). The self-adapting measurement tools obtained an upper-medium level of effect (ES = 0.78), whereas the overall effect size of the standardized measurement tools was the largest, reaching a significant level of effect (ES = 0.84, P < 0.01). These results show that, despite the beneficial influence of the measuring tool on cultivating critical thinking, we are unable to explain why it is crucial in fostering the growth of critical thinking through collaborative problem-solving.
Different subject areas had varying degrees of impact on critical thinking, and the intergroup differences were statistically significant (χ² = 13.36, P < 0.05). Mathematics had the greatest overall impact, reaching a significant level of effect (ES = 1.68, P < 0.01), followed by science (ES = 1.25, P < 0.01) and medical science (ES = 0.87, P < 0.01), both of which also reached a significant level of effect. Programming technology was the least effective (ES = 0.39, P < 0.01), with only a medium-low degree of effect, compared with education (ES = 0.72, P < 0.01) and other fields such as language, art, and the social sciences (ES = 0.58, P < 0.01). These results suggest that scientific fields (e.g., mathematics, science) may be the most effective subject areas for cultivating critical thinking through collaborative problem-solving.
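The between-groups χ² values reported for the moderators above come from comparing subgroup-pooled effects against a grand mean. A minimal, self-contained sketch of that comparison, using fixed-effect pooling within hypothetical subgroups for brevity (the group labels and numbers are placeholders, not the paper's data):

```python
# Minimal sketch of a between-subgroups heterogeneity test (the chi^2 values
# in Table 4): pool each subgroup, then compare the subgroup means.
import numpy as np
from scipy import stats

effects = np.array([1.3, 1.5, 0.7, 0.8, 0.3, 0.2])
variances = np.array([0.05, 0.06, 0.04, 0.05, 0.03, 0.04])
groups = np.array(["mixed", "mixed", "integrated", "integrated",
                   "independent", "independent"])

pooled, weights = [], []
for g in np.unique(groups):
    m = groups == g
    w = 1.0 / variances[m]
    pooled.append(np.sum(w * effects[m]) / np.sum(w))   # subgroup pooled effect
    weights.append(np.sum(w))                           # subgroup total weight

pooled, weights = np.array(pooled), np.array(weights)
grand = np.sum(weights * pooled) / np.sum(weights)
Q_between = np.sum(weights * (pooled - grand) ** 2)     # chi^2 with k-1 df
p = 1 - stats.chi2.cdf(Q_between, df=len(pooled) - 1)
print(f"Q_between = {Q_between:.2f}, p = {p:.3f}")
```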
According to this meta-analysis, using collaborative problem-solving as an intervention strategy in critical thinking instruction has a considerable overall impact on cultivating learners' critical thinking and a favorable promotional effect on both of its dimensions. Several studies have argued that collaborative problem-solving, the most frequently used critical thinking teaching strategy in curriculum instruction, can considerably enhance students' critical thinking (e.g., Liang et al., 2017; Liu et al., 2020; Cindy, 2004), and this meta-analysis provides convergent data support for that view. The findings therefore not only address the first research question regarding the overall effect of collaborative problem-solving on cultivating critical thinking and its two dimensions (i.e., attitudinal tendency and cognitive skills), but also strengthen our confidence in using collaborative problem-solving as an intervention approach in classroom teaching.
Furthermore, the associated improvements in attitudinal tendency are much stronger, whereas the corresponding improvements in cognitive skills are only marginally better. Some studies note that cognitive skill differs from attitudinal tendency in classroom instruction: the former is a key ability cultivated through gradual accumulation, while the latter is an attitude shaped by the teaching context (e.g., a novel and exciting teaching approach, challenging and rewarding tasks) (Halpern, 2001; Wei and Hong, 2022). Collaborative problem-solving as a teaching approach is exciting and interesting, as well as rewarding and challenging, because it places learners at the center and has them examine ill-structured problems in real situations; it can therefore inspire students to fully realize their potential for problem-solving, which significantly improves their attitudinal tendency toward solving problems (Liu et al., 2020). Just as collaborative problem-solving influences attitudinal tendency, attitudinal tendency influences cognitive skill when attempting to solve a problem (Liu et al., 2020; Zhang et al., 2022), and stronger attitudinal tendencies are associated with better learning achievement and cognitive ability in students (Sison, 2008; Zhang et al., 2022). Collaborative problem-solving thus affects critical thinking as a whole as well as its two specific dimensions, and this study illuminates the nuanced links between cognitive skills and attitudinal tendencies. To fully develop students' capacity for critical thinking, future empirical research should pay closer attention to cognitive skills.
To further explore the key factors that influence critical thinking, subgroup analysis was used to examine possible moderating effects that might produce the considerable heterogeneity. The findings show that the moderating factors included in the 36 experimental designs, namely the teaching type, learning stage, group size, learning scaffold, intervention duration, measuring tool, and subject area, can all support the cultivation of critical thinking through collaborative problem-solving. Among them, the effect size differences for the learning stage and the measuring tool are not significant, which does not explain why these two factors are crucial in supporting the cultivation of critical thinking through collaborative problem-solving.
In terms of the learning stage, the various learning stages influenced critical thinking positively but without significant intergroup differences, so we are unable to explain why this factor is crucial in fostering the growth of critical thinking.
Although higher education accounts for 70.89% of the included empirical studies, high school may be the most appropriate learning stage for fostering students' critical thinking through collaborative problem-solving, since it has the largest overall effect size. This phenomenon may be related to students' cognitive development and needs to be examined further in follow-up research.
With regard to teaching type, mixed course teaching may be the best teaching method for cultivating students' critical thinking. Relevant studies have shown that, in actual teaching, if students are trained in thinking methods alone, the methods they learn are isolated and divorced from subject knowledge, which hinders the transfer of those thinking methods; conversely, if students' thinking is trained only within subject teaching, without systematic method training, it is challenging to apply to real-world circumstances (Ruggiero, 2012; Hu and Liu, 2015). Teaching critical thinking as a mixed course, in parallel with other subject teaching, can achieve the best effect on learners' critical thinking, and explicit critical thinking instruction is more effective than less explicit instruction (Bensley and Spero, 2014).
In terms of intervention duration, the overall effect size shows an upward tendency with longer interventions, so intervention duration and the impact on critical thinking are positively correlated. Critical thinking, as a key competency for students in the 21st century, is difficult to improve meaningfully within a brief intervention; instead, it develops over a lengthy period through consistent teaching and the progressive accumulation of knowledge (Halpern, 2001; Hu and Liu, 2015). Future empirical studies should therefore take this constraint into account and examine longer periods of critical thinking instruction.
With regard to group size, a group of 2–3 persons has the highest effect size, and the comprehensive effect size decreases with increasing group size in general. This outcome is in line with other research findings; for example, a group of two to four members has been found most appropriate for collaborative learning (Schellens and Valcke, 2006). However, the meta-analysis results also indicate that once the group size exceeds 7 people, smaller groups do not necessarily produce better interaction and performance than larger groups. This may be because learning scaffolds of technique support, resource support, and teacher support improve the frequency and effectiveness of interaction among group members, and a collaborative group with more members can increase the diversity of views, which helps cultivate critical thinking through collaborative problem-solving.
With regard to the learning scaffold, all three kinds of learning scaffolds can enhance critical thinking. Among them, the teacher-supported learning scaffold has the largest overall effect size, demonstrating the interdependence of effective learning scaffolds and collaborative problem-solving. This outcome is in line with other research findings: encouraging learners to collaborate, generate solutions, and develop critical thinking skills through learning scaffolds is a successful strategy (Reiser, 2004; Xu et al., 2022); learning scaffolds can lower task complexity and unpleasant feelings while also enticing students to engage in learning activities (Wood et al., 2006); and learning scaffolds are designed to help students use learning approaches more successfully within the collaborative problem-solving process, with teacher-supported scaffolds having the greatest influence on critical thinking because they are more targeted, informative, and timely (Xu et al., 2022).
With respect to the measuring tool, although standardized measurement tools (such as the WGCTA, CCTT, and CCTST) have been acknowledged as reliable and effective by experts worldwide, only 54.43% of the research included in this meta-analysis adopted them, and the results indicated no intergroup differences. These results suggest that not all teaching circumstances are appropriate for measuring critical thinking with standardized tools. According to Simpson and Courtney (2002, p. 91), “the measuring tools for measuring thinking ability have limits in assessing learners in educational situations and should be adapted appropriately to accurately assess the changes in learners’ critical thinking.” Consequently, to gauge more fully and precisely how learners' critical thinking has evolved, standardized measuring tools must be properly adapted to collaborative problem-solving learning contexts.
With regard to the subject area, the comprehensive effect size for scientific subjects (e.g., mathematics, science, medical science) is larger than that for language arts and the social sciences. Recent international education reforms have noted that critical thinking is a basic component of scientific literacy. Students with scientific literacy can justify their judgments with accurate evidence and reasonable standards when they face challenges or ill-structured problems (Kyndt et al., 2013), which makes critical thinking crucial for developing scientific understanding and applying it to practical problem-solving for issues related to science, technology, and society (Yore et al., 2007).
Beyond the points discussed above, the following suggestions are offered for critical thinking instruction using collaborative problem-solving.
First, teachers should place special emphasis on the two core elements, collaboration and problem-solving, and design real problems based on collaborative situations. This meta-analysis provides evidence that collaborative problem-solving has a strong synergistic effect on promoting students' critical thinking. Asking questions about real situations and allowing learners to take part in critical discussions of real problems during class instruction are key ways to teach critical thinking, rather than simply having students read speculative articles without practice (Mulnix, 2012). Furthermore, the improvement of students' critical thinking is realized through cognitive conflict with other learners in the problem situation (Yang et al., 2008). Teachers should therefore design real problems and encourage students to discuss, negotiate, and argue in collaborative problem-solving situations.
Second, teachers should design and implement mixed courses to cultivate learners' critical thinking through collaborative problem-solving. Critical thinking can be taught through curriculum instruction (Kuncel, 2011; Leng and Lu, 2020), with the goal of cultivating learners' critical thinking for flexible transfer and application in real problem-solving situations. This meta-analysis shows that mixed course teaching has a highly substantial impact on the cultivation and promotion of learners' critical thinking. Teachers should therefore design and implement mixed courses that combine real collaborative problem-solving situations with the knowledge content of specific disciplines in conventional teaching, teach the methods and strategies of critical thinking through ill-structured problems to help students master them, and provide practical activities in which students interact with one another to develop knowledge construction and critical thinking.
Third, teachers, particularly preservice teachers, should receive more training in critical thinking, and they should also be conscious of the ways in which teacher-supported learning scaffolds can promote critical thinking. The learning scaffold supported by teachers had the greatest impact on learners' critical thinking, in addition to being more directive, targeted, and timely (Wood et al., 2006). Critical thinking can only be taught effectively when teachers recognize its significance for students' growth and use the proper approaches when designing instructional activities (Forawi, 2016). To enable teachers to create learning scaffolds that cultivate learners' critical thinking through collaborative problem-solving, it is therefore essential to concentrate on teacher-supported learning scaffolds and to enhance the instruction in teaching critical thinking offered to teachers, especially preservice teachers.
This meta-analysis has certain limitations that future research can address. First, the search languages were restricted to English and Chinese, so pertinent studies written in other languages may have been overlooked, limiting the number of articles reviewed. Second, the data provided by the included studies are partially incomplete, for instance on whether teachers were trained in the theory and practice of critical thinking, the average age and gender of learners, and differences in critical thinking among learners of various ages and genders. Third, as is typical for review articles, additional studies were released while this meta-analysis was being conducted, so it is bounded in time. As the relevant research develops, future studies focusing on these issues are highly relevant and needed.
This study addressed the question of how strongly collaborative problem-solving fosters students' critical thinking, a topic that had received scant attention in earlier research. The following conclusions can be drawn:
Regarding the results obtained, collaborative problem-solving is an effective teaching approach for fostering learners' critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]). With respect to the dimensions of critical thinking, collaborative problem-solving can significantly and effectively improve students' attitudinal tendency, with a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); it is less effective at improving students' cognitive skills, where it has only an upper-middle impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).
As demonstrated by both the results and the discussion, all seven moderating factors identified across the 36 studies have varying degrees of beneficial effect on students' critical thinking. In this context, the teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all have a positive impact on critical thinking and can be viewed as important moderating factors that affect how critical thinking develops. Since the learning stage (χ² = 3.15, P = 0.21 > 0.05) and measuring tools (χ² = 0.08, P = 0.78 > 0.05) did not demonstrate any significant intergroup differences, we are unable to explain why these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving.
All data generated or analyzed during this study are included within the article and its supplementary information files, and the supplementary information files are available in the Dataverse repository: https://doi.org/10.7910/DVN/IPFJO6 .
Bensley DA, Spero RA (2014) Improving critical thinking skills and meta-cognitive monitoring through direct infusion. Think Skills Creat 12:55–68. https://doi.org/10.1016/j.tsc.2014.02.001
Castle A (2009) Defining and assessing critical thinking skills for student radiographers. Radiography 15(1):70–76. https://doi.org/10.1016/j.radi.2007.10.007
Chen XD (2013) An empirical study on the influence of PBL teaching model on critical thinking ability of non-English majors. J PLA Foreign Lang College 36 (04):68–72
Cohen A (1992) Antecedents of organizational commitment across occupational groups: a meta-analysis. J Organ Behav. https://doi.org/10.1002/job.4030130602
Cooper H (2010) Research synthesis and meta-analysis: a step-by-step approach, 4th edn. Sage, London, England
Cindy HS (2004) Problem-based learning: what and how do students learn? Educ Psychol Rev 51(1):31–39
Duch BJ, Gron SD, Allen DE (2001) The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Educ Sci 2:190–198
Ennis RH (1989) Critical thinking and subject specificity: clarification and needed research. Educ Res 18(3):4–10. https://doi.org/10.3102/0013189x018003004
Facione PA (1990) Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Eric document reproduction service. https://eric.ed.gov/?id=ed315423
Facione PA, Facione NC (1992) The California Critical Thinking Dispositions Inventory (CCTDI) and the CCTDI test manual. California Academic Press, Millbrae, CA
Forawi SA (2016) Standard-based science education and critical thinking. Think Skills Creat 20:52–62. https://doi.org/10.1016/j.tsc.2016.02.005
Halpern DF (2001) Assessing the effectiveness of critical thinking instruction. J Gen Educ 50(4):270–286. https://doi.org/10.2307/27797889
Hu WP, Liu J (2015) Cultivation of pupils’ thinking ability: a five-year follow-up study. Psychol Behav Res 13(05):648–654. https://doi.org/10.3969/j.issn.1672-0628.2015.05.010
Huber K (2016) Does college teach critical thinking? A meta-analysis. Rev Educ Res 86(2):431–468. https://doi.org/10.3102/0034654315605917
Kek MYCA, Huijser H (2011) The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow’s digital futures in today’s classrooms. High Educ Res Dev 30(3):329–341. https://doi.org/10.1080/07294360.2010.501074
Kuncel NR (2011) Measurement and meaning of critical thinking (Research report for the NRC 21st Century Skills Workshop). National Research Council, Washington, DC
Kyndt E, Raes E, Lismont B, Timmers F, Cascallar E, Dochy F (2013) A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educ Res Rev 10(2):133–149. https://doi.org/10.1016/j.edurev.2013.02.002
Leng J, Lu XX (2020) Is critical thinking really teachable?—A meta-analysis based on 79 experimental or quasi experimental studies. Open Educ Res 26(06):110–118. https://doi.org/10.13966/j.cnki.kfjyyj.2020.06.011
Liang YZ, Zhu K, Zhao CL (2017) An empirical study on the depth of interaction promoted by collaborative problem solving learning activities. J E-educ Res 38(10):87–92. https://doi.org/10.13811/j.cnki.eer.2017.10.014
Lipsey M, Wilson D (2001) Practical meta-analysis. International Educational and Professional, London, pp. 92–160
Liu Z, Wu W, Jiang Q (2020) A study on the influence of problem based learning on college students’ critical thinking-based on a meta-analysis of 31 studies. Explor High Educ 03:43–49
Morris SB (2008) Estimating effect sizes from pretest-posttest-control group designs. Organ Res Methods 11(2):364–386. https://doi.org/10.1177/1094428106291059
Mulnix JW (2012) Thinking critically about critical thinking. Educ Philos Theory 44(5):464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x
Naber J, Wyatt TH (2014) The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students. Nurse Educ Today 34(1):67–72. https://doi.org/10.1016/j.nedt.2013.04.002
National Research Council (2012) Education for life and work: developing transferable knowledge and skills in the 21st century. The National Academies Press, Washington, DC
Niu L, Behar HLS, Garvan CW (2013) Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 9(12):114–128. https://doi.org/10.1016/j.edurev.2012.12.002
Peng ZM, Deng L (2017) Towards the core of education reform: cultivating critical thinking skills as the core of skills in the 21st century. Res Educ Dev 24:57–63. https://doi.org/10.14121/j.cnki.1008-3855.2017.24.011
Reiser BJ (2004) Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J Learn Sci 13(3):273–304. https://doi.org/10.1207/s15327809jls1303_2
Ruggiero VR (2012) The art of thinking: a guide to critical and creative thought, 4th edn. Harper Collins College Publishers, New York
Schellens T, Valcke M (2006) Fostering knowledge construction in university students through asynchronous discussion groups. Comput Educ 46(4):349–370. https://doi.org/10.1016/j.compedu.2004.07.010
Sendag S, Odabasi HF (2009) Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 53(1):132–141. https://doi.org/10.1016/j.compedu.2009.01.008
Sison R (2008) Investigating Pair Programming in a Software Engineering Course in an Asian Setting. 2008 15th Asia-Pacific Software Engineering Conference, pp. 325–331. https://doi.org/10.1109/APSEC.2008.61
Simpson E, Courtney M (2002) Critical thinking in nursing education: literature review. Int J Nurs Pract 8(2):89–98
Stewart L, Tierney J, Burdett S (2006) Do systematic reviews based on individual patient data offer a means of circumventing biases associated with trial publications? Publication bias in meta-analysis. John Wiley and Sons Inc, New York, pp. 261–286
Tiwari A, Lai P, So M, Yuen K (2010) A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 40(6):547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x
Wood D, Bruner JS, Ross G (2006) The role of tutoring in problem solving. J Child Psychol Psychiatry 17(2):89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x
Wei T, Hong S (2022) The meaning and realization of teachable critical thinking. Educ Theory Practice 10:51–57
Xu EW, Wang W, Wang QX (2022) A meta-analysis of the effectiveness of programming teaching in promoting K-12 students’ computational thinking. Educ Inf Technol. https://doi.org/10.1007/s10639-022-11445-2
Yang YC, Newby T, Bill R (2008) Facilitating interactions through structured web-based bulletin boards: a quasi-experimental study on promoting learners’ critical thinking skills. Comput Educ 50(4):1572–1585. https://doi.org/10.1016/j.compedu.2007.04.006
Yore LD, Pimm D, Tuan HL (2007) The literacy component of mathematical and scientific literacy. Int J Sci Math Educ 5(4):559–589. https://doi.org/10.1007/s10763-007-9089-4
Zhang T, Zhang S, Gao QQ, Wang JH (2022) Research on the development of learners’ critical thinking in online peer review. Audio Visual Educ Res 6:53–60. https://doi.org/10.13811/j.cnki.eer.2022.06.08
This research was supported by the graduate scientific research and innovation project of Xinjiang Uygur Autonomous Region named “Research on in-depth learning of high school information technology courses for the cultivation of computing thinking” (No. XJ2022G190) and the independent innovation fund project for doctoral students of the College of Educational Science of Xinjiang Normal University named “Research on project-based teaching of high school information technology courses from the perspective of discipline core literacy” (No. XJNUJKYA2003).
Authors and affiliations.
College of Educational Science, Xinjiang Normal University, 830017, Urumqi, Xinjiang, China
Enwei Xu, Wei Wang & Qingxia Wang
Correspondence to Enwei Xu or Wei Wang.
Competing interests.
The authors declare no competing interests.
This article does not contain any studies with human participants performed by any of the authors.
Additional information.
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary tables.
Rights and permissions.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .
Cite this article.
Xu, E., Wang, W. & Wang, Q. The effectiveness of collaborative problem solving in promoting students' critical thinking: A meta-analysis based on empirical literature. Humanit Soc Sci Commun 10, 16 (2023). https://doi.org/10.1057/s41599-023-01508-1
Received : 07 August 2022
Accepted : 04 January 2023
Published : 11 January 2023
DOI : https://doi.org/10.1057/s41599-023-01508-1
ECPC2022 problems set with solutions
This repo contains the problem set of ECPC 2022 (Egyptian Collegiate Programming Contest) and its solutions. The code is written in the C++ programming language. If you have a better solution, just open a pull request.
World Finals.
Each year our International Conference, and highest level competition, brings together over 2,000 champion problem solvers and supporters from around the world.
Participants will gather on the campus of Indiana University Bloomington in 2025.
The event connects a diverse group of passionate, dedicated, and successful students, coaches, staff, evaluators, parents, volunteers, alumni, and supporters.
We invite all regional affiliate champions to attend our in-person International Conference and compete alongside their peers.
Problem solvers addressed Air Quality in 2024. We will announce this year's International Conference topic area March 1.
Visit our Resource Library for competition details, registration information, the schedule, FAQs, and more.
A Lifetime Moment.
2,000+ participants · 14+ countries · 34+ U.S. states · Special events · Magic and FPS Experience opportunities.
Student Showcase.
Global Issues 2022 Middle Team Champion Caroline, Sijia, Siqi, and Xin (Singapore)
Community Project 2023 Beyonder Award Winner Fatuma and Shyla (Minnesota)
Creative Writing 2020 Senior Champion Kaitlyn (Illinois)
A Comparison of Reading and Mathematics Performance Between Students Participating in a Future Problem Solving Program and Nonparticipants.
Data from the Minnesota Comprehensive Assessment (MCA) were collected by Grandview Middle School and provided to Scholastic Testing Service, Inc. for statistical analysis.
Findings reported by Scholastic Testing Service, Inc.: Performance data on the MCA were collected from 2010–2014 for students in grade 6 at Grandview Middle School in Mound, MN (Westonka Public School District). Students were identified as either FPS (participating in a Future Problem Solving program) or Non-FPS (not participating in the program). Summary statistics using Reading and Mathematics scaled scores were developed for each group of students by year and across years. t-tests were used to determine whether the mean scores across the years differed significantly, and Cohen's d was then computed to measure the size of the observed differences.
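For readers unfamiliar with the procedure described above, here is a minimal sketch of an independent-samples t-test followed by Cohen's d with a pooled standard deviation. The score arrays are hypothetical placeholders, not the MCA data:

```python
# Minimal sketch: t-test plus Cohen's d for two groups of scaled scores.
import numpy as np
from scipy import stats

fps = np.array([652.0, 661.0, 648.0, 670.0, 659.0, 655.0])       # hypothetical FPS scores
non_fps = np.array([640.0, 647.0, 638.0, 652.0, 645.0, 642.0])   # hypothetical Non-FPS scores

t, p = stats.ttest_ind(fps, non_fps)

# Cohen's d using the pooled standard deviation
n1, n2 = len(fps), len(non_fps)
sp = np.sqrt(((n1 - 1) * fps.var(ddof=1) + (n2 - 1) * non_fps.var(ddof=1)) / (n1 + n2 - 2))
d = (fps.mean() - non_fps.mean()) / sp

print(f"t = {t:.2f}, p = {p:.3f}, Cohen's d = {d:.2f}")
```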
The Journal of Creative Behavior (JCB) of the Creative Education Foundation.
Seventy-five participants from one suburban high school formed 21 teams with 3–4 members each for Future Problem Solving (FPS). Students were selected to participate in either the regular FPS or an enhanced FPS, where multiple group training activities grounded in problem-solving style were incorporated into a 9-week treatment period.
An ANCOVA procedure was used to examine the difference in team responses to a creative problem-solving scenario for members of each group, after accounting for initial differences in creative problem-solving performance, years of experience in FPS, and creative thinking related to fluency, flexibility, and originality. The ANCOVA resulted in a significant difference in problem-solving performance in favor of students in the treatment group (F(1, 57) = 8.21, p = .006, partial eta squared = .126, medium), while there were no significant differences in years of experience or creativity scores. This result led researchers to conclude that students in both groups had equivalent creative ability and that participation in the group activities emphasizing problem-solving style significantly contributed to creative performance.
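The ANCOVA described above adjusts the group comparison for covariates such as prior performance. A minimal sketch of that kind of model follows; the column names and data are hypothetical placeholders, not the study's variables:

```python
# Minimal sketch of an ANCOVA: post-test scores compared across groups while
# adjusting for a pre-test score and years of experience.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "post_score": [78, 85, 91, 74, 88, 69, 72, 80],
    "pre_score":  [70, 75, 82, 68, 79, 66, 70, 74],
    "fps_years":  [1, 2, 3, 1, 2, 1, 2, 3],
    "group":      ["treatment", "treatment", "treatment", "treatment",
                   "control", "control", "control", "control"],
})

model = smf.ols("post_score ~ C(group) + pre_score + fps_years", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F test for group, adjusted for covariates
```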
In the comparison group, a total of 47% had scores that qualified for entry to the state competition. In contrast, 89% of the students in the treatment group had scores that qualified them for the state bowl. None of the teams from the comparison group qualified for the international competition, while two teams from the treatment group were selected, with one earning sixth place.
The Journal of Creative Behavior, Vol. 0, Iss. 0, pp. 1–12 © 2017 by the Creative Education Foundation, Inc. DOI: 10.1002/jocb.176
“How important was Future Problem Solving in the development of your following skill sets?”
In 2011, a team of researchers from the University of Virginia submitted a report titled “Future Problem Solving Program International—Second Generation Study” (Callahan, Alimin, & Uguz, 2012). The study, based on a survey of over 150 Future Problem Solving alumni, examined the impact of their participation in Future Problem Solving as students or volunteers.
A seasoned educator, April Michele has served as the Executive Director since 2018 and has been with Future Problem Solving for more than a decade. Her background in advanced curriculum strategies and highly engaging learning techniques translates well into the development of materials, publications, training, and marketing for the organization and its global network. April's expertise includes pedagogy and strategies for critical and creative thinking and providing quality educational services for students and adults worldwide.
Prior to joining Future Problem Solving, April taught elementary and middle grades, spending most of her classroom career in gifted education. She earned the National Board certification (NBPTS) as a Middle Childhood/Generalist and later served as a National Board assessor for the certification of others. In addition, April facilitated the Theory and Development of Creativity course for the state of Florida’s certification of teachers. She has also collaborated on a variety of special projects through the Department of Education. Beyond her U.S. education credentials, she has been trained for the International Baccalaureate Middle Years Programme (MYP) in Humanities.
A graduate of the University of Central Florida with a bachelor’s in Elementary Education and the University of South Florida with a master’s in Gifted Education, April’s passion is providing a challenging curriculum for 21st century students so they are equipped with the problem-solving and ethical leadership skills they need to thrive in the future. As a board member in her local Rotary Club, she facilitates problem solving in leadership at the Rotary Youth Leadership Awards (RYLA). She is also a certified Project Management Professional (PMP) from the Project Management Institute and earned her certificate in Nonprofit Management from the Edyth Bush Institute at Rollins College.
1. four problems formulated in Italian and chosen, each time, from the set of “recurring problems” (see the list that follows); 2. three problems formulated in Italian concerning a pseudo-programming language; 3. one problem formulated in English, on a different topic each time (at least in principle).
OPS - Olimpiadi di Problem Solving. Meeting “Artificial Intelligence, at School (?)” - Seminar and Debate. Friday 24 May, from 9:00 to 12:00, at the Cesena Campus and online. Download the brochure.
Olimpiadi di Problem Solving - 2022 Edition. Open to students of state and state-recognized schools of every type and level. Wednesday, 1 December 2021. The DGOSVI of the Ministry of Education promotes the Olimpiadi di Problem Solving, a computer science project that fosters key competences for solving problems through models, methods and ...
2022 AMC 8 problems and solutions. THE TEST WAS HELD BETWEEN JANUARY 18, 2022 AND JANUARY 24, 2022. The first link contains the full set of test problems. The rest contain each individual problem and its solution. 2022 AMC 8 Problems.
Answer: The unique solution is the function f(x) = 1/x for every x ∈ R+. This function clearly satisfies the required property since the expression x·f(y) + y·f(x) = x/y + y/x is greater than 2 for every y ≠ x (directly from AM-GM) and equal to 2 for the unique value y = x. Proof: Let's consider a solution based on ...
We'll outline that process here and then follow with techniques you can use to explore and work on that step of the problem solving process with a group. The seven-step problem solving process is: 1. Problem identification. The first stage of any problem solving process is to identify the problem (s) you need to solve.
2022 IMO Problems/Problem 5. Problem: Find all triples of positive integers with prime and ...
2022 AMC 10B Problems/Problem 10. The following problem is from both the 2022 AMC 10B #10 and 2022 AMC 12B #7, so both problems redirect to this page.
Each year 30,000+ K-12 students in 34+ U.S. states and 14 countries participate in a variety of Future Problem Solving real-world challenges. ... 2022 Champion Junior Division (Community Projects).
In insight problem-solving, the cognitive processes that help you solve a problem happen outside your conscious awareness. 4. Working backward. Working backward is a problem-solving approach often ...
We welcome ideas for future competition topics from anyone in our global community. Submit a Topic. 2016-17, 2018-19, 2019-20, 2020-21, 2021-22, 2022-23, 2023-24 2024-25, Affiliate Finals Problem, International Conference, Practice Problem, Qualifying Problem, STEM Topics, World Solutions Challenge. Visit our topic center to learn more about ...
To discuss the art of problem solving, I sat down in California with McKinsey senior partner Hugo Sarrazin and also with Charles Conn. Charles is a former McKinsey partner, entrepreneur, executive, and coauthor of the book Bulletproof Problem Solving: The One Skill That Changes Everything [John Wiley & Sons, 2018].
Similarly, problem 2020/3 was proposed by Hungary with one Hungarian and one non-Hungarian problem author. To the current moment, there is only a single IMO problem that has two distinct proposing countries: The if-part of problem 1994/2 was proposed by Australia and its only-if part by Armenia. See also. IMO problems statistics (eternal)
PROBLEM: Your beloved cast iron pan is difficult to clean. SOLUTION: Wash off the grit AND preserve the flavor with soap-less, detergent-free grit removal. You don't need to fear your seasoning being stripped off by harsh soap or cleaners. With The Ringer, all you need is warm water and you're good to go. WATCH.
Those new to Future Problem Solving may register to attend as observers and participate in all the non-competition activities via our FPS Experience track. It includes a half-day introductory workshop on the 6-step problem-solving process. Contact us for more information. I remember 17 years ago when I made it to Internationals for the first time.
The annual Community Development Block Grant Disaster Recovery (CDBG-DR) Problem Solving Clinic (Clinic) is an opportunity for CDBG-DR and CDBG Mitigation (CDBG-MIT) grantees, as well as select subrecipients and grantee partners, to network and learn about: CDBG-DR specific rules and requirements. Requirements of Federal Register Notices ...