Search results for: probabilistic thinking
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1405

1135 Embodied Neoliberalism and the Mind as Tool to Manage the Body: A Descriptive Study Applied to Young Australian Amateur Athletes

Authors: Alicia Ettlin

Abstract:

Amid the rise of neoliberalism to the leading economic policy model in Western societies in the 1980s, people have started to internalise a neoliberal way of thinking, whereby the human body has become an entity that can and needs to be precisely managed through free yet rational decision-making processes. The neoliberal citizen has consequently become an entrepreneur of the self who is free, independent, rational, productive and responsible for themselves, their health and wellbeing as well as their appearance. The focus on individuals as entrepreneurs who manage their bodies through the rationally thinking mind has, however, become increasingly criticised for viewing the social actor as ‘disembodied’, as a detached social actor whose powerful mind governs the passive body. On the other hand, the discourse around embodiment seeks to connect rational decision-making processes to the dominant neoliberal discourse, which creates an embodied understanding that the body, just like other areas of people’s lives, can and should be shaped, monitored and managed through cognitive and rational thinking. This perspective offers an understanding of the body regarding its connections with the social environment that reaches beyond the debates around mind-body binary thinking. Hence, following this argument, body management should be thought of as neither solely guided by embodied discourses nor merely falling into a mind-body dualism, but rather as both at once, simultaneously and inseparably. The descriptive, qualitative analysis of semi-structured in-depth interviews conducted with young Australian amateur athletes between the ages of 18 and 24 has shown that most participants are interested in measuring and managing their body to create self-knowledge and self-improvement. The participants thereby connected self-improvement to weight loss, muscle gain or simply staying fit and healthy. Self-knowledge refers to body measurements including weight, BMI or body fat percentage. Self-management and self-knowledge, which rely on one another for rational and well-thought-out decisions, are both characteristic values of the neoliberal doctrine. Many participants also connected this neoliberal way of thinking about and looking after the body to rewarding themselves for their discipline, hard work or achievement of specific body management goals (e.g. eating chocolate for reaching the daily step count goal). A few participants, however, have shown resistance against these neoliberal values, and in particular, against the precise monitoring and management of the body with the help of self-tracking devices. Ultimately, however, it seems that most participants have internalised the dominant discourses around self-responsibility, and by association, a sense of duty to discipline their body in normative ways. Even those who have indicated their resistance to body work and body management practices that follow neoliberal thinking and measurement systems are aware of, and have internalised, the concept of the rationally operating mind that needs to, or should, decide how to look after the body in terms of health as well as appearance ideals. The discussion around the collected data thereby shows that embodiment and the mind/body dualism constitute two connected, rather than separate or opposing, concepts.

Keywords: dualism, embodiment, mind, neoliberalism

Procedia PDF Downloads 160
1134 The Effect of Stigma on Attitudes towards Seeking Help from Social Workers

Authors: Hend Al-Ma'seb, Anwar Alkhurinej

Abstract:

In the field of social work, social workers understand that it is very difficult for individuals to ask for help from therapists. Therefore, it is important to study the variables associated with seeking professional help. A total of 478 undergraduate students from Kuwait University participated voluntarily in the study. The findings of this study showed that the participants have a slightly high degree of public stigma, low self-stigma, and a positive attitude toward seeking professional help. In addition, the findings reveal significant relationships between participants' attitude towards seeking professional help and gender, taking social work classes, thinking about receiving counseling, and having social problems. Furthermore, the findings showed significant relationships between self-stigma and both gender and thinking about receiving counseling. The findings of the current study have implications for the field of social work in Kuwait and help to improve knowledge in this area.

Keywords: attitude towards help, social work, social workers, stigma

Procedia PDF Downloads 203
1133 Probabilistic Crash Prediction and Prevention of Vehicle Crash

Authors: Lavanya Annadi, Fahimeh Jafari

Abstract:

Transportation brings immense benefits to society, but it also has its costs. These include the cost of infrastructure, personnel and equipment, but also the loss of life and property in traffic accidents on the road, delays in travel due to traffic congestion, and various indirect costs such as those related to air transport. Much research has been done to identify the various factors that affect road accidents, such as road infrastructure, traffic, sociodemographic characteristics, land use, and the environment. The aim of this research is to predict vehicle crash probabilities in the United States using machine learning, focusing on natural and structural causes and excluding behavioural causes such as overspeeding. These factors range from weather factors, such as weather conditions, precipitation, visibility, wind speed, wind direction, temperature, pressure, and humidity, to road structure factors such as bumps, roundabouts, no-exit roads, turning loops, and give-ways. Probabilities are dissected into ten different classes. All the predictions are based on multiclass classification techniques, which are supervised learning methods. This study considers all crashes, in all states, recorded in data collected by the US government. To calculate the probability, the multinomial expected value was used and assigned as the crash probability classification label. We applied three different classification models: multiclass Logistic Regression, Random Forest, and XGBoost. The numerical results show that XGBoost achieved a 75.2% accuracy rate, which indicates the part played by natural and structural factors in crashes. The paper also provides in-depth insights through exploratory data analysis.
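
As an illustration of the multiclass classification setup described above, the following minimal Python sketch trains an XGBoost model on synthetic stand-in features and reports accuracy. It is not the authors' pipeline: the feature set, class labels and data are placeholders for the weather and road-structure attributes named in the abstract.

```python
# Minimal sketch of a ten-class crash-probability classifier using XGBoost.
# Synthetic features stand in for weather (visibility, wind, temperature, ...)
# and road-structure (bump, roundabout, ...) attributes from the abstract.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=5000, n_features=12, n_informative=8,
                           n_classes=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = XGBClassifier(objective="multi:softprob", n_estimators=200,
                      max_depth=6, learning_rate=0.1)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```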

Keywords: road safety, crash prediction, exploratory analysis, machine learning

Procedia PDF Downloads 107
1132 Systems Intelligence in Management (High Performing Organizations and People Score High in Systems Intelligence)

Authors: Raimo P. Hämäläinen, Juha Törmänen, Esa Saarinen

Abstract:

Systems thinking has been acknowledged as an important approach in the strategy and management literature ever since the seminal works of Ackoff in the 1970s and Senge in the 1990s. The early literature was very much focused on structures and organizational dynamics. Understanding systems is important, but making improvements also needs ways to understand human behavior in systems. Peter Senge's book The Fifth Discipline gave the inspiration for the development of the concept of Systems Intelligence (SI). The concept integrates the concepts of personal mastery and systems thinking. SI refers to intelligent behavior in the context of complex systems involving interaction and feedback. It is a competence related to the skills needed in strategy and in the environment of modern industrial engineering and management, where people skills and systems play an increasingly important role. The eight factors of Systems Intelligence have been identified from extensive surveys, and the factors relate to perceiving, attitude, thinking and acting. The personal self-evaluation test developed consists of 32 items, which can also be applied in a peer evaluation mode. The concept and test extend to organizations too: one can talk about organizational systems intelligence. This paper reports the results of an extensive survey based on peer evaluation. The results show that systems intelligence correlates positively with professional performance. People in a managerial role score higher in SI than others. Age improves the SI score, but there is no gender difference. Top organizations score higher in all SI factors than lower-ranked ones. The SI tests can also be used as leadership and management development tools, helping self-reflection and learning. Finding ways of enhancing learning and organizational development is important. Today, gamification is a new and promising approach. The items in the SI test have been used to develop an interactive card game following the Topaasia game approach. It is an easy way of engaging people in a process which helps participants see and approach problems in their organization. It also helps individuals in identifying challenges in their own behavior and in improving their SI.
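
A simple way to picture the scoring step is sketched below: 32 questionnaire items are aggregated into eight factor scores and the overall SI score is correlated with a performance measure. The four-items-per-factor key, the Likert scale and the data are all assumptions for illustration, not the published instrument.

```python
# Illustrative scoring of a 32-item SI questionnaire into eight factors
# (four items per factor is assumed, not the published key) and a simple
# correlation with a hypothetical performance rating.
import numpy as np

rng = np.random.default_rng(1)
responses = rng.integers(1, 8, size=(200, 32))    # 200 respondents, 7-point Likert items
performance = rng.normal(0, 1, size=200)          # hypothetical performance measure

factor_scores = responses.reshape(200, 8, 4).mean(axis=2)   # eight factor means
si_total = factor_scores.mean(axis=1)                       # overall SI score

r = np.corrcoef(si_total, performance)[0, 1]
print("Pearson r between SI score and performance:", round(r, 3))
```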

Keywords: gamification, management competence, organizational learning, systems thinking

Procedia PDF Downloads 93
1131 Risk Assessment of Flood Defences by Utilising Condition Grade Based Probabilistic Approach

Authors: M. Bahari Mehrabani, Hua-Peng Chen

Abstract:

Management and maintenance of coastal defence structures during the expected life cycle have become a real challenge for decision makers and engineers. Accurate evaluation of the current condition and future performance of flood defence structures is essential for effective practical maintenance strategies on the basis of available field inspection data. Moreover, as coastal defence structures age, it becomes more challenging to implement maintenance and management plans to avoid structural failure. Therefore, condition inspection data are essential for assessing damage and forecasting deterioration of ageing flood defence structures in order to keep the structures in an acceptable condition. The inspection data for flood defence structures are often collected using discrete visual condition rating schemes. In order to evaluate the future condition of the structure, a probabilistic deterioration model needs to be utilised. However, existing deterioration models may not provide a reliable prediction of performance deterioration over a long period due to uncertainties. To tackle this limitation, a time-dependent condition-based model associated with a transition probability needs to be developed on the basis of a condition grade scheme for flood defences. This paper presents a probabilistic method for predicting future performance deterioration of coastal flood defence structures based on condition grading inspection data and deterioration curves estimated by expert judgement. In condition-based deterioration modelling, the main task is to estimate transition probability matrices. The deterioration process of the structure related to the transition states is modelled as a Markov chain process, and a reliability-based approach is used to estimate the probability of structural failure. Visual inspection data according to the United Kingdom Condition Assessment Manual are used to obtain the initial condition grade curve of the coastal flood defences. The initial curves are then modified in order to develop transition probabilities through non-linear regression based optimisation algorithms. Monte Carlo simulations are then used to evaluate the future performance of the structure on the basis of the estimated transition probabilities. Finally, a case study is given to demonstrate the applicability of the proposed method under no-maintenance and medium-maintenance scenarios. Results show that the proposed method can provide an effective predictive model for various situations in terms of available condition grading data. The proposed model also provides useful information on the time-dependent probability of failure in coastal flood defences.
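
The core mechanics of a condition-grade Markov model can be sketched very compactly: a transition matrix moves an asset through discrete grades, and Monte Carlo sampling estimates the chance of reaching the worst grade within a horizon. The matrix values below are purely illustrative, not the calibrated probabilities of the paper.

```python
# Sketch of condition-grade deterioration: a Markov transition matrix moves a
# defence asset through grades 1 (best) to 5 (worst); Monte Carlo estimates the
# probability of reaching grade 5 within a horizon. Values are illustrative.
import numpy as np

P = np.array([[0.90, 0.10, 0.00, 0.00, 0.00],
              [0.00, 0.88, 0.12, 0.00, 0.00],
              [0.00, 0.00, 0.85, 0.15, 0.00],
              [0.00, 0.00, 0.00, 0.80, 0.20],
              [0.00, 0.00, 0.00, 0.00, 1.00]])   # grade 5 is absorbing (failure)

rng = np.random.default_rng(0)
n_sims, horizon = 10_000, 50
failures = 0
for _ in range(n_sims):
    state = 0                                    # start in grade 1
    for _ in range(horizon):
        state = rng.choice(5, p=P[state])
    failures += (state == 4)

print("P(grade 5 within 50 years) ≈", failures / n_sims)
```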

Keywords: condition grading, flood defense, performance assessment, stochastic deterioration modelling

Procedia PDF Downloads 229
1130 Application of Design Thinking for Technology Transfer of Remotely Piloted Aircraft Systems for the Creative Industry

Authors: V. Santamarina Campos, M. de Miguel Molina, B. de Miguel Molina, M. Á. Carabal Montagud

Abstract:

With this contribution, we want to show a successful example of the application of the Design Thinking methodology in the European project 'Technology transfer of Remotely Piloted Aircraft Systems (RPAS) for the creative industry'. The use of this methodology has allowed us to design and build a drone based on the real needs of prospective users. It has demonstrated that this is a powerful tool for generating innovative ideas in the field of robotics, by focusing its effectiveness on understanding and solving real user needs. In this way, with the support of an interdisciplinary team comprised of creatives, engineers and economists, together with the collaboration of prospective users from three European countries, a non-linear work dynamic has been created. This teamwork has generated a sense of appreciation towards the creative industries, through continuously adaptive, inventive, and playful collaboration and communication, which has facilitated the development of prototypes. These have been designed to enable filming and photography in interior spaces, within 13 sectors of the European creative industries: Advertising, Architecture, Fashion, Film, Antiques and Museums, Music, Photography, Television, Performing Arts, Publishing, Arts and Crafts, Design and Software. Furthermore, it has married the real needs of the creative industries with what is technologically and commercially viable. As a result, a product of great value has been obtained, which offers new business opportunities for small companies across this sector.

Keywords: design thinking, design for effectiveness, methodology, active toolkit, storyboards, PAR, focus group, innovation, RPAS, indoor drone, aerial film, creative industry, end users, stakeholder

Procedia PDF Downloads 198
1129 Comparison of Methodologies to Compute the Probabilistic Seismic Hazard Involving Faults and Associated Uncertainties

Authors: Aude Gounelle, Gloria Senfaute, Ludivine Saint-Mard, Thomas Chartier

Abstract:

The long-term deformation rates of faults are not fully captured by Probabilistic Seismic Hazard Assessment (PSHA). PSHA approaches that use catalogues to develop area or smoothed-seismicity sources are limited by the data available to constrain future earthquake activity rates. The integration of faults in PSHA can at least partially address the long-term deformation. However, careful treatment of fault sources is required, particularly in low strain rate regions, where estimated seismic hazard levels are highly sensitive to assumptions concerning fault geometry, segmentation and slip rate. When integrating faults in PSHA, various constraints on earthquake rates from geologic and seismologic data have to be satisfied. For low strain rate regions, where such data is scarce, this is especially challenging. Integrating faults in PSHA requires conversion of the geologic and seismologic data into fault geometries and slip rates, and then into earthquake activity rates. Several approaches exist for translating slip rates into earthquake activity rates. In the most frequently used approach, the background earthquakes are handled using a truncated approach, in which earthquakes with a magnitude lower than or equal to a threshold magnitude (Mw) occur in the background zone, with a rate defined by the rate in the earthquake catalogue, while magnitudes higher than the threshold are located on the fault, with a rate defined using the average slip rate of the fault. As highlighted by several studies, seismic events with magnitudes larger than the selected magnitude threshold may potentially occur in the background and not only at the fault, especially in regions of slow tectonic deformation. It is also known that several sections of a fault, or several faults, could rupture during a single fault-to-fault rupture. It is then essential to apply a consistent modelling procedure to allow a large set of possible fault-to-fault ruptures to occur aleatorily in the hazard model while reflecting the individual slip rate of each section of the fault. In 2019, a tool named SHERIFS (Seismic Hazard and Earthquake Rates in Fault Systems) was published. The tool uses a methodology to calculate the earthquake rates in a fault system where the slip-rate budget of each fault is converted into rupture rates for all possible single-fault and fault-to-fault ruptures. The objective of this paper is to compare the SHERIFS method with another frequently used model to analyse the impact on the seismic hazard and, through sensitivity studies, better understand the influence of key parameters and assumptions. For this application, a simplified but realistic case study was selected, located in an area of moderate to high seismicity (southeast France) where the fault is assumed to have a low strain rate.
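
To make the classical slip-rate-to-activity-rate conversion mentioned above more concrete, the sketch below balances a single fault's annual moment rate against a truncated Gutenberg-Richter magnitude distribution. It is a generic illustration under assumed fault dimensions, slip rate and b-value, not the SHERIFS algorithm (which distributes the slip-rate budget over single-fault and fault-to-fault ruptures) and not the case-study inputs.

```python
# Illustrative slip-rate-to-activity-rate balance: the fault's annual moment rate
# (rigidity x area x slip rate) is distributed over a truncated Gutenberg-Richter
# magnitude distribution. All numbers are placeholders.
import numpy as np

mu = 3.0e10            # rigidity, Pa
area = 30e3 * 12e3     # fault area, m^2 (30 km x 12 km, assumed)
slip_rate = 0.5e-3     # slip rate, m/yr (0.5 mm/yr, slow-deforming region)
moment_rate = mu * area * slip_rate          # N·m per year

b = 1.0
mags = np.arange(5.5, 7.1, 0.1)              # magnitude bins above the threshold Mw
rel = 10.0 ** (-b * mags)                    # relative Gutenberg-Richter rates
m0 = 10.0 ** (1.5 * mags + 9.1)              # seismic moment per event, N·m

rates = rel * moment_rate / np.sum(rel * m0) # scale so the total moment is balanced
for m, r in zip(mags, rates):
    print(f"Mw {m:.1f}: {r:.2e} events/yr  (return period {1/r:,.0f} yr)")
```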

Keywords: deformation rates, faults, probabilistic seismic hazard, PSHA

Procedia PDF Downloads 58
1128 Vector-Based Analysis in Cognitive Linguistics

Authors: Chuluundorj Begz

Abstract:

This paper presents a dynamic, psycho-cognitive approach to the study of human verbal thinking on the basis of typologically different languages (Mongolian, English and Russian). Topological equivalence in verbal communication serves as a basis for the universality of mental structures and therefore of deep structures. The mechanism of verbal thinking consists, at the deep level, of basic concepts, rules for integration and classification, and neural networks of vocabulary. In the neurocognitive study of language, the neural architecture and the neuropsychological mechanisms of verbal cognition form the basis of vector-based modeling. Verbal perception and interpretation of the infinite set of meanings and propositions in the mental continuum can be modeled by applying tensor methods. Euclidean and non-Euclidean spaces are applied for the description of the human semantic vocabulary and higher-order structures.
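
A toy example of the kind of vector-space modelling referred to above is sketched below: lexical items as vectors over semantic features, compared by cosine similarity in Euclidean space. The feature dimensions and values are invented purely for illustration.

```python
# Toy vector-space lexicon: words as vectors over hand-picked semantic features
# (animate, human, motion, abstract), compared by cosine similarity. The values
# are invented to illustrate vector-based modelling of lexical meaning.
import numpy as np

lexicon = {
    "horse":   np.array([1.0, 0.0, 0.8, 0.0]),
    "rider":   np.array([1.0, 1.0, 0.7, 0.0]),
    "freedom": np.array([0.0, 0.0, 0.2, 1.0]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print("horse~rider:  ", round(cosine(lexicon["horse"], lexicon["rider"]), 3))
print("horse~freedom:", round(cosine(lexicon["horse"], lexicon["freedom"]), 3))
```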

Keywords: Euclidean spaces, isomorphism and homomorphism, mental lexicon, mental mapping, semantic memory, verbal cognition, vector space

Procedia PDF Downloads 518
1127 Agency Beyond Metaphysics of Subjectivity

Authors: Erik Kuravsky

Abstract:

One of the problems with a post-structuralist account of agency is that it appears to reject the freedom of an acting subject, thus seeming to deny the very phenomenon of agency. However, this is only a problem if we think that human beings can be agents exclusively in terms of being subjects, that is, if we think agency subjectively. Indeed, we tend to understand traditional theories of human freedom (e.g., Plato’s or Kant’s) in terms of a peculiar ability of the subject. The paper suggests de-subjectivizing agency with the help of Heidegger’s later thought. To do so, it argues that classical theories of agency may indeed be interpreted as subject-oriented (sometimes even by their authors), but do not have to be read as such. Namely, the claim is that what makes agency what it is, what is essential in agency, is not its belongingness to a subject, but its ontological configuration. We may say that agency “happens,” and that there are very specific ontological characteristics to this happening. The argument of the paper is that we can find these characteristics in the classical accounts of agency and that these characteristics are sufficient to distinguish human freedom from other natural phenomena. In particular, it offers to think agency not as one of human characteristics, but as an ontological event in which human beings take part. Namely, agency is a (non-human) characteristic of the different modes in which the experienceable existence of beings is determined by Being. To be an agent, then, is to participate in such ontological determination. What enables this participation is the ways human beings non-thematically understand the ontological difference. For example, for Plato, one acts freely only if one is led by an idea of the good, while for Kant the imperative for free action is categorical. The agency of an agent is thus dependent on the differentiation between ideas/categories and beings met in experience – one is “free” from contingent sensibility in terms of what is different from it ontologically. In this light, modern dependence on subjectivity is evident in the fact that the ontological difference is thought as belonging to one’s thinking, consciousness, etc. That is, it is taken subjectively. A non-subjective account of agency, on the other hand, requires thinking this difference as belonging to Being itself, and thinking of human beings as a medium within which the non-human force of ontological differentiation occurs.

Keywords: Heidegger, freedom, agency, poststructuralism

Procedia PDF Downloads 192
1126 Solid Particles Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU

Authors: Ali Abdul Kadhim, Fue Lien

Abstract:

Solid particle distribution on an impingement surface has been simulated utilizing a graphical processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and the multiple relaxation time (MRT) models. This paper proposes an improvement in the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow utilizes the D3Q19 lattice, while the particle model employs the D3Q27 lattice. The particle numbers are defined at the same regular LBM nodes, and transport of particles from one node to its neighboring nodes is determined in accordance with the particle bulk density and velocity by considering all the external forces. The previous models distribute particles at each time step without considering the local velocity and the number of particles at each node. The present model overcomes the deficiencies of the previous LBM-CA models and, therefore, can better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model in simulating complex multiphase fluid flows, this approach is still expensive in terms of memory size and the computational time required to perform 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work. The CUDA parallel programming platform and the CuRAND library are utilized to form an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re=200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers. The present LBM solutions agree well with other results available in the open literature. The GPU code was then used to simulate the particle transport and deposition in a turbulent impinging jet at Re=10,000. The simulations were conducted for L/D=2, 4 and 6, where L is the nozzle-to-surface distance and D is the jet diameter. The effect of changing the Stokes number on the particle deposition profile was studied at different L/D ratios. For comparative studies, another in-house serial CPU code was also developed, coupling LBM with the classical Lagrangian particle dispersion model. Agreement between results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup ratio of about 350 against the serial code running on a single CPU.
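
A greatly simplified, CPU-only illustration of the cellular-automata transport idea is sketched below: integer particle counts stored at lattice nodes are redistributed to neighbouring nodes with probabilities biased by the local velocity. This is a 2-D toy with an imposed velocity field and an assumed bias parameter, not the authors' 3-D D3Q27 GPU implementation.

```python
# Simplified 2-D illustration of the LBM-CA idea: particle counts at lattice nodes
# hop to the four nearest neighbours with velocity-biased probabilities (multinomial
# sampling), conserving the total particle number. Not the paper's D3Q27/GPU model.
import numpy as np

rng = np.random.default_rng(0)
nx, ny = 32, 32
counts = rng.integers(0, 5, size=(nx, ny))   # integer particle counts per node
ux = np.full((nx, ny), 0.3)                  # toy imposed velocity field, x-component
uy = np.zeros((nx, ny))                      # y-component

def step(counts, ux, uy, alpha=0.5):
    """Redistribute counts to neighbours with probabilities biased by local velocity."""
    new = np.zeros_like(counts)
    for i in range(nx):          # i indexes x
        for j in range(ny):      # j indexes y
            p_move = np.array([max(0.0,  alpha * ux[i, j]),   # +x
                               max(0.0, -alpha * ux[i, j]),   # -x
                               max(0.0,  alpha * uy[i, j]),   # +y
                               max(0.0, -alpha * uy[i, j])])  # -y
            p = np.append(p_move, max(0.0, 1.0 - p_move.sum()))  # stay probability
            moves = rng.multinomial(counts[i, j], p / p.sum())
            new[(i + 1) % nx, j] += moves[0]   # periodic boundaries
            new[(i - 1) % nx, j] += moves[1]
            new[i, (j + 1) % ny] += moves[2]
            new[i, (j - 1) % ny] += moves[3]
            new[i, j] += moves[4]
    return new

for _ in range(10):
    counts = step(counts, ux, uy)
print("total particles conserved:", counts.sum())
```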

Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model

Procedia PDF Downloads 204
1125 Learning-by-Heart vs. Learning by Thinking: Fostering Thinking in Foreign Language Learning (A Comparison of Two Approaches)

Authors: Danijela Vranješ, Nataša Vukajlović

Abstract:

Turning to learner-centered teaching instead of the teacher-centered approach brought a whole new perspective into the process of teaching and learning and set a new goal for improving the educational process itself. However, recently a tremendous decline in students’ performance on various standardized tests can be observed, above all on the PISA test. Learner-centeredness on its own is not enough anymore: the students’ ability to think is deteriorating. Especially in foreign language learning, one can encounter a lot of learning by heart: whether it is grammar or vocabulary, teachers often seem to judge the students’ success merely on how well they can recall a specific word, phrase, or grammar rule, but they rarely aim to foster their ability to think. Convinced that foreign language teaching can do both, this research aims to discover how two different approaches to teaching a foreign language foster the students’ ability to think, as well as to what degree they help students reach the state-determined level of foreign language proficiency at the end of the semester as defined in the Common European Framework. For this purpose, two different curricula were developed: one is a traditional, learner-centered foreign language curriculum that aims at teaching the four competences as defined in the Common European Framework and serves as a control variable, whereas the second one has been enriched with various thinking routines and aims at teaching the foreign language as a means to communicate ideas and thoughts rather than reducing it to the four competences. Moreover, two types of tests were created for each approach, each based on the content taught during the semester. One aims to test the students’ competences as defined in the Common European Framework, and the other aims to test the ability of students to draw on the knowledge gained and come to their own conclusions based on the content taught during the semester. As it is an ongoing study, the results are yet to be interpreted.

Keywords: common european framework of reference, foreign language learning, foreign language teaching, testing and assignment

Procedia PDF Downloads 100
1124 Probabilistic Analysis of Bearing Capacity of Isolated Footing using Monte Carlo Simulation

Authors: Sameer Jung Karki, Gokhan Saygili

Abstract:

The allowable bearing capacity of foundation systems is determined by applying a factor of safety to the ultimate bearing capacity. Conventional ultimate bearing capacity calculation routines are based on deterministic input parameters, where the nonuniformity and inhomogeneity of soil and site properties are not accounted for. Hence, the tools of probability calculus and statistical analysis are not directly applied in conventional foundation engineering. It is assumed that the factor of safety, typically as high as 3.0, incorporates the uncertainty of the input parameters. This factor of safety is estimated based on subjective judgement rather than objective facts; it is an ambiguous term. Hence, a probabilistic analysis of the bearing capacity of an isolated footing on a clayey soil is carried out using the Monte Carlo simulation method. This simulated model was compared with the traditional discrete model. It was found that the bearing capacity of the soil was higher for the simulated model than for the discrete model. This was verified by carrying out a sensitivity analysis. As the number of simulations was increased, there was a significant percentage increase in the bearing capacity compared with the discrete bearing capacity. The bearing capacity values obtained by simulation were found to follow a normal distribution. Using the traditional factor of safety of 3, the allowable bearing capacity had a lower probability (0.03717) of occurring in the field, compared to a higher probability (0.15866) when using the simulation-derived factor of safety of 1.5. This means the traditional factor of safety gives a bearing capacity that is less likely to occur or be available in the field. This shows the subjective nature of the factor of safety, and hence a probabilistic method is suggested to address the variability of the input parameters in bearing capacity equations.
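
The Monte Carlo idea can be sketched in a few lines: treat the undrained cohesion of the clay as a random variable, propagate it through a simple bearing capacity expression, and compare the resulting capacity distribution with the deterministic value and the factors of safety. The cohesion statistics, geometry and phi = 0 assumption below are illustrative, not the paper's inputs.

```python
# Minimal Monte Carlo sketch for an isolated footing on clay under undrained
# conditions (phi = 0), so q_ult ≈ cu*Nc + gamma*Df. Parameter values are assumed.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
cu = rng.normal(loc=50.0, scale=10.0, size=n)    # undrained cohesion, kPa (assumed stats)
cu = np.clip(cu, 1.0, None)                      # guard against non-physical values

Nc, gamma, Df = 5.14, 18.0, 1.5                  # bearing factor, unit weight kN/m3, depth m
q_ult = cu * Nc + gamma * Df                     # ultimate bearing capacity, kPa

q_det = 50.0 * Nc + gamma * Df                   # deterministic capacity from mean cohesion
print(f"simulated mean q_ult = {q_ult.mean():.0f} kPa, std = {q_ult.std():.0f} kPa")
for fs in (3.0, 1.5):
    q_allow = q_det / fs
    p_below = np.mean(q_ult < q_allow)           # chance the capacity falls below q_allow
    print(f"FS = {fs}: q_allow = {q_allow:.0f} kPa, P(q_ult < q_allow) = {p_below:.4f}")
```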

Keywords: bearing capacity, factor of safety, isolated footing, Monte Carlo simulation

Procedia PDF Downloads 183
1123 Risk Issues for Controlling Floods through Unsafe, Dual Purpose, Gated Dams

Authors: Gregory Michael McMahon

Abstract:

Risk management for the purposes of minimizing the damages from the operations of dams has met with opposition emerging from organisations and authorities, and their practitioners. It appears that the cause may be a misunderstanding of risk management arising from exchanges that mix deterministic thinking with risk-centric thinking and that do not separate uncertainty from reliability and accuracy from probability. This paper sets out those misunderstandings that arose from dam operations at Wivenhoe in 2011, comparing outcomes based on the methodology and its rules with those produced by applying misunderstandings of the rules. The paper addresses the performance of one risk-centric Flood Manual for Wivenhoe Dam in achieving a risk management outcome. A mixture of engineering, administrative, and legal factors appears to have combined to reduce the outcomes from the risk approach. These are described. The findings are that a risk-centric Manual may need to assist administrations in the conduct of scenario training regimes, in responding to healthy audit reporting, and in the development of decision-support systems. The principal assistance needed from the Manual, however, is to assist engineering and the law to a good understanding of how risks are managed – do not assume that risk management is understood. The wider findings are that the critical profession for decision-making downstream of the meteorologist is not dam engineering, hydrology, or hydraulics; it is risk management. Risk management will provide the minimum flood damage outcome where actual rainfalls match or exceed forecasts; it will therefore provide the best approach for the likely history of flooding in the life of a dam, and provisions made for worst cases may be state of the art in risk management. The principal conclusion is the need for training both in risk management as a discipline and in the application of risk management rules to particular dam operational scenarios.

Keywords: risk management, flood control, dam operations, deterministic thinking

Procedia PDF Downloads 81
1122 Islam, Tolerance and Anti-Terrorism: A Critical Assessment with Reference to the Royal 'Amman Message'

Authors: Adnan M. Al Assaf

Abstract:

This research project aims to assess methods of enhancing tolerant thinking and behavior among Muslim societies, in addition to spreading an anti-terrorist approach in their communities. A critical assessment of the major Islamic texts in question is the chosen means of persuasion, as Muslims adopt these sources as the authentic references for their lives and cultures. Moreover, this research devotes special attention to the analysis of the royal ‘Amman Message’ as a contemporary Islamic approach to enhancing tolerance and anti-terrorism from an Islamic perspective. The paper includes the study of the related concepts, texts and practical applications, with some reference to the history of Islam in human interaction, accepting others, mercy towards minorities, and protecting human rights. Furthermore, it assesses methods of enhancing tolerance and minimizing terrorist thinking and behavior in practice, in view of the Amman Message, as well.

Keywords: Islam, tolerance, anti-terrorism, coexistence, Amman Message

Procedia PDF Downloads 455
1121 Designing User Interfaces for Just in Time Enterprise Solution

Authors: Romi Dey

Abstract:

Introduction: One of the most important factors for a technology to sustain and grow is its elaborate and intuitive design methodology and design thinking. Designing enterprise applications that cater to Just in Time technology is one of the most challenging and detailed processes any user experience designer would come across. Description: The basic principles of design, when tailored to these technologies, create an immense challenge, and from this a set of redefined and revised design principles emerges that can be applied to designing any Just in Time manufacturing solution. Findings: The thorough process of understanding the end users, their existing pain points faced in the real world, their responsibilities and expectations, their core needs and, last but not least, their demands, drives the nurturing of the design methodologies for Just in Time solutions. With respect to the business aspect, design and design principles play a strong role in any form of innovation. Conclusion: Innovation and knowledge about the latest technologies are the keywords in the manufacturing industry. It becomes crucial for the product development team to be precise in their understanding of the technology and to be sure of end users' expectations.

Keywords: design thinking, enterprise application, Just in Time, user experience design

Procedia PDF Downloads 165
1120 Exploring the Impact of ChatGPT on the English Writing Skills of a Group of International EFL Uzbek Students: A Qualitative Case Study Conducted at a Private University College in Malaysia

Authors: Uranus Saadat

Abstract:

ChatGPT, as one of the well-known artificial intelligence (AI) tools, has recently been integrated into English language education and has had several impacts on learners. Accordingly, concerns regarding the overuse of this tool among EFL/ESL learners are rising, which could lead to several disadvantages in their writing skills development. The use of ChatGPT in facilitating writing skills is a novel concept that demands further studies in different contexts and with different learners. In this study, a qualitative case study is applied to investigate the impact of ChatGPT on the writing skills of a group of EFL bachelor’s students from Uzbekistan studying Teaching English as a Second Language (TESL) at a private university in Malaysia. The data was collected through the triangulation of document analysis, semi-structured interviews, classroom observations, and focus group discussions. Subsequently, the data was analyzed by using thematic analysis. Some of the emerging themes indicated that ChatGPT is helpful in engaging students by reducing their anxiety in class and providing them with constructive feedback and support. Conversely, certain emerging themes revealed excessive reliance on ChatGPT, resulting in a decrease in students’ creativity and critical thinking skills, memory span, and tolerance for ambiguity. The study suggests a number of strategies to alleviate its negative impacts, such as peer review activities, workshops for familiarizing students with AI, and gradual withdrawal of AI support activities. This study emphasizes the need for cautious AI integration into English language education to cultivate independent learners with higher-order thinking skills.

Keywords: ChatGPT, EFL/ESL learners, English writing skills, artificial intelligence tools, critical thinking skills

Procedia PDF Downloads 5
1119 Seismicity and Ground Response Analysis for MP Tourism Office in Indore, India

Authors: Deepshikha Shukla, C. H. Solanki, Mayank Desai

Abstract:

In the last few years, it has been observed that earthquakes are proving to be a serious concern for scientists across the world. With a large number of earthquakes occurring in day-to-day life, the threat to life and property has increased manifold, which calls for the urgent attention of all researchers globally to carry out research in the field of Earthquake Engineering. Any hazard related to earthquakes and seismicity is considered a seismic hazard. The common forms of seismic hazards are Ground Shaking, Structural Damage, Structural Hazards, Liquefaction, Landslides and Tsunami, to name a few. Among all natural hazards, the most devastating and damaging is the earthquake, as all other hazards are triggered only after the occurrence of an earthquake. In order to quantify and estimate seismicity and seismic hazards, many methods and approaches have been proposed in the past few years. Such approaches are Mathematical, Conventional and Computational. Convex Set Theory and the Empirical Green’s Function are some of the Mathematical Approaches, whereas the Deterministic and Probabilistic Approaches are the Conventional Approaches for the estimation of seismic hazards. The ground response and ground shaking of a particular area or region play an important role in the damage caused by an earthquake. In this paper, a seismic study using the Deterministic Approach and 1D Ground Response Analysis has been carried out for the Madhya Pradesh Tourism Office in the Indore region of Madhya Pradesh in Central India. Indore lies in seismic zone III (IS: 1893, 2002) of the Seismic Zoning map of India. There are various faults and lineaments in this area, and the Narmada-Son Fault and the Gavilgadh fault are the active sources of earthquakes in the study area. Deepsoil v6.1.7 has been used to perform the 1D Linear Ground Response Analysis for the study area. The Peak Ground Acceleration (PGA) of the city ranges from 0.1g to 0.56g.
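
For readers unfamiliar with the deterministic approach, the sketch below shows the basic scenario calculation: median peak ground acceleration at a site from a generic attenuation-type relation ln(PGA) = c0 + c1*M - c2*ln(R + c3). The coefficients and scenarios are placeholders, not a published ground motion prediction equation and not the values or the Deepsoil site response analysis used in the study.

```python
# Illustrative deterministic-scenario estimate of peak ground acceleration using a
# generic attenuation-type functional form. Coefficients are placeholders only.
import math

def pga_g(magnitude, distance_km, c0=-3.5, c1=0.9, c2=1.2, c3=10.0):
    """Return an illustrative median PGA (in g) for a scenario earthquake."""
    return math.exp(c0 + c1 * magnitude - c2 * math.log(distance_km + c3))

# Hypothetical scenarios on nearby active sources (magnitudes and distances assumed)
for m, r in [(6.0, 60.0), (6.5, 40.0), (7.0, 25.0)]:
    print(f"Mw {m}, R = {r:.0f} km -> PGA ≈ {pga_g(m, r):.2f} g")
```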

Keywords: seismicity, seismic hazards, deterministic, probabilistic methods, ground response analysis

Procedia PDF Downloads 161
1118 Creative Thinking through Mindful Practices: A Business Class Case Study

Authors: Malavika Sundararajan

Abstract:

This study introduces the use of mindfulness techniques in the classroom to make individuals aware of how the creative thinking process works, resulting in more constructive learning and application. A case observation method was utilized within a classroom setting in a graduate class in the Business School. It entailed briefing the student participants about the use of a template called the dots and depths map, having them complete it for themselves, compare it with their team members' maps, and reflect on the outputs. Finally, they were debriefed about the use of the template and its value to their learning and creative application process. The major finding is the increase in the awareness levels of the participants following the use of the template, leading to a subsequent pursuit of diverse knowledge and acquisition of relevant information rather than jumping directly to solutions, which increased their overall creative outputs for the given assignment. The significant value of this study is that it can be applied to any classroom on any subject as a powerful mindfulness tool that increases creative problem solving through constructive knowledge building.

Keywords: connecting dots, mindful awareness, constructive knowledge building, learning creatively

Procedia PDF Downloads 143
1117 High-Resolution Flood Hazard Mapping Using Two-Dimensional Hydrodynamic Model Anuga: Case Study of Jakarta, Indonesia

Authors: Hengki Eko Putra, Dennish Ari Putro, Tri Wahyu Hadi, Edi Riawan, Junnaedhi Dewa Gede, Aditia Rojali, Fariza Dian Prasetyo, Yudhistira Satya Pribadi, Dita Fatria Andarini, Mila Khaerunisa, Raditya Hanung Prakoswa

Abstract:

Catastrophe risk management can only be done if we are able to calculate the exposed risks. Jakarta is an important city economically, socially, and politically, and at the same time it is exposed to severe floods. On the other hand, flood risk calculation is still very limited in the area. This study has calculated the risk of flooding for Jakarta using the 2-dimensional model ANUGA. The 2-dimensional model ANUGA and the 1-dimensional model HEC-RAS are used to calculate the risk of flooding from 13 major rivers in Jakarta. ANUGA can simulate the physical and dynamical processes between the streamflow and the river geometry and land cover to produce a 1-meter resolution inundation map. The streamflow values used as model input were obtained from hydrological analysis of rainfall data using the hydrologic model HEC-HMS. The probabilistic streamflow was derived from probabilistic rainfall using the Log-Pearson III, Normal and Gumbel statistical distributions, checked through compatibility tests using the Chi-Square and Kolmogorov-Smirnov tests. The 2007 flood event is used as a comparison to evaluate the accuracy of the model output. Property damage estimates were calculated based on flood depth for the 1, 5, 10, 25, 50, and 100-year return periods against housing value data from BPS-Statistics Indonesia and the Centre for Research and Development of Housing and Settlements, Ministry of Public Works Indonesia. The vulnerability factor was derived from flood insurance claims. Jakarta's flood loss estimates for the return periods of 1, 5, 10, 25, 50, and 100 years are, respectively, Rp 1.30 t; Rp 16.18 t; Rp 16.85 t; Rp 21.21 t; Rp 24.32 t; and Rp 24.67 t, against a total building value of Rp 434.43 t.
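
The probabilistic-rainfall step can be illustrated compactly: fit a Gumbel distribution to an annual-maximum rainfall series and read off design values at the return periods of interest, with a goodness-of-fit check. The rainfall record below is synthetic; the study also fitted Normal and Log-Pearson III distributions before driving HEC-HMS and the hydrodynamic models.

```python
# Sketch of probabilistic rainfall: fit a Gumbel distribution to synthetic annual
# maximum rainfall and evaluate design values for selected return periods.
from scipy import stats

annual_max_mm = stats.gumbel_r.rvs(loc=120, scale=35, size=30, random_state=7)  # synthetic 30-yr record

loc, scale = stats.gumbel_r.fit(annual_max_mm)
for T in (5, 10, 25, 50, 100):                       # return periods in years
    x_T = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"T = {T:>3} yr: design rainfall ≈ {x_T:.0f} mm")

# Goodness-of-fit check, analogous to the compatibility tests mentioned above
ks = stats.kstest(annual_max_mm, "gumbel_r", args=(loc, scale))
print("Kolmogorov-Smirnov statistic:", round(ks.statistic, 3), "p-value:", round(ks.pvalue, 3))
```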

Keywords: 2D hydrodynamic model, ANUGA, flood, flood modeling

Procedia PDF Downloads 272
1116 Teaching Computer Programming to Diverse Students: A Comparative, Mixed-Methods, Classroom Research Study

Authors: Almudena Konrad, Tomás Galguera

Abstract:

Lack of motivation and interest is a serious obstacle to students’ learning computing skills. A need exists for a knowledge base on effective pedagogy and curricula to teach computer programming. This paper presents results from research evaluating a six-year project designed to teach complex concepts in computer programming collaboratively, while supporting students to continue developing their computational thinking and related coding skills individually. Utilizing a quasi-experimental, mixed-methods design, the pedagogical approaches and methods were assessed in two contrasting groups of students with different socioeconomic status, gender, and age composition. Analyses of quantitative data from Likert-scale surveys and an evaluation rubric, combined with qualitative data from reflective writing exercises and semi-structured interviews, yielded convincing evidence of the project’s success at both teaching and inspiring students.

Keywords: computational thinking, computing education, computer programming curriculum, logic, teaching methods

Procedia PDF Downloads 311
1115 Positive Disruption: Towards a Definition of Artist-in-Residence Impact on Organisational Creativity

Authors: Denise Bianco

Abstract:

Several studies on innovation and creativity in organisations emphasise the need to expand horizons and take on alternative and unexpected views to produce something new. This paper theorises the potential impact artists can have as creative catalysts, working embedded in non-artistic organisations. It begins from an understanding that in today's ever-changing scenario, organisations are increasingly seeking to open up new creative thinking through deviant behaviours to produce innovation and that art residencies need to be critically revised in this specific context in light of their disruptive potential. On the one hand, this paper builds upon recent contributions made on workplace creativity and related concepts of deviance and disruption. Research suggests that creativity is likely to be lower in work contexts where utter conformity is a cardinal value and higher in work contexts that show some tolerance for uncertainty and deviance. On the other hand, this paper draws attention to Artist-in-Residence as a vehicle for epistemic friction between divergent and convergent thinking, which allows the creation of unparalleled ways of knowing in the dailiness of situated and contextualised social processes. In order to do so, this contribution brings together insights from the most relevant theories on organisational creativity and unconventional agile methods such as Art Thinking and direct insights from ethnographic fieldwork in the context of embedded art residencies within work organisations to propose a redefinition of Artist-in-Residence and their potential impact on organisational creativity. The result is a re-definition of embedded Artist-in-Residence in organisational settings from a more comprehensive, multi-disciplinary, and relational perspective that builds on three focal points. First the notion that organisational creativity is a dynamic and synergistic process throughout which an idea is framed by recurrent activities subjected to multiple influences. Second, the definition of embedded Artist-in-Residence as an assemblage of dynamic, productive relations and unexpected possibilities for new networks of relationality that encourage the recombination of knowledge. Third, and most importantly, the acknowledgment that embedded residencies are, at the very essence, bi-cultural knowledge contexts where creativity flourishes as the result of open-to-change processes that are highly relational, constantly negotiated, and contextualised in time and space.

Keywords: artist-in-residence, convergent and divergent thinking, creativity, creative friction, deviance and creativity

Procedia PDF Downloads 94
1114 Thinking Historiographically in the 21st Century: The Case of Spanish Musicology, a History of Music without History

Authors: Carmen Noheda

Abstract:

This text provides a reflection on the way of thinking about the study of the history of music by examining the production of historiography in Spain at the turn of the century. Based on concepts developed by the historical theorist Jörn Rüsen, the article focuses on the following aspects: the theoretical artifacts that structure the interpretation of the limits of writing the history of music, the narrative patterns used to give meaning to the discourse of history, and the orientation context that functions as a source of criteria of significance for both interpretation and representation. This analysis intends to show that historical music theory is not only a means to abstractly explore the complex questions connected to the production of historical knowledge, but also a tool for obtaining concrete images about the intellectual practice of professional musicologists. Writing about the historiography of contemporary Spanish music is a task that requires both a knowledge of the history that is being written and investigated, as well as a familiarity with current theoretical trends and methodologies that allow for the recognition and definition of the different tendencies that have arisen in recent decades. With the objective of carrying out these premises, this project takes as its point of departure the 'immediate historiography' in relation to Spanish music at the beginning of the 21st century. The hesitation that Spanish musicology has shown in opening itself to new anthropological and sociological approaches, along with its rigidity in the face of the multiple shifts in dynamic forms of thinking about history, have produced a standstill whose consequences can be seen in the delayed reception of the historiographical revolutions that have emerged in the last century. Methodologically, this essay is underpinned by Rüsen’s notion of the disciplinary matrix, which is an important contribution to the understanding of historiography. Combined with his parallel conception of differing paradigms of historiography, it is useful for analyzing the present-day forms of thinking about the history of music. Following these theories, the article will in the first place address the characteristics and identification of present historiographical currents in Spanish musicology to thereby carry out an analysis based on the theories of Rüsen. Finally, it will establish some considerations for the future of musical historiography, whose atrophy has not only fostered the maintenance of an ingrained positivist tradition, but has also implied, in the case of Spain, an absence of methodological schools and an insufficient participation in international theoretical debates. An update of fundamental concepts has become necessary in order to understand that thinking historically about music demands that we remember that subjects are always linked by reciprocal interdependencies that structure and define what it is possible to create. In this sense, the fundamental aim of this research departs from the recognition that the history of music is embedded in the conditions that make it conceivable, communicable and comprehensible within a society.

Keywords: historiography, Jörn Rüsen, Spanish musicology, theory of history of music

Procedia PDF Downloads 187
1113 Designing for Wearable Interactions: Exploring Care Design for Design Anthropology and Participatory Design

Authors: Wei-Chen Chang, Yu-Cheng Pei

Abstract:

This research examines wearable interaction design as a way to mediate between the design anthropology and participatory design found in technology and fashion. We discuss the principles of design anthropology and participatory design, using a wearable and fashion product process to convey the ‘people-situation-reason-object’ method, and analyze applied examples involving the five senses that provide new thinking for designers engaged in future industry. Design anthropology and participatory design attempt to engage physiological and psychological design through technology-function, meaning-form and fashion aesthetics to achieve cognition between user and environment. Wearable interaction provides technological characteristics and semantic ideas that translate into craft-cultural, collective, cheerful and creative performance. It is a more confident and innovative attempt that is able to achieve a joyful, fundamental interface. This study takes two directions of cultural thinking as the basis for establishing a set of life-craft designs with interactive experience objects for users, which assist designers in examining sensory feelings to initiate new lifestyle values.

Keywords: design anthropology, wearable design, design communication, participatory design

Procedia PDF Downloads 231
1112 Infusion of Skills for Undergraduate Scholarship into Teacher Education: Two Case Studies in New York and Florida

Authors: Tunde Szecsi, Janka Szilagyi

Abstract:

Students majoring in education are underrepresented in undergraduate scholarship. To enable and encourage teacher candidates to engage in scholarly activities, it is essential to infuse skills such as problem-solving, critical thinking, oral and written communication, collaboration and the utilization of information literacy, into courses in teacher preparation programs. In this empirical study, we examined two teacher education programs – one in New York State and one in Florida – in terms of the approaches of the course-based infusion of skills for undergraduate research, and the effectiveness of this infusion. First, course-related documents such as syllabi, assignment descriptions, and course activities were reviewed and analyzed. The goal of the document analysis was to identify and describe the targeted skills, and the pedagogical approaches and strategies for promoting research skills in teacher candidates. Next, a selection of teacher candidates’ scholarly products from the institution in Florida was used as a data set to examine teacher candidates’ skill development in the context of the identified assignments. This dataset was analyzed both quantitatively and qualitatively to describe the changes that occurred in teacher candidates’ critical thinking, communication, and information literacy skills, and to uncover patterns in the skill development at the two institutions. Descriptive statistics were calculated to explore the changes in these skills of teacher candidates over a period of three years. The findings based on data from the teacher education program in Florida indicated a steady gain in written communication and critical thinking and a modest increase in informational literacy. At the institution in New York, candidates’ submission and success rates on the edTPA, a New York State Teacher Certification exam, was used as a measure of scholarly skills. Overall, although different approaches were used for infusing the development of scholarly skills in the courses, the results suggest that a holistic and well-orchestrated infusion of the skills into most courses in the teacher education program might result in steadily developing scholarly skills. These results offered essential implications for teacher education programs in terms of further improvements in teacher candidates’ skills for engaging in undergraduate research and scholarship. In this presentation, our purpose is to showcase two approaches developed by two teacher education programs to demonstrate how diverse approaches toward the promotion of undergraduate scholarship activities are responsive to the context of the teacher preparation programs.

Keywords: critical thinking, pedagogical strategies, teacher education, undergraduate student research

Procedia PDF Downloads 157
1111 Doing Cause-and-Effect Analysis Using an Innovative Chat-Based Focus Group Method

Authors: Timothy Whitehill

Abstract:

This paper presents an innovative chat-based focus group method for collecting qualitative data to construct a cause-and-effect analysis in business research. This method was developed in response to the research and data collection challenges posed by the Covid-19 outbreak in the United Kingdom during 2020-21. The paper discusses the methodological approaches and builds a contemporary argument for the method's effectiveness in exploring cause-and-effect relationships in the context of focus group research, systems thinking and problem structuring methods. The pilot for this method was conducted between October 2020 and March 2021 and collected more than 7,000 words of chat-based data, which were used to construct a consensus-drawn cause-and-effect analysis. The method was developed in support of an ongoing Doctorate in Business Administration (DBA) thesis, which is using the Design Science Research methodology to operationalize organisational resilience in UK construction sector firms.

Keywords: cause-and-effect analysis, focus group research, problem structuring methods, qualitative research, systems thinking

Procedia PDF Downloads 216
1110 STEAM and Project-Based Learning: Equipping Young Women with 21st Century Skills

Authors: Sonia Saddiqui, Maya Marcus

Abstract:

UTS STEAMpunk Girls is an educational program for young women (aged 12-16), to empower them to be more informed and active members of the 21st century workforce. With the number of STEM graduates on the decline, especially among young women, an additional aim of the program is to trial a STEAM (Science, Technology, Engineering, Arts/Humanities/Social Sciences, Mathematics), inter-disciplinary approach to improving STEM engagement. In-line with UNESCO’s recent focus on promoting ‘transversal competencies’ in future graduates, the program utilised co-design, project-based learning, entrepreneurial processes, and inter-disciplinary learning. The program consists of two phases. Taking a participatory design approach, the first phase (co-design workshops) provided valuable insight into student perspectives around engaging young women in STEM and inter-disciplinary thinking. The workshops positioned 26 young women from three schools as subject matter experts (SMEs), providing a platform for them to share their opinions, experiences and findings around the STEAM disciplines. The second (pilot) phase put the co-design phase findings into practice, with 64 students from four schools working in groups to articulate problems with real-world implications, and utilising design-thinking to solve them. The pilot phase utilised project-based learning to engage young women in entrepreneurial and STEAM frameworks and processes. Scalable program design and educational resources were trialed to determine appropriate mechanisms for engaging young women in STEM and in STEAM thinking. Across both phases, data was collected via longitudinal surveys to obtain pre-program, baseline attitudinal information, and compare that against post-program responses. Preliminary findings revealed students’ improved understanding of the STEM disciplines, industries and professions, improved awareness of STEAM as a concept, and improved understanding regarding inter-disciplinary and design thinking. Program outcomes will be of interest to high-school educators in both STEM and the Arts, Humanities and Social Sciences fields, and will hopefully inform future programmatic approaches to introducing inter-disciplinary STEAM learning in STEM curriculum.

Keywords: co-design, STEM, STEAM, project-based learning, inter-disciplinary

Procedia PDF Downloads 195
1109 Using Action Research to Digitize Theses and Journal Articles at the Main Library, Sultan Qaboos University, Oman

Authors: Nabhan H. N. Al-Harrasi

Abstract:

Action Research (AR) plays an important role in improving problematic situations. It is a process that enhances thinking and practice and bridges the gap between abstract and concrete thinking. Nowadays, AR is widely used as a methodology to implement projects based on understanding the needs of owners, considering the organizational culture, meeting the requirements, encouraging partnership, representing different viewpoints, and building the project. This research describes the whole process of digitizing postgraduate theses and all articles published in six journals at Sultan Qaboos University. AR was implemented in response to the university's need to enhance access to its information resources and make them available through the national repository. In order to prepare the action plan, the library administration met to discuss several points related to the proposed project, the most important of which were: providing digitization devices; designating a specific part of the Library as a Digitization Unit; choosing a team; defining tasks; and implementing the proposed project and evaluating the whole process.

Keywords: action research, digitization, theses, journal articles, open access, Oman

Procedia PDF Downloads 175
1108 Role of Spatial Variability in the Service Life Prediction of Reinforced Concrete Bridges Affected by Corrosion

Authors: Omran M. Kenshel, Alan J. O'Connor

Abstract:

Estimating the service life of Reinforced Concrete (RC) bridge structures located in corrosive marine environments is of great importance to their owners/engineers. Traditionally, bridge owners/engineers have relied more on subjective engineering judgment, e.g. visual inspection, in their estimation approach. However, because financial resources are often limited, rational calculation methods of estimation are needed to aid in making reliable and more accurate predictions of the service life of RC structures, in order to direct funds to the bridges found to be the most critical. Criticality of the structure can be considered either from the Structural Capacity (i.e. Ultimate Limit State) viewpoint or from the Serviceability viewpoint, whichever is adopted. This paper considers the service life of the structure only from the Structural Capacity viewpoint. Considering the great variability associated with the parameters involved in the estimation process, a probabilistic approach is most suited. The probabilistic modelling adopted here used the Monte Carlo simulation technique to estimate the reliability (i.e. probability of failure) of the structure under consideration. In this paper the authors used their own experimental data for the Correlation Length (CL) of the most important deterioration parameters. The CL is a parameter of the Correlation Function (CF) by which the spatial fluctuation of a given deterioration parameter is described. The CL data used here were produced by analyzing 45 chloride profiles obtained from a 30-year-old RC bridge located in a marine environment. The service life of the structure was predicted in terms of the load carrying capacity of an RC bridge beam girder. The analysis showed that the influence of spatial variability (SV) is only evident if the reliability of the structure is governed by flexure failure rather than by shear failure.
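To make the simulation step concrete, the following Python sketch (NumPy/SciPy) estimates the probability of flexural failure of a girder discretised into segments whose chloride parameters are spatially correlated through an exponential correlation function governed by a correlation length. Every distribution, parameter value, the corrosion model and the limit-state function here are illustrative assumptions, not the authors' calibrated model or experimental data.

# Hedged Monte Carlo sketch of a probability-of-failure estimate with
# spatially correlated deterioration parameters. All numbers are assumed.
import numpy as np
from scipy.special import erfcinv

rng = np.random.default_rng(42)
n_sim = 20_000      # Monte Carlo trials
n_seg = 20          # girder discretised into segments along its length
seg_len = 0.5       # segment length (m), assumed
corr_len = 2.0      # assumed correlation length L_c (m)

# Exponential correlation function rho(dx) = exp(-|dx| / L_c) between segment centres.
x = np.arange(n_seg) * seg_len
rho = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
L = np.linalg.cholesky(rho + 1e-10 * np.eye(n_seg))

def correlated_lognormal(mean, cov):
    """Spatially correlated lognormal random field over the girder segments."""
    sigma_ln = np.sqrt(np.log(1.0 + cov ** 2))
    mu_ln = np.log(mean) - 0.5 * sigma_ln ** 2
    z = rng.standard_normal((n_sim, n_seg)) @ L.T   # correlated standard normals
    return np.exp(mu_ln + sigma_ln * z)

# Random fields (assumed statistics): surface chloride C_s (% binder), diffusion
# coefficient D (mm^2/yr), concrete cover (mm), chloride threshold C_cr (% binder).
C_s = correlated_lognormal(0.60, 0.30)
D = correlated_lognormal(30.0, 0.40)
cover = correlated_lognormal(50.0, 0.10)
C_cr = correlated_lognormal(0.40, 0.20)

# Corrosion initiation time per segment from Fick's second law:
# C(x, t) = C_s * erfc(x / (2*sqrt(D*t))) = C_cr, solved for t.
u = erfcinv(np.clip(C_cr / C_s, 1e-9, 1 - 1e-9))
t_init = (cover / (2.0 * np.sqrt(D) * u)) ** 2

# Bar section loss after initiation, assuming a uniform corrosion rate (mm/yr).
t_eval, d0 = 75.0, 25.0                     # evaluation time (yr), bar diameter (mm)
r_corr = correlated_lognormal(0.05, 0.30)
d_t = np.clip(d0 - 2.0 * r_corr * np.maximum(t_eval - t_init, 0.0), 0.0, d0)

# Flexural resistance taken proportional to remaining steel area; the girder is
# treated as a series system that fails at its weakest segment. Demand S is an
# assumed normal load effect (kN*m).
R0 = rng.normal(900.0, 90.0, size=(n_sim, 1))
R_t = R0 * (d_t / d0) ** 2
S = rng.normal(450.0, 70.0, size=n_sim)
p_f = np.mean(R_t.min(axis=1) < S)
print(f"Estimated probability of flexural failure at t = {t_eval:.0f} years: {p_f:.4f}")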

Keywords: chloride-induced corrosion, Monte Carlo simulation, reinforced concrete, spatial variability

Procedia PDF Downloads 470
1107 New Tools and New Ways: Changing the Nature of Leadership and Future Challenges

Authors: Harun Ozdemirci

Abstract:

Complexity and chaos are the characteristics of our new world today. In both the business and governmental sectors, the inner and outer environments are changing in all aspects. For leaders to guide organizations accurately and effectively, they must also change their attitudes towards this changing world. We need new tools, new mindsets and new views for the new century. Every leader has to operate with a creative and innovative way of thinking. But how will this occur, and in which direction will it be managed or directed? What kinds of abilities and attitudes make a leader compatible with this ever-changing and ambiguous environment? Leaders who will lead in the future must have some special skills. But how can we develop these skills and behaviours? What must the mindset of a future leader be? This paper searches for answers to some of these questions, although asking the questions is more important than answering them. Innovation and creativity have been at the center of our lives for some years, but we do not know how to manage and tackle the challenges that come with this new situation. This new world order compels us to take new positions towards new employees who have different types of lives and habits, new productivity processes, new adversaries… The future environment will not be the same as the one we experienced before, so our responses to it cannot be the same as those our predecessors gave. We have to invent new ways of thinking and new tools for solving new types of problems.

Keywords: innovation, creativity, leader, future, liberal arts

Procedia PDF Downloads 269
1106 Interior Design Pedagogy in the 21st Century: Personalised Design Process

Authors: Roba Zakariah Shaheen

Abstract:

In the 21st century, interior design pedagogy has developed rapidly due to social and economic factors. Socially, this paper presents research findings that show a significant relationship between educators and students in interior design education. It shows that students’ personal traits, design processes, and thinking processes are significantly interrelated. Constructively, this paper presents how personal traits can guide educators in the interior design education domain to develop students’ thinking processes. At the same time, it demonstrates how students should use their own personal traits to create their own design processes. Constructivism was the theory underpinning this research, and it supports grounded theory, which is the methodological approach of this research. Moreover, the Myers-Briggs Type Indicator was used to investigate personality traits scientifically, as a psychological instrument related to cognitive ability. Conclusions from this research strongly recommend that educators and students utilize their personal traits to foster interior design education.

Keywords: interior design, pedagogy, constructivism, grounded theory, personality traits, creativity

Procedia PDF Downloads 200