Search results for: computational thinking
1141 Computer Aided Diagnostic System for Detection and Classification of a Brain Tumor through MRI Using Level Set Based Segmentation Technique and ANN Classifier
Authors: Atanu K Samanta, Asim Ali Khan
Abstract:
Due to the acquisition of huge amounts of brain tumor magnetic resonance images (MRI) in clinics, it is very difficult for radiologists to manually interpret and segment these images within a reasonable span of time. Computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of radiologists and reduce the time required for accurate diagnosis. An intelligent computer-aided technique for automatic detection of a brain tumor through MRI is presented in this paper. The technique uses the following computational methods: the level set method for segmentation of the brain tumor from other brain parts, extraction of features from the segmented tumor portion using the gray-level co-occurrence matrix (GLCM), and an artificial neural network (ANN) to classify brain tumor images according to their respective types. The entire work is carried out on 50 images covering five types of brain tumor. The overall classification accuracy using this method is found to be 98%, which is significantly good.
Keywords: brain tumor, computer-aided diagnostic (CAD) system, gray-level co-occurrence matrix (GLCM), tumor segmentation, level set method
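As a rough illustration of the pipeline this abstract describes (segmentation, GLCM texture features, ANN classification), the sketch below uses scikit-image and scikit-learn. It is not the authors' implementation: the level-set segmentation is stubbed out with placeholder patches, all data and labels are invented for the example, and the function names assume a recent scikit-image release.

```python
# Hedged sketch of a GLCM + ANN pipeline (assumes recent scikit-image / scikit-learn;
# the segmentation step and all data below are placeholders, not the paper's code).
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPClassifier

def glcm_features(patch):
    """Texture features from an 8-bit grayscale patch (a stand-in for the
    level-set-segmented tumor region described in the abstract)."""
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "correlation", "energy", "homogeneity"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(0)
# Placeholder "segmented tumor" patches and labels for five tumor types.
X = np.vstack([glcm_features(rng.integers(0, 256, (64, 64), dtype=np.uint8))
               for _ in range(50)])
y = rng.integers(0, 5, 50)

ann = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
ann.fit(X, y)
print(ann.score(X, y))  # training accuracy on the toy data
```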
Procedia PDF Downloads 512
1140 Bone Fracture Detection with X-Ray Images Using MobileNet V3 Architecture
Authors: Ashlesha Khanapure, Harsh Kashyap, Abhinav Anand, Sanjana Habib, Anupama Bidargaddi
Abstract:
Rapidly evolving technologies are being developed daily in a variety of disciplines, particularly in the medical field. For the purpose of detecting bone fractures in X-ray pictures of different body segments, our work compares the ResNet-50 and MobileNetV3 architectures. It evaluates accuracy and computing efficiency with X-rays of the elbow, hand, and shoulder from the MURA dataset. Through training and validation, the models are evaluated on normal and fractured images. While ResNet-50 showcases superior accuracy in fracture identification, MobileNetV3 offers superior speed and resource optimization. Despite ResNet-50’s accuracy, MobileNetV3’s swifter inference makes it a viable choice for real-time clinical applications, emphasizing the importance of balancing computational efficiency and accuracy in medical imaging. We created a graphical user interface (GUI) for bone fracture detection with the MobileNetV3 model. This research underscores MobileNetV3’s potential to streamline bone fracture diagnoses, potentially revolutionizing orthopedic medical procedures and enhancing patient care.
Keywords: CNN, MobileNetV3, ResNet-50, healthcare, MURA, X-ray, fracture detection
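For context, a minimal fine-tuning setup along the lines this abstract describes might look like the sketch below. It is an assumption-laden illustration rather than the authors' code: the dataset folder, transforms, and training hyperparameters are invented, and the pretrained-weights API assumes torchvision 0.13 or later.

```python
# Hedged sketch: fine-tuning a pretrained MobileNetV3 for binary fracture detection.
# Dataset layout, hyperparameters, and paths are assumptions for illustration only.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),       # X-rays as 3-channel input
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
train_ds = datasets.ImageFolder("mura_subset/train", transform=tfm)  # hypothetical folder
loader = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

model = models.mobilenet_v3_large(weights=models.MobileNet_V3_Large_Weights.DEFAULT)
model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, 2)  # normal vs fractured

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()
model.train()
for images, labels in loader:          # one pass shown; several epochs in practice
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()
```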
Procedia PDF Downloads 63
1139 Simulation of the Asphaltene Deposition Rate in a Wellbore Blockage via Computational Fluid Dynamics
Authors: Xiaodong Gao, Pingchuan Dong, Qichao Gao
Abstract:
There has been a great deal of published work on asphaltene deposition on smooth pipes under steady conditions, while particle deposition in blocked wellbores under transient conditions has not been well elucidated. This work attempts to predict the deposition rate of asphaltene particles in a blocked tube through CFD simulation. The Euler-Lagrange approach has been applied to the flow of crude oil and asphaltene particles. The net gravitational force, virtual mass, pressure gradient, Saffman lift, and drag forces are incorporated in the simulation process. The CFD simulation results are validated against benchmark experiments from the previous literature. Furthermore, the effects of blockage location, blockage length, and blockage thickness on the deposition rate are also analyzed. The simulation results indicate that the maximum deposition rate of asphaltene occurs in the blocked tube section, and the greater the deposition thickness, the greater the deposition rate. Moreover, the deposition amount and the maximum deposition rate along the length of the tube follow the same trend. The results of this study help to better understand the deposition of asphaltene particles in production and to deal with asphaltene challenges.
Keywords: asphaltene deposition rate, blockage length, blockage thickness, blockage diameter, transient condition
Procedia PDF Downloads 201
1138 A Genetic Algorithm for the Load Balance of Parallel Computational Fluid Dynamics Computation with Multi-Block Structured Mesh
Authors: Chunye Gong, Ming Tie, Jie Liu, Weimin Bao, Xinbiao Gan, Shengguo Li, Bo Yang, Xuguang Chen, Tiaojie Xiao, Yang Sun
Abstract:
Large-scale CFD simulation relies on high-performance parallel computing, and load balance plays a key role in parallel efficiency. This paper focuses on the load-balancing problem of parallel CFD simulation with structured meshes. A mathematical model for this load-balancing problem is presented. The genetic algorithm, its fitness computation, and a two-level code are designed, along with an optimal selector, a robust operator, and a local optimization operator. The properties of the presented genetic algorithm are discussed in depth, and the effects of the optimal selector, robust operator, and local optimization operator are demonstrated by experiments. The experimental results on different test sets, the DLR-F4 case, and aircraft design applications show that the presented load-balancing algorithm is robust, converges quickly, and is useful in real engineering problems.
Keywords: genetic algorithm, load-balancing algorithm, optimal variation, local optimization
Procedia PDF Downloads 185
1137 Efficient Tuning Parameter Selection by Cross-Validated Score in High Dimensional Models
Authors: Yoonsuh Jung
Abstract:
As DNA microarray data contain a relatively small sample size compared to the number of genes, high-dimensional models are often employed. In high-dimensional models, the selection of the tuning parameter (or penalty parameter) is often one of the crucial parts of the modeling. Cross-validation is one of the most common methods for tuning parameter selection, which selects the parameter value with the smallest cross-validated score. However, selecting a single value as the "optimal" value for the parameter can be very unstable due to sampling variation, since the sample sizes of microarray data are often small. Our approach is to choose multiple candidate tuning parameters first and then average the candidates with different weights depending on their performance. The additional step of estimating the weights and averaging the candidates rarely increases the computational cost, while it can considerably improve on traditional cross-validation. We show, via real and simulated data sets, that the value selected by the suggested methods often leads to more stable parameter selection as well as improved detection of significant genetic variables compared to traditional cross-validation.
Keywords: cross validation, parameter averaging, parameter selection, regularization parameter search
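A minimal sketch of the idea described above is given below; it is an illustration, not the authors' procedure. It keeps several good candidate penalty values instead of a single "optimal" one and averages them with weights derived from their cross-validated scores. The toy data, weighting scheme, and number of retained candidates are assumptions made for the example.

```python
# Hedged illustration: weighted averaging of several cross-validated tuning parameters
# for a high-dimensional (small n, large p) regression, instead of picking one value.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=40, n_features=200, n_informative=5,
                       noise=1.0, random_state=0)       # microarray-like shape
lambdas = np.logspace(-2, 1, 20)
scores = np.array([cross_val_score(Lasso(alpha=lam, max_iter=10000), X, y, cv=5).mean()
                   for lam in lambdas])                  # mean cross-validated score

top = np.argsort(scores)[-5:]                      # keep several good candidates
weights = np.exp(scores[top] - scores[top].max())  # better score -> larger weight
weights /= weights.sum()
lambda_avg = float(np.dot(weights, lambdas[top]))  # weighted-average tuning parameter
print(lambda_avg)
```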
Procedia PDF Downloads 415
1136 Causes, Consequences, and Alternative Strategies of Illegal Migration in Ethiopia: The Case of Tigray Region
Authors: Muuz Abraha Meshesha
Abstract:
Illegal migration, and specifically trafficking in persons, is one of the primary issues of the day, affecting all states of the world, with variation in the extent of the root causes that lead people to migrate irregularly and of the consequences it imposes on humanity. This paper investigates the root causes and consequences of illegal migration in Ethiopia’s Tigray Regional State and proposes an alternative intervention strategy. To produce pertinent and robust findings, the study employed a mixed research approach involving qualitative and quantitative data, with purposive and snowball sampling techniques. The study revealed that, although poverty is the most commonly perceived push factor for illegal migration, the psycho-social orientation and attitudinal immersion of the local community toward illegal migration, in both thinking and action, is the most pressing problem demanding serious intervention. Trafficking in persons, and illegal migration in general, is becoming the norm in the study area, which overtly reveals that, in practice, illegal migration is an issue that goes beyond the demand for a secure livelihood. Parties engaged in illegal migration and complicity with human traffickers in the study area are found to be driven by more than food insecurity and the need to escape livelihood impoverishment. This study therefore offers a new paradigm insight: illegal migration is seen by local community members as an optional pathway of doing business illegally, while the attitude of the community and of the officials authorized to regulate it makes them part of the channel, or at the least tolerant of this grave global danger. The study also found that the effects of illegal migration are manifested more significantly in the long run than in the short term. Critical consideration of attitude-based interventions and of a youth-oriented, enforceable legal, policy, and accountability framework is therefore required for international, national, and local stakeholders to confront and control illegal migration, alongside economy-based development interventions that can engage and reorient the youth, as the primary victims of trafficking, and the expansion of large-scale projects that can employ large numbers of youths at a time.
Keywords: human trafficking, illegal migration, migration, Tigray region
Procedia PDF Downloads 65
1135 Assessing the Social Impacts of a Circular Economy in the Global South
Authors: Dolores Sucozhañay, Gustavo Pacheco, Paul Vanegas
Abstract:
In the context of sustainable development and the transition towards a sustainable circular economy (CE), evaluating the social dimension remains a challenge. Therefore, developing a suitable methodology is highly important. First, the change of the economic model may cause significant social effects, which today remain unaddressed. Second, given the current level of globalization, CE implementation requires targeting global material cycles and causes social impacts on potentially vulnerable social groups. A promising methodology is the Social Life Cycle Assessment (SLCA), which embraces the philosophy of life cycle thinking and provides complementary information to environmental and economic assessments. In this context, the present work uses the updated Social Life Cycle Assessment (SLCA) Guidelines 2020 to assess the social performance of the recycling system of Cuenca, Ecuador, to exemplify a social assessment method. Like many other developing countries, Ecuador heavily depends on the work of informal waste pickers (recyclers), who, even while contributing to a CE, face harsh socio-economic circumstances, including inappropriate working conditions, social exclusion, exploitation, etc. Under a Reference Scale approach (Type 1), 12 impact subcategories were assessed through 73 site-specific inventory indicators, using an ascending reference scale ranging from -2 to +2. Findings reveal a social performance below compliance levels with local and international laws, basic societal expectations, and practices in the recycling sector; only eight and five indicators present a positive score. In addition, a social hotspot analysis depicts collection as the most time-consuming lifecycle stage and the one with the most hotspots, mainly related to working hours and health and safety aspects. This study provides an integrated view of the recyclers’ contributions, challenges, and opportunities within the recycling system while highlighting the relevance of assessing the social dimension of CE practices. It also fosters an understanding of the social impact of CE operations in developing countries, highlights the need for a close north-south relationship in CE, and enables the connection among the environmental, economic, and social dimensions.
Keywords: SLCA, circular economy, recycling, social impact assessment
Procedia PDF Downloads 151
1134 An Inexhaustible Will of Infinite, or the Creative Will in the Psychophysiological Artistic Practice: An Analysis through Nietzsche's Will to Power
Authors: Filipa Cruz, Grecia P. Matos
Abstract:
An Inexhaustible Will of Infinite is an ongoing practice-based research project focused on a psychophysiological conception of the body and on the creative will, which seeks to examine the possibility of art being simultaneously a pacifier and an intensifier in a physiological artistic production. This is a study where philosophy and art converge in a commentary on how the concept of will to power affects the art world, through Nietzsche’s commentaries, the analysis of case studies, and a reflection arising from artistic practice. Through Nietzsche, the aim is to compare concepts that communicate with artistic practice, since creation is an intensification and engenders perspectives. It is also a practice highly embedded in the body, in the non-verbal, in the physiology of art, and in the coexistence between the sensorial and the thought. It is asked whether the physiology of art could be thought of as a thinking-feeling with no primacy of the thought over the sensorial. Art, as a manifestation of the will to power, participates in a comprehension of the world. In this article, art is taken as a privileged way of communication, implicating the corporeal, the sensorial, and the conceptual, and of connection between humans. The dream and drunkenness are problematized as intensifications and expressions of the comprehension of life. Therefore, art is perceived as suggestion and invention, where artistic intoxication breaks limits in the experience of life, and the artist, dominated by creative forces, claims, orders, obeys, and proclaims love for life. The intention is also to consider how one can start from pain to create and how one can generate new and endless artistic forms through nightmares, daydreams, impulses, intoxication, enhancement, and intensification in a plurality of subjects and matters. It is taken into consideration that artistic creation is something that is intensified corporeally, expanded, continuously generated, and acting on bodies. It is inextinguishable and a constant movement intertwining the Apollonian and Dionysian instincts of destruction and creation of new forms. The concept of love also appears associated with conquering, which, in a process of intensification and drunkenness, impels the artist to generate and to transform matter. Just like a love relationship, love in Nietzsche requires time, patience, effort, courage, conquest, seduction, obedience, and command, potentiating the amplification of knowledge of the other / the world. Interlacing Nietzsche's philosophy not with Modern Art but with Contemporary Art, it is argued that intoxication, will to power (strongly connected with the creative will), and love still have a place in artistic production as creative agents.
Keywords: artistic creation, body, intensification, psychophysiology, will to power
Procedia PDF Downloads 119
1133 Learning Mathematics Online: Characterizing the Contribution of Online Learning Environment’s Components to the Development of Mathematical Knowledge and Learning Skills
Authors: Atara Shriki, Ilana Lavy
Abstract:
Teaching an online course dealing with the history of mathematics for the first time, we struggled with questions related to the design of a proper learning environment (LE). Thirteen high school mathematics teachers, M.Ed. students, attended the course. The teachers were engaged in independent reading of mathematical texts, a task that is recognized as complex due to the unique characteristics of such texts. In order to support the learning processes and develop skills that are essential for succeeding in learning online (e.g. self-regulated learning skills, meta-cognitive skills, reflective ability, and self-assessment skills), the LE comprised three components aimed at “scaffolding” the learning: (1) online "self-feedback" questionnaires that included drill-and-practice questions; after responding to the questions, the online system provided a grade and the teachers were allowed to correct their answers; (2) open-ended questions aimed at stimulating critical thinking about the mathematical contents; (3) reflective questionnaires designed to assist the teachers in steering their learning. Using a mixed-method methodology, an inquiry study examined the learning processes, the learners' difficulties in reading the mathematical texts, and the unique contribution of each component of the LE to the teachers' ability to comprehend the mathematical contents and to the development of their learning skills. The results indicate that the teachers found the online feedback most helpful in developing self-regulated learning skills and the ability to reflect on deficiencies in knowledge. Lacking previous experience in expressing opinions on mathematical ideas, the teachers had trouble responding to the open-ended questions; however, they perceived this assignment as nurturing cognitive and meta-cognitive skills. The teachers also attested that the reflective questionnaires were useful for steering the learning. Although in general the teachers found the LE supportive, most of them indicated the need to strengthen instructor-learner and learner-learner interactions. They suggested creating an online forum to enable them to receive direct feedback from the instructor, share ideas with other learners, and consult with them about solutions. Apparently, within an online LE, supporting learning merely with respect to cognitive aspects is not sufficient. Learners also need emotional support and a sense of social presence.
Keywords: cognitive and meta-cognitive skills, independent reading of mathematical texts, online learning environment, self-regulated learning skills
Procedia PDF Downloads 620
1132 Effect of the Cross-Sectional Geometry on Heat Transfer and Particle Motion of Circulating Fluidized Bed Riser for CO2 Capture
Authors: Seungyeong Choi, Namkyu Lee, Dong Il Shim, Young Mun Lee, Yong-Ki Park, Hyung Hee Cho
Abstract:
The effect of the cross-sectional geometry on heat transfer and particle motion in a circulating fluidized bed riser for CO2 capture was investigated. A numerical simulation using the Eulerian-Eulerian method with the kinetic theory of granular flow was adopted to analyze the gas-solid flow in the circulating fluidized bed riser. Circular, square, and rectangular cross-sectional geometries of the same area were studied, with the rectangular cross sections analyzed at aspect ratios of 1:2, 1:4, 1:8, and 1:16. The cross-sectional geometry significantly influenced the particle motion and heat transfer: the downward flow pattern of solid particles near the wall changed, the gas-solid mixing degree was lowest in the riser with the high-aspect-ratio rectangular cross section, and the bed-to-wall heat transfer coefficient differed among rectangular geometries with different aspect ratios.
Keywords: bed geometry, computational fluid dynamics, circulating fluidized bed riser, heat transfer
Procedia PDF Downloads 260
1131 Formulation of Optimal Shifting Sequence for Multi-Speed Automatic Transmission
Authors: Sireesha Tamada, Debraj Bhattacharjee, Pranab K. Dan, Prabha Bhola
Abstract:
The most important component in an automotive transmission system is the gearbox, which controls the speed of the vehicle. In an automatic transmission, the right positioning of actuators ensures an efficient embodiment of the transmission mechanism, and the challenge lies in formulating the number of actuators associated with modelling a gearbox. Data on actuation and gear-shifting sequences have been retrieved from the available literature, including patent documents, and used in the proposed heuristics-based methodology for modelling the actuation sequence in a gearbox. This paper presents a methodological approach to designing a gearbox for the purpose of obtaining an optimal shifting sequence. The computational model considers the number of stages and the gear teeth as input parameters, since these two are the determinants of the gear ratios in an epicyclic gear train. The proposed transmission schematic, or stick diagram, aids in developing the gearbox layout design. The number of iterations and the development time required to design a gearbox layout are reduced by using this approach.
Keywords: automatic transmission, gear-shifting, multi-stage planetary gearbox, rank ordered clustering
Procedia PDF Downloads 325
1130 Numerical Study of Effects of Air Dam on the Flow Field and Pressure Distribution of a Passenger Car
Authors: Min Ye Koo, Ji Ho Ahn, Byung Il You, Gyo Woo Lee
Abstract:
Everything attached to the outside of a vehicle to improve its driving performance by changing the flow characteristics of the surrounding air, or to give it an external personality, is called a tuning part. Typical tuning components include the front or rear air dam (also known as a spoiler), the splitter, and the side air dam. In particular, the front air dam prevents airflow from entering the lower portion of the vehicle and increases the airflow to the side and front of the vehicle body, thereby reducing the lift force acting on the body and thus improving the steering and driving performance of the vehicle. The purpose of this study was to investigate the role of the front air dam in the flow around a sedan passenger car using computational fluid dynamics. The effects of the flow velocity and the trajectory of fluid particles on the static pressure distribution and the pressure distribution on the body surface were investigated by varying the flow velocity and the size of the air dam. As a result, it was confirmed that the front air dam improves the flow characteristics, thereby reducing the lift force generated on the vehicle and helping its steering and driving characteristics.
Keywords: numerical study, air dam, flow field, pressure distribution
Procedia PDF Downloads 205
1129 An Efficient Algorithm for Solving the Transmission Network Expansion Planning Problem Integrating Machine Learning with Mathematical Decomposition
Authors: Pablo Oteiza, Ricardo Alvarez, Mehrdad Pirnia, Fuat Can
Abstract:
To effectively combat climate change, many countries around the world have committed to decarbonising their electricity supply, along with promoting a large-scale integration of renewable energy sources (RES). While this trend represents a unique opportunity, achieving a sound and cost-efficient energy transition towards low-carbon power systems poses significant challenges for the multi-year Transmission Network Expansion Planning (TNEP) problem. The objective of the multi-year TNEP is to determine the necessary network infrastructure to supply the projected demand in a cost-efficient way, considering the evolution of the new generation mix, including the integration of RES. The rapid integration of large-scale RES increases the variability and uncertainty in power system operation, which in turn increases short-term flexibility requirements. To meet these requirements, flexible generating technologies such as energy storage systems must be considered within the TNEP as well, along with proper models for capturing the operational challenges of future power systems. As a consequence, TNEP formulations are becoming more complex and difficult to solve, especially for application to realistic-sized power system models. To meet these challenges, there is an increasing need for efficient algorithms capable of solving the TNEP problem with reasonable computational time and resources. In this regard, a promising research area is the use of artificial intelligence (AI) techniques for solving large-scale mixed-integer optimization problems, such as the TNEP. In particular, the use of AI along with mathematical optimization strategies based on decomposition has shown great potential. In this context, this paper presents an efficient algorithm for solving the multi-year TNEP problem. The algorithm combines AI techniques with Column Generation, a traditional decomposition-based mathematical optimization method. One of the challenges of using Column Generation for solving the TNEP problem is that the subproblems are of mixed-integer nature, and therefore solving them requires significant amounts of time and resources. Hence, in this proposal we solve a linearly relaxed version of the subproblems and train a binary classifier that determines the values of the binary variables based on the results obtained from the linearized version. A key feature of the proposal is that we integrate the binary classifier into the optimization algorithm in such a way that the optimality of the solution can be guaranteed. The results of a case study based on the HRP 38-bus test system show that the binary classifier has an accuracy above 97% for estimating the values of the binary variables. Since the linearly relaxed version of the subproblems can be solved in significantly less time than the integer programming counterpart, the integration of the binary classifier into the Column Generation algorithm allowed us to reduce the computational time required for solving the problem by 50%. The final version of this paper will contain a detailed description of the proposed algorithm, the AI-based binary classifier technique, and its integration into the CG algorithm. To demonstrate the capabilities of the proposal, we evaluate the algorithm in case studies with different scenarios, as well as in other power system models.
Keywords: integer optimization, machine learning, mathematical decomposition, transmission planning
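A highly simplified sketch of the idea described in this abstract, solving a linear relaxation of a subproblem and letting a trained classifier fix the binary variables, is given below. Everything in it (the toy LP, the per-variable features, the classifier choice) is an assumption made for illustration; it is not the authors' TNEP formulation or code, and it assumes a recent SciPy and scikit-learn.

```python
# Hedged illustration: LP relaxation of a small investment-style subproblem plus a
# classifier that rounds the relaxed binaries. Toy data and toy model only.
import numpy as np
from scipy.optimize import linprog
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def solve_relaxed(c, A_ub, b_ub):
    """Solve the subproblem with binary variables relaxed to the interval [0, 1]."""
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * len(c), method="highs")
    return res.x

# Pretend training set: per-variable features (relaxed value, cost-like term)
# labelled with binary values from previously solved exact subproblems.
X_train = rng.random((300, 2))
y_train = (X_train[:, 0] > 0.5).astype(int)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# One "new" subproblem: negative costs so the relaxation is non-trivial.
c = -rng.random(5)
A_ub, b_ub = rng.random((3, 5)), rng.random(3) + 1.0
x_rel = solve_relaxed(c, A_ub, b_ub)
x_bin = clf.predict(np.column_stack([x_rel, -c]))   # candidate binary investment plan
print(x_rel.round(2), x_bin)
```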
Procedia PDF Downloads 85
1128 Numerical Study of Off-Design Performance of a Highly Loaded Low Pressure Turbine Cascade
Authors: Shidvash Vakilipour, Mehdi Habibnia, Rouzbeh Riazi, Masoud Mohammadi, Mohammad H. Sabour
Abstract:
The flow field passing through a highly loaded low pressure (LP) turbine cascade is numerically investigated at design and off-design conditions. The open-source Field Operation And Manipulation (OpenFOAM) platform is used as the computational fluid dynamics (CFD) tool. Firstly, the influences of grid resolution on the results of the k-ε, k-ω, and LES turbulence models are investigated and compared with experimental measurements. A numerical pressure under-shoot appears near the end of the blade pressure surface, which is sensitive to grid resolution and turbulence modeling. The LES model is able to resolve separation on both coarse and fine grid resolutions. Secondly, the off-design flow condition is modeled by negative and positive inflow incidence angles. The numerical experiments show that the separation bubble generated on the blade pressure side is predicted by LES. The total pressure drop has also been calculated at incidence angles between -20° and +8°. The minimum total pressure drop is obtained by k-ω and LES at the design point.
Keywords: low pressure turbine, off-design performance, OpenFOAM, turbulence modeling, flow separation
Procedia PDF Downloads 362
1127 Causation and Criminal Responsibility
Authors: László Schmidt
Abstract:
“Post hoc ergo propter hoc” means “after it, therefore because of it”. In other words: if event Y followed event X, then event Y must have been caused by event X. The question of causation has long been a central theme in philosophical thought, and many different theories have been put forward. However, causality is an essentially contested concept (ECC), as it has no universally accepted definition and is used differently in everyday, scientific, and legal thinking. In the field of law, the question of causality arises mainly in the context of establishing legal liability: in criminal law and in the rules of civil law on liability for damages arising either from breach of contract or from tort. In this study, some philosophical theories of causality will be presented, along with how these theories correlate with legal causality. It is quite interesting when philosophical abstractions meet the pragmatic demands of jurisprudence. In Hungarian criminal judicial practice, the principle of equivalence of conditions is the generally accepted and applicable standard of causation, where all necessary conditions are considered equivalent and thus a cause. The idea is that without the trigger, the subsequent outcome would not have occurred; all the conditions that led to the subsequent outcome are equivalent. In the case where the trigger that led to the result is accompanied by an additional intervening cause, including an accidental one, independent of the perpetrator, the causal link is not broken, but at most the causal link becomes looser. The importance of the intervening causes in the outcome should be given due weight in the imposition of the sentence. According to court practice, if the conduct of the offender sets in motion the causal process which led to the result, his criminal liability is not excluded and the causal process is not interrupted even if other factors, such as the victim's illness, may have contributed to it. The concausa does not break the chain of causation, i.e., the existence of a causal link establishes the criminal liability of the offender. Courts also hold that an act is a cause of the result if the act cannot be omitted without the result also being omitted. This essentially assumes a hypothetical elimination procedure, i.e., the act is omitted in thought and it is then examined whether the result would still occur or would also be omitted. On the substantive side, the essential condition for establishing the offence is that the result must be demonstrably connected with the activity committed. The provision on the assessment of the facts beyond reasonable doubt must also apply to the causal link: that is to say, the uncertainty of the causal link between the conduct and the result of the offence precludes the perpetrator from being held liable for the result. Sometimes, however, the courts do not specify in the reasons for their judgments what standard of causation they apply, i.e., on what basis they establish the existence of (legal) causation.
Keywords: causation, Hungarian criminal law, responsibility, philosophy of law
Procedia PDF Downloads 39
1126 A Local Invariant Generalized Hough Transform Method for Integrated Circuit Visual Positioning
Authors: Wei Feilong
Abstract:
In this study, a local invariant generalized Hough transform (LI-GHT) method is proposed for integrated circuit (IC) visual positioning. The original generalized Hough transform (GHT) is robust to external noise; however, it is not suitable for visual positioning of IC chips due to the four-dimensionality (4D) of its parameter space, which leads to substantial storage requirements and high computational complexity. The proposed LI-GHT method can reduce the dimensionality of the parameter space to 2D thanks to the rotational invariance of local invariant geometric features, and it can estimate the accurate position and rotation angle of IC chips in real time under the influence of noise and blur. The experimental results show that the proposed LI-GHT can estimate the position and rotation angle of IC chips with high accuracy and fast speed. The proposed LI-GHT algorithm was implemented in the IC visual positioning system of radio frequency identification (RFID) packaging equipment.
Keywords: integrated circuit visual positioning, generalized Hough transform, local invariant generalized Hough transform, IC packaging equipment
Procedia PDF Downloads 264
1125 Recommendations for Teaching Word Formation for Students of Linguistics Using Computer Terminology as an Example
Authors: Svetlana Kostrubina, Anastasia Prokopeva
Abstract:
This research presents a comprehensive study of the word-formation processes in computer terminology within the English and Russian languages and provides learners with a system of exercises for training these skills. The originality is that this study focuses on a comparative approach, which shows both general patterns and specific features of English and Russian computer term word formation. The key point is the development of a system of exercises for training computer terminology based on Bloom’s taxonomy. The data contain 486 units (228 English terms from the Glossary of Computer Terms and 258 Russian terms from the Terminological Dictionary-Reference Book). The objective is to identify the main affixation models in English and Russian computer term formation and to develop exercises. To achieve this goal, the authors employed Bloom’s taxonomy as a methodological framework to create a systematic exercise program aimed at enhancing students’ cognitive skills in analyzing, applying, and evaluating computer terms. The exercises are appropriate for various levels of learning, from basic recall of definitions to higher-order thinking skills, such as synthesizing new terms and critically assessing their usage in different contexts. The methodology also includes: a method of scientific and theoretical analysis for the systematization of linguistic concepts and clarification of the conceptual and terminological apparatus; a method of nominative and derivative analysis for identifying word-formation types; a method of word-formation analysis for organizing linguistic units; a classification method for determining structural types of abbreviations applicable to the field of computer communication; a quantitative analysis technique for determining the productivity of methods for forming abbreviations of computer vocabulary based on the English and Russian computer terms; a technique of tabular data processing for a visual presentation of the results obtained; and a technique of interlingual comparison for identifying common and distinct features of abbreviations of computer terms in the Russian and English languages. The research shows that affixation retains its productivity in English and Russian computer term formation. Bloom’s taxonomy allows us to plan a training program and predict the effectiveness of the compiled program based on the assessment of the teaching methods used.
Keywords: word formation, affixation, computer terms, Bloom's taxonomy
Procedia PDF Downloads 12
1124 Forced-Choice Measurement Models of Behavioural, Social, and Emotional Skills: Theory, Research, and Development
Authors: Richard Roberts, Anna Kravtcova
Abstract:
Introduction: The realisation that personality can change over the course of a lifetime has led to a new companion model to the Big Five, the behavioural, emotional, and social skills approach (BESSA). BESSA hypothesizes that this set of skills represents how the individual is thinking, feeling, and behaving when the situation calls for it, as opposed to traits, which represent how someone tends to think, feel, and behave averaged across situations. The five major skill domains share parallels with the Big Five Factor (BFF) model: creativity and innovation (openness), self-management (conscientiousness), social engagement (extraversion), cooperation (agreeableness), and emotional resilience (emotional stability) skills. We point to noteworthy limitations in the current operationalisation of BESSA skills (i.e., via Likert-type items) and offer up a different measurement approach: forced choice. Method: In this forced-choice paradigm, individuals were given three skill items (e.g., managing my time) and asked to select the one they believed they were “best at” and the one they were “worst at”. The Thurstonian IRT models allow these to be placed on a normative scale. Two multivariate studies (N = 1178) were conducted with a 22-item forced-choice version of the BESSA, a published measure of the BFF, and various criteria. Findings: Confirmatory factor analysis of the forced-choice assessment showed acceptable model fit (RMSEA<0.06), while reliability estimates were reasonable (around 0.70 for each construct). Convergent validity evidence was as predicted (correlations between 0.40 and 0.60 for corresponding BFF and BESSA constructs). Notable was the extent to which the forced-choice BESSA assessment improved upon test-criterion relationships over and above the BFF. For example, typical regression models find BFF personality accounting for 25% of the variance in life satisfaction scores; both studies showed incremental gains over the BFF exceeding 6% (i.e., BFF and BESSA together accounted for over 31% of the variance in both studies). Discussion: Forced-choice measurement models offer up the promise of creating equated test forms that may unequivocally measure skill gains and are less prone to fakability and reference bias effects. Implications for practitioners are discussed, especially those interested in selection, succession planning, and training and development. We also discuss how the forced-choice method can be applied to other constructs like emotional immunity, cross-cultural competence, and self-estimates of cognitive ability.
Keywords: Big Five, forced-choice method, BFF, methods of measurements
Procedia PDF Downloads 94
1123 Conceptualizing Personalized Learning: Review of Literature 2007-2017
Authors: Ruthanne Tobin
Abstract:
As our data-driven, cloud-based, knowledge-centric lives become ever more global, mobile, and digital, educational systems everywhere are struggling to keep pace. Schools need to prepare students to become critical-thinking, tech-savvy, life-long learners who are engaged and adaptable enough to find their unique calling in a post-industrial world of work. Recognizing that no nation can afford poor achievement or high dropout rates without jeopardizing its social and economic future, the thirty-two nations of the OECD are launching initiatives to redesign schools, generally under the banner of Personalized Learning or 21st Century Learning. Their intention is to transform education by situating students as co-enquirers and co-contributors with their teachers of what, when, and how learning happens for each individual. In this focused review of the 2007-2017 literature on personalized learning, the author sought answers to two main questions: “What are the theoretical frameworks that guide personalized learning?” and “What is the conceptual understanding of the model?” Ultimately, the review reveals that, although the research area is overly theorized and under-substantiated, it does provide a significant body of knowledge about this potentially transformative educational restructuring. For example, it addresses the following questions: a) What components comprise a PL model? b) How are teachers facilitating agency (voice and choice) in their students? c) What kinds of systems, processes and procedures are being used to guide the innovation? d) How is learning organized, monitored and assessed? e) What role do inquiry-based models play? f) How do teachers integrate the three types of knowledge: content, pedagogical and technological? g) Which kinds of forces enable, and which impede, personalizing learning? h) What is the nature of the collaboration among teachers? i) How do teachers co-regulate differentiated tasks? One finding of the review shows that while technology can dramatically expand access to information, its impact on teaching and learning often falls short of expectations unless the technologies are paired with excellent pedagogies that address students’ needs, interests and aspirations. This literature review fills a significant gap in this emerging field of research, as it serves to increase the conceptual clarity whose absence has hampered both the theorizing and the classroom implementation of a personalized learning model.
Keywords: curriculum change, educational innovation, personalized learning, school reform
Procedia PDF Downloads 223
1122 CFD Simulation and Experimental Validation of the Bubble-Induced Flow during Electrochemical Water Splitting
Authors: Gabriel Wosiak, Jeyse da Silva, Sthefany S. Sena, Renato N. de Andrade, Ernesto Pereira
Abstract:
Bubble formation during hydrogen production by electrolysis and several other electrochemical processes is an inherent phenomenon and can impact the energy consumption of these processes. In this work, both experimental and computational results are reported that describe the effect of bubble displacement, which, in the cases investigated, leads to the formation of a convective flow in the solution. The process is self-sustained, and a solution vortex is formed, which modifies the bubble growth and coverage at the electrode surface. Using the experimental data, we have built a model to simulate it, which describes the phenomena with high accuracy. We then simulated many different experimental conditions and evaluated the effects of the boundary conditions on the bubble coverage of the surface. We have observed a position-dependent bubble coverage of the surface, which has an effect on the water-splitting efficiency. It was shown that the bubble coverage is not uniform over the electrode surface, and, using statistical analysis, it was possible to evaluate the influence of the gas type (H2 and O2), current density, and bubble size (and their cross-effects) on the covering fraction and the asymmetric behavior over the electrode surface.
Keywords: water splitting, bubble, electrolysis, hydrogen production
Procedia PDF Downloads 100
1121 CFD Analysis of Ammonia/Hydrogen Combustion Performance under Partially Premixed and Non-premixed Modes with Varying Inlet Characteristics
Authors: Maria Alekxandra B. Sison, Reginald C. Mallare, Joseph Albert M. Mendoza
Abstract:
Ammonia (NH₃) is the alternative carbon-free fuel of the future for its promising applications. Investigations on NH₃ fuel blends recommend using hydrogen (H₂) to increase the heating value of NH₃, promote combustion performance, and improve NOx efflux mitigation. To further examine the effects of this concept, the study analyzed the combustion performance, in terms of turbulence, combustion efficiency (CE), and NOx emissions, of the NH₃/H₂ fuel with variations of combustor diameter ratio, H₂ fuel mole fraction, and fuel mass flow rate (ṁ). The simulations were performed using Computational Fluid Dynamics (CFD) modeling to represent non-premixed (NP) and partially premixed (PP) combustion in a two-dimensional ultra-low-NOx Rich-Burn, Quick-Quench, Lean-Burn (RQL) combustor. Governed by the Detached Eddy Simulation model, the results show that the diameter ratio greatly affects the turbulence in both PP and NP modes, whereas ṁ in PP should be prioritized when increasing CE. The NOx emission is minimal during PP combustion, but for NP combustion the results suggest modifying ṁ to achieve a higher CE and Reynolds number without sacrificing the NO generation from the reaction.
Keywords: combustion efficiency, turbulence, dual-stage combustor, NOx emission
Procedia PDF Downloads 104
1120 EMI Shielding in Carbon Based Nanocomposites
Authors: Mukul Kumar Srivastava, Sumit Basu
Abstract:
Carbon fiber reinforced polymer (CFRP) composites find wide use in the space and aerospace industries primarily due to their favourable strength-to-weight ratios. However, in spite of the impressive mechanical properties, their ability to shield sophisticated electronics from electromagnetic interference (EMI) is rather limited. As a result, metallic wire meshes or metal foils are often embedded in CFRP composites to provide adequate EMI shielding. This comes at additional manufacturing cost, increased weight and, particularly in cases of aluminium, increased risk of galvanic corrosion in the presence of moisture. In this work, we will explore ways of enhancing EMI shielding of CFRP laminates in the 8-12 GHz range (the so-called X-band), without compromising their mechanical and fracture properties, through minimal modifications to their current well-established fabrication protocol. The computational-experimental study of EMI shielding in CFRP laminates will focus on the effects of incorporating multiwalled carbon nanotubes (MWCNT) and conducting nanoparticles in different ways in the resin and/or carbon fibers. We will also explore the possibility of utilising the excellent absorbing properties of MWCNT reinforced polymer foams to enhance the overall EMI shielding capabilities.
Keywords: EMI shielding, X-band, CFRP, MWCNT
Procedia PDF Downloads 83
1119 Virtual Academy Next: Addressing Transition Challenges Through a Gamified Virtual Transition Program for Students with Disabilities
Authors: Jennifer Gallup, Joel Bocanegra, Greg Callan, Abigail Vaughn
Abstract:
Students with disabilities (SWD) engaged in a distance summer program delivered over multiple virtual mediums that used gaming principles to teach and practice self-regulated learning (SRL) through the process of exploring possible jobs. Gaming quests were developed to explore jobs and teach transition skills. Students completed specially designed quests that taught and reinforced SRL and problem-solving through individual, group, and teacher-led experiences. The SRL skills learned were reinforced through guided job explorations in the context of MinecraftEDU, Zoom sessions with experts in each career, and collaborations with a team over Marco Polo and Zoom. The quests were developed and laid out on an accessible web page, with active learning opportunities and feedback conducted within multiple virtual mediums including MinecraftEDU. Gaming mediums actively engage players in role-playing, problem-solving, critical thinking, and collaboration. Gaming has been used as a medium for education since the inception of formal education. Games, and specifically board games, are pre-historic, meaning we had board games before we had written language. Today, games are widely used in education, often as a reinforcer for behavior or as a reward for work completion. Games are not often used as a direct method of instruction and assessment; however, the inclusion of games as an assessment tool and as a form of instruction increases student engagement and participation. Games naturally include collaboration, problem-solving, and communication. Therefore, our summer program was developed using gaming principles and MinecraftEDU. This manuscript describes a virtual learning summer program called Virtual Academy New and Exciting Transitions (VAN) that was redesigned from a face-to-face setting to a completely online setting with a focus on SWD aged 14-21. The focus of VAN was to address transition planning needs such as problem-solving skills, self-regulation, interviewing, job exploration, and communication for transition-aged youth diagnosed with various disabilities (e.g., learning disabilities, attention-deficit hyperactivity disorder, intellectual disability, Down syndrome, autism spectrum disorder).
Keywords: autism, disabilities, transition, summer program, gaming, simulations
Procedia PDF Downloads 75
1118 A Metaheuristic Approach for the Pollution-Routing Problem
Authors: P. Parthiban, Sonu Rajak, R. Dhanalakshmi
Abstract:
This paper presents an Ant Colony Optimization (ACO) approach combined with a Speed Optimization Algorithm (SOA) to solve the Vehicle Routing Problem (VRP) with environmental considerations, well known as the Pollution-Routing Problem (PRP). It consists of routing a number of vehicles to serve a set of customers and determining the fuel consumption, driver wages, and speed on each route segment, while respecting the capacity constraints and time windows. Since the VRP is an NP-hard problem, the PRP is also NP-hard, which calls for metaheuristics to solve this type of problem. The proposed solution method consists of two stages. Stage one solves a Vehicle Routing Problem with Time Windows (VRPTW) using ACO, and in the second stage, the SOA is run on the resulting VRPTW solution. Given a vehicle route, the SOA consists of finding the optimal speed on each arc of the route to minimize an objective function comprising fuel consumption costs and driver wages. The proposed algorithm was tested on benchmark problems; the preliminary results show that it can provide good solutions within reasonable computational time.
Keywords: ant colony optimization, CO2 emissions, speed optimization, vehicle routing
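A hedged illustration of the per-arc speed optimization stage described above: on a fixed route arc, choose the speed that minimizes fuel cost plus driver wages. The fuel-rate model and every constant in the sketch are assumptions for illustration, not the authors' parameter values or their SOA implementation.

```python
# Hedged sketch of the speed-optimization idea for one route arc: minimize a
# convex cost combining fuel consumption and driver wages over a bounded speed.
from scipy.optimize import minimize_scalar

def arc_cost(v, dist_km, fuel_price=1.4, wage_per_h=10.0):
    """Cost of traversing one arc at constant speed v (km/h); toy constants."""
    fuel_per_km = 5.0 / v + 3e-5 * v ** 2   # simple engine + drag fuel-rate model
    travel_time_h = dist_km / v
    return fuel_price * fuel_per_km * dist_km + wage_per_h * travel_time_h

res = minimize_scalar(arc_cost, bounds=(40, 100), method="bounded", args=(25.0,))
print(f"optimal speed ~ {res.x:.1f} km/h, arc cost ~ {res.fun:.2f}")
```

In a full PRP solver this one-dimensional optimization would be repeated for every arc of the route produced by the ACO stage, subject to the customers' time windows.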
Procedia PDF Downloads 359
1117 Temperature Rise Characteristics of Distinct Double-Sided Flat Permanent Magnet Linear Generator for Free Piston Engines for Hybrid Vehicles
Authors: Ismail Rahama Adam Hamid
Abstract:
This paper presents the development of a thermal model for a flat, double-sided linear generator designed for use in free-piston engines. The study examines the influence of temperature on the performance of the permanent magnet linear generator, an integral and pivotal component within the system. This research places particular emphasis on the Neodymium Iron Boron (NdFeB) permanent magnet, which serves as the source of the magnetic field for the linear generator. In this study, an internal combustion engine, which tends to produce heat, is connected to the generator, and the temperature rise from both the combustion process and the thermal contributions of current-carrying conductors and frictional forces is considered. Utilizing the Computational Fluid Dynamics (CFD) method, a thermal model of the NdFeB magnet within the linear generator is constructed and analyzed. Furthermore, the temperature field is examined to ensure that the linear generator operates under stable conditions without the risk of demagnetization.
Keywords: free piston engine, permanent magnet, linear generator, demagnetization, simulation
Procedia PDF Downloads 56
1116 Application of Neuroscience in Aligning Instructional Design to Student Learning Style
Authors: Jayati Bhattacharjee
Abstract:
Teaching is a very dynamic profession. Teaching science is as challenging as learning the subject, if not more so; teaching chemistry is a case in point. From the introductory concepts of subatomic particles to the atoms of elements and their symbols, and further to presenting the chemical equation and so forth, every step is a challenge on both sides of the teaching-learning equation. This paper combines the neuroscience of learning and memory with knowledge of learning styles (VAK) and presents an effective tool for the teacher to authenticate learning. The model of 'working memory' (the visuo-spatial sketchpad, the central executive, and the phonological loop that transforms short-term memory into long-term memory) actually supports the psychological theory of learning styles, i.e., Visual-Auditory-Kinesthetic. A closer examination of David Kolb's learning model suggests that learning requires abilities that are polar opposites, and that the learner must continually choose which set of learning abilities he or she will use in a specific learning situation. In grasping experience, some of us perceive new information through experiencing the concrete, tangible, felt qualities of the world, relying on our senses and immersing ourselves in concrete reality. Others tend to perceive, grasp, or take hold of new information through symbolic representation or abstract conceptualization, thinking about, analyzing, or systematically planning, rather than using sensation as a guide. Similarly, in transforming or processing experience, some of us tend to carefully watch others who are involved in the experience and reflect on what happens, while others choose to jump right in and start doing things. The watchers favor reflective observation, while the doers favor active experimentation. Any lesson plan can be based on the model of prescriptive design: C + O = M (C: instructional condition; O: instructional outcome; M: instructional method). The desired outcome and conditions are independent variables, whereas the instructional method is dependent and hence can be planned and suited to maximize the learning outcome. Assessment for learning, rather than of learning, can encourage and build confidence and hope amongst the learners and go a long way towards replacing the anxiety and hopelessness that a student experiences while learning science, with a human touch in it. Application of this model has been tried in teaching chemistry to high school students as well as in workshops with teachers, and the response received has demonstrated the desired results.
Keywords: working memory model, learning style, prescriptive design, assessment for learning
Procedia PDF Downloads 351
1115 Study of Fire Propagation and Soot Flow in a Pantry Car of Railway Locomotive
Authors: Juhi Kaushik, Abhishek Agarwal, Manoj Sarda, Vatsal Sanjay, Arup Kumar Das
Abstract:
Fire accidents in trains bring huge losses of human life and property. Evacuation becomes a major challenge in such incidents owing to confined spaces, high passenger density, and trains moving at high speeds. The pantry car in Indian Railways trains carries inflammable materials like cooking fuel and LPG as well as electrical fittings, and is therefore highly susceptible to fire accidents. Numerical simulations have been performed in a pantry car of an Indian locomotive train using computational fluid dynamics based software. Different fire outbreak scenarios have been explored by varying the Heat Release Rate per Unit Area (HRRPUA) of the fire source, introducing an exhaust in the cooking area, and considering the case of an air-conditioned pantry car. The temporal evolution of flame and soot has been obtained for each scenario, and the differences have been studied and reported. Inputs from this study can be used to assess casualties in fire accidents in locomotive trains and in the development of smoke control/detection systems in Indian trains.
Keywords: fire propagation, flame contour, pantry fire, soot flow
Procedia PDF Downloads 339
1114 Investigation of Different Conditions to Detect Cycles in Linearly Implicit Quantized State Systems
Authors: Elmongi Elbellili, Ben Lauwens, Daan Huybrechs
Abstract:
The increasing complexity of modern engineering systems presents a challenge to the digital simulation of these systems, which usually can be represented by differential equations. The Linearly Implicit Quantized State System (LIQSS) offers an alternative approach to traditional numerical integration techniques for solving Ordinary Differential Equations (ODEs). This method has proved effective for handling discontinuous and large stiff systems. However, the inherent discrete nature of LIQSS may introduce oscillations that result in unnecessary computational steps. The current oscillation detection mechanism relies on a condition that checks the significance of the derivatives, but it could be further improved. This paper describes a different cycle detection mechanism and presents the outcomes of using LIQSS of order one in simulating the advection-diffusion problem. The efficiency of this new cycle detection mechanism is verified by comparing the performance of the current solver against the new version, as well as against a reference solution using a Runge-Kutta method of order 14.
Keywords: numerical integration, quantized state systems, ordinary differential equations, stiffness, cycle detection, simulation
Procedia PDF Downloads 60
1113 On Radially Symmetric Vibrations of Bi-Directional Functionally Graded Circular Plates on the Basis of Mindlin’s Theory and Neutral Axis
Authors: Rahul Saini, Roshan Lal
Abstract:
The present paper deals with the free axisymmetric vibrations of bi-directional functionally graded circular plates using Mindlin’s plate theory and the physical neutral surface. The temperature-dependent as well as temperature-independent mechanical properties of the plate material vary in the radial and transverse directions. Also, the temperature profile for one- and two-dimensional temperature variations has been obtained from the heat conduction equation. A simple computational formulation of the governing differential equations of motion for such a plate model has been derived using Hamilton's principle for plates clamped and simply supported at the periphery. Employing the generalized differential quadrature method, the corresponding frequency equations have been obtained and solved numerically, retaining their lowest three roots as the natural frequencies of the first three modes. The effects of various other parameters, such as the temperature profile, functionally graded indices, and boundary conditions, on the vibration characteristics have been presented. In order to validate the accuracy and efficiency of the method, the results have been compared with those available in the literature.
Keywords: bi-directionally FG, GDQM, Mindlin’s circular plate, neutral axis, vibrations
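As background on the generalized differential quadrature method mentioned above, the sketch below computes the standard first-derivative weighting coefficients on an arbitrary grid (the usual Lagrange-polynomial formulation) and checks them on a simple test function. It is a generic, minimal illustration under that assumption, not the authors' formulation of the plate problem.

```python
# Hedged sketch of generalized differential quadrature (GDQ) weighting coefficients
# for the first derivative on an arbitrary 1-D grid; generic illustration only.
import numpy as np

def gdq_first_derivative_weights(x):
    """Weighting matrix A such that A @ f approximates f'(x) at the grid points."""
    n = len(x)
    # M'(x_i) = prod_{k != i} (x_i - x_k), from the Lagrange-polynomial formulation
    Mp = np.array([np.prod(np.delete(x[i] - x, i)) for i in range(n)])
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                A[i, j] = Mp[i] / ((x[i] - x[j]) * Mp[j])
        A[i, i] = -A[i].sum()   # negative sum of the off-diagonal row entries
    return A

# Chebyshev-Gauss-Lobatto points on [0, 1], a common GDQ grid choice
n = 11
x = 0.5 * (1.0 - np.cos(np.pi * np.arange(n) / (n - 1)))
A = gdq_first_derivative_weights(x)
err = np.max(np.abs(A @ x**3 - 3.0 * x**2))   # exact for polynomials of degree < n
print(f"max error on f(x) = x^3: {err:.2e}")
```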
Procedia PDF Downloads 130
1112 The Relations Between Hans Kelsen’s Concept of Law and the Theory of Democracy
Authors: Monika Zalewska
Abstract:
Hans Kelsen was a versatile legal thinker whose achievements in the fields of legal theory, international law, and the theory of democracy are remarkable. All of the fields tackled by Kelsen are regarded as part of his “pure theory of law.” While the link between international law and Kelsen’s pure theory of law is apparent, the same cannot be said about the link between the theory of democracy and his pure theory of law. On the contrary, the general thinking concerning Kelsen’s thought is that it can be used to legitimize authoritarian regimes. The aim of this presentation is to address this concern by identifying the common ground between Kelsen’s pure theory of law and his theory of democracy and to show that they are compatible in a way that his pure theory of law and authoritarianism cannot be. The conceptual analysis of the purity of Kelsen’s theory and his goal of creating ideology-free legal science hints at how Kelsen’s pure theory of law and the theory of democracy are brought together. The presentation will first demonstrate that these two conceptions have common underlying values and meta-ethical convictions. Both are founded on relativism and a rational worldview, and the aim of both is peaceful co-existence. Second, it will be demonstrated that the separation of law and morality provides the maximum space for deliberation within democratic processes. The conclusion of this analysis is that striking similarities exist between Kelsen’s legal theory and his theory of democracy. These similarities are grounded in the Enlightenment tradition and its values, including rationality, a scientific worldview, tolerance, and equality. This observation supports the claim that, for Kelsen, legal positivism and the theory of democracy are not two separate theories but rather stem from the same set of values and from Kelsen’s relativistic worldview. Furthermore, three main issues determine Kelsen’s orientation toward a positivistic and democratic outlook. The first, which is associated with personality type, is the distinction between absolutism and relativism. The second, which is associated with the values that Kelsen favors in the social order, is peace. The third is legality, which creates the necessary condition for democracy to thrive and reveals that democracy is capable of fulfilling Kelsen’s ideal of law at its fullest. The analysis of the text concerning natural law doctrine and democracy indicates that behind the technical language of Kelsen’s pure theory of law is a strong concern with the trends that appeared after World War I. Despite his rigorous scientific mind, Kelsen was deeply humanistic. He tried to create a powerful intellectual weapon to provide strong arguments for peaceful coexistence and a rational outlook in Europe. The analysis provided by this presentation facilitates a broad theoretical, philosophical, and political understanding of Kelsen’s perspectives and, consequently, urges a strong endorsement of Kelsen’s approach to constitutional democracy.
Keywords: Hans Kelsen, democracy, legal positivism, pure theory of law
Procedia PDF Downloads 109