Search results for: MODELICA simulation language
5085 Acculturation of Iranian Students in Europe
Authors: Shirin Sadat Ahmadi
Abstract:
The number of people, particularly university students, migrating from Iran and applying to American and European universities has been rising in recent years. Different people may have various reasons and goals for migration, but one of the common issues among all these people is the cultural challenges they experience when living in the adopted society. Immigrants usually confront obstacles during the intercultural transition and adaptation process. Different variables such as age, religion, gender, education, knowledge of the spoken language of the destination country, financial condition, interactions with natives, and use of social media can affect the cultural challenges people face after migration and how they overcome issues arising from intercultural differences and conflicts. In this research, we interviewed, via video call, a sample consisting of 15 Iranian students living and studying abroad, recruited using the snowball sampling technique, to identify what cultural challenges they have experienced in the new society, how the variables mentioned above eased these challenges or made them harder, and what approaches and solutions they adopted to adjust themselves to the new society and its cultural dimensions. Based on John Berry's acculturation theory of migrant-host relationships, we classified these 15 people into four different categories: Assimilation, Separation, Marginalization, and Integration. In addition, we considered Y.Y. Kim's communication-based theory of cross-cultural adaptation to explain how communication helped migrant populations in the adaptation process. Based on the findings of this study, 12 of the 15 interviewed members used the integration strategy to adapt to the new cultural environment, 3 of them used the assimilation strategy, and none of them used the marginalization or separation strategies. Communicating with natives, knowing the language, and education were the factors that helped all the interviewed members of the sample to overcome the difficulties of intercultural transition.
Keywords: acculturation, culture, intercultural transition, migration
Procedia PDF Downloads 65
5084 Energy Storage in the Future of Ethiopia Renewable Electricity Grid System
Authors: Dawit Abay Tesfamariam
Abstract:
Ethiopia's Climate-Resilient Green Economy strategy focuses mainly on the generation and utilization of Renewable Energy (RE). The data collected in 2016 by Ethiopian Electric Power (EEP) indicate that the intermittent RE sources on the grid, from solar and wind energy, were only 8% of the total energy produced. On the other hand, the EEP electricity generation plan for 2030 indicates that 36% of the energy generation share will be covered by solar and wind sources. Thus, a case study was initiated to model and compute the balance and consumption of electricity in three different scenarios: 2016, 2025, and 2030, using the EnergyPLAN model (EPM). Initially, the model was validated using the 2016 annual power generation data in order to conduct the EPM analysis for the two predictive scenarios. The EPM simulation analysis for 2016 showed that there was no significant excess power generated; hence, the model's results are in line with the actual 2016 output. The EPM was then applied to analyze the role of energy storage for RE in the Ethiopian grid system. The results of the EPM simulation analysis showed that there will be excess production of 402 MW on average, and 7,963 MW at maximum, in 2025. The excess power was dominant in all months except the three rainy months of the year (June, July, and August). Consequently, the validated EPM outcomes give good reason to consider alternatives for the utilization and storage of the excess RE. Thus, from the scenarios and model results obtained, it is realistic to infer that if the excess power is utilized with a storage mechanism that can stabilize the grid system, the extra RE generated can be exported to support the economy. Therefore, researchers must continue to upgrade current and upcoming energy storage systems to keep pace with the RE potential that can be generated.
Keywords: renewable energy, storage, wind, EnergyPLAN
Procedia PDF Downloads 83
5083 Comparative Study of Free Vibrational Analysis and Mode Shapes of FSAE Car Frame Using Different FEM Modules
Authors: Rajat Jain, Himanshu Pandey, Somesh Mehta, Pravin P. Patil
Abstract:
Formula SAE cars are student-designed and fabricated formula prototype cars, built according to SAE INTERNATIONAL design rules, which compete in various national and international events. This paper presents an FEM-based comparative study of the free vibration analysis and mode shapes of a formula prototype car chassis frame. Tubing sections of different diameters, as per the design rules, are designed in such a manner that the desired strength can be achieved. The natural frequencies of the first five modes were determined using the finite element analysis method. SOLIDWORKS is used for designing the frame structure, and SOLIDWORKS SIMULATION and ANSYS WORKBENCH 16.2 are used for the modal analysis. The mode shape results of ANSYS and SOLIDWORKS were compared. Fixed-fixed boundary conditions are used for fixing the A-arm wishbones. The simulation results were compared for the validation of the study. The first five modes were compared, and the results were found to be within permissible limits. The AISI 4130 (chromoly, chromium-molybdenum steel) material is used, and the chassis frame is discretized with a fine-quality QUAD mesh followed by fixed-fixed boundary conditions. The natural frequencies of the chassis frame range from 53.92 to 125.5 Hz as per the ANSYS results, which is within permissible limits. The study concludes with a lightweight and compact chassis frame that does not compromise strength. This design allows the fabrication of a compact, dynamically stable, simple, and lightweight tubular chassis frame with high strength and extremely safe driver ergonomics.
Keywords: FEM, modal analysis, formula SAE cars, chassis frame, ANSYS
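For context, the free-vibration problem solved by such FEM modules reduces to the generalized eigenvalue problem K·φ = ω²·M·φ. The short Python sketch below illustrates that step on a toy lumped 3-DOF system; the stiffness and mass values are arbitrary placeholders, not data from the FSAE frame.

```python
import numpy as np
from scipy.linalg import eigh

# Toy 3-DOF spring-mass chain standing in for an assembled FE model.
# In practice K and M come from the FE discretization of the frame.
k = 1.0e6   # spring stiffness, N/m (placeholder)
m = 10.0    # lumped mass, kg (placeholder)

K = k * np.array([[ 2, -1,  0],
                  [-1,  2, -1],
                  [ 0, -1,  1]], dtype=float)
M = m * np.eye(3)

# Solve K * phi = omega^2 * M * phi (generalized symmetric eigenproblem).
eigvals, eigvecs = eigh(K, M)

freqs_hz = np.sqrt(eigvals) / (2.0 * np.pi)   # natural frequencies, Hz
for i, f in enumerate(freqs_hz, start=1):
    print(f"Mode {i}: {f:8.2f} Hz, shape = {np.round(eigvecs[:, i - 1], 3)}")
```

The same eigenproblem is what ANSYS and SOLIDWORKS SIMULATION solve internally; comparing the first few frequencies and mode shapes between solvers is the cross-validation described in the abstract.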
Procedia PDF Downloads 350
5082 Modeling and Simulating Productivity Loss Due to Project Changes
Authors: Robert Pellerin, Michel Gamache, Remi Trudeau, Nathalie Perrier
Abstract:
The context of large engineering projects is particularly favorable to the appearance of engineering changes and contractual modifications. These elements are potential causes for claims. In this paper, we investigate one of the critical components of the claim management process: the calculation of the impacts of changes in terms of losses of productivity due to the need to accelerate some project activities. When project changes are initiated, delays can arise. Indeed, project activities are often executed in fast-tracking in an attempt to respect the completion date. But the acceleration of project execution and the resulting rework can entail important costs as well as induce productivity losses. In the past, numerous methods have been proposed to quantify the duration of delays, the gains achieved by project acceleration, and the loss of productivity. The calculations related to those changes can be divided into two categories: direct costs and indirect costs. Direct costs are easily quantifiable, as opposed to indirect costs, which are rarely taken into account during the calculation of the cost of an engineering change or contract modification, even though several research projects have addressed this subject. However, the proposed models have not yet been accepted by companies, nor have they been accepted in court. Those models require extensive data and are often seen as too specific to be used for all projects. These techniques also ignore resource constraints and the interdependencies between the causes of delays and the delays themselves. To resolve this issue, this research proposes a simulation model that mimics how major engineering changes or contract modifications are handled in large construction projects. The model replicates the use of overtime in a reactive scheduling mode in order to simulate the loss of productivity present when a project change occurs. Multiple tests were conducted to compare the results of the proposed simulation model with statistical analyses conducted by other researchers. Different scenarios were also run in order to determine the impact of the number of activities, the time of occurrence of the change, the availability of resources, and the type of project change on productivity loss. Our results demonstrate that the number of activities in the project is a critical variable influencing the productivity of a project. When changes occur, the presence of a large number of activities leads to a much lower productivity loss than a small number of activities. The speed of productivity reduction for 30-job projects is about 25 percent faster than the reduction speed for 120-job projects. The moment of occurrence of a change also shows a significant impact on productivity. Indeed, the sooner the change occurs, the lower the productivity of the labor force. The availability of resources also impacts the productivity of a project when a change is implemented. There is a higher loss of productivity when the amount of resources is restricted.
Keywords: engineering changes, indirect costs, overtime, productivity, scheduling, simulation
Procedia PDF Downloads 241
5081 Simulated Mechanical Analysis on Hydroxyapatite Coated Porous Polylactic Acid Scaffold for Bone Grafting
Authors: Ala Abobakr Abdulhafidh Al-Dubai
Abstract:
Bone loss has risen due to fractures, surgeries, and traumatic injuries. Scientists and engineers have worked over the years to find solutions to heal and accelerate bone regeneration. The bone grafting technique has been utilized, which promises significant improvement in the bone regeneration area. An extensive study is essential on the relation between the mechanical properties of bone scaffolds and the pore size of the scaffolds, as well as the relation between the mechanical properties of bone scaffolds and the development of a bioactive coating on the scaffolds. To reduce cost and time, mechanical simulation analysis is beneficial for simulating both relations. Therefore, this study highlights simulated mechanical analyses of three-dimensional (3D) polylactic acid (PLA) scaffolds at two different pore sizes (P: 400 and 600 μm) and two different internal distances (D: 600 and 900 μm), with and without the presence of a hydroxyapatite (HA) coating. The 3D scaffold models were designed using SOLIDWORKS software. The respective material properties were assigned, with boundary conditions fixed on the meshed 3D models. Two different loads were applied to the PLA scaffolds, side loads of 200 N and vertical loads of 2 kN, while only vertical loads of 2 kN were applied to the HA-coated PLA scaffolds. The PLA scaffold P600D900, which has the largest pore size and maximum internal distance, generated the minimum stress under the applied vertical load. However, that same scaffold became weaker under the applied side load due to the high construction gap between the pores. The development of the HA coating on top of the PLA scaffolds induced greater stress generation compared to the non-coated scaffolds, which is tailorable for bone implantation. This study concludes that the pore size and the construction of the HA coating on bone scaffolds affect the mechanical strength of the bone scaffolds.
Keywords: hydroxyapatite coating, bone scaffold, mechanical simulation, three-dimensional (3D), polylactic acid (PLA)
Procedia PDF Downloads 63
5080 Towards Designing of a Potential New HIV-1 Protease Inhibitor Using Quantitative Structure-Activity Relationship Study in Combination with Molecular Docking and Molecular Dynamics Simulations
Authors: Mouna Baassi, Mohamed Moussaoui, Hatim Soufi, Sanchaita RajkhowaI, Ashwani Sharma, Subrata Sinha, Said Belaaouad
Abstract:
Human Immunodeficiency Virus type 1 protease (HIV-1 PR) is one of the most challenging targets of the antiretroviral therapy used in the treatment of AIDS-infected people. The performance of protease inhibitors (PIs) is limited by the development of protease mutations that can promote resistance to the treatment. The current study was carried out using statistical and bioinformatics tools. A series of thirty-three compounds with known enzymatic inhibitory activities against HIV-1 protease was used in this paper to build a mathematical model relating the structure to the biological activity. These compounds were designed using software; their descriptors were computed using various tools, such as Gaussian, Chem3D, ChemSketch, and MarvinSketch. Computational methods generated the best model based on its statistical parameters. The model's applicability domain (AD) was elaborated. Furthermore, one compound has been proposed as efficient against HIV-1 protease, with biological activity comparable to the existing ones; this drug candidate was evaluated using ADMET properties and Lipinski's rule. Molecular docking performed on wild-type and mutant HIV-1 proteases allowed the investigation of the interaction types displayed between the proteases and the ligands, Darunavir (DRV) and the new drug (ND). Molecular dynamics simulation was also used in order to investigate the stability of the complexes, allowing a comparative study of the performance of both ligands (DRV and ND). Our study suggests that the new molecule showed results comparable to those of Darunavir and may be used for further experimental studies. Our study may also be used as a pipeline to search for and design new potential inhibitors of HIV-1 protease.
Keywords: QSAR, ADMET properties, molecular docking, molecular dynamics simulation
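As an aside, the Lipinski rule-of-five screen mentioned above can be reproduced with open-source cheminformatics tools. The minimal sketch below uses RDKit on a placeholder molecule (aspirin, chosen only because its SMILES is readily available, not one of the paper's candidate compounds):

```python
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

# Placeholder molecule (aspirin); the paper's candidate structures are not given here.
mol = Chem.MolFromSmiles("CC(=O)OC1=CC=CC=C1C(=O)O")

mw   = Descriptors.MolWt(mol)        # molecular weight
logp = Descriptors.MolLogP(mol)      # estimated octanol-water partition coefficient
hbd  = Lipinski.NumHDonors(mol)      # hydrogen-bond donors
hba  = Lipinski.NumHAcceptors(mol)   # hydrogen-bond acceptors

violations = sum([mw > 500, logp > 5, hbd > 5, hba > 10])
print(f"MW={mw:.1f}, logP={logp:.2f}, HBD={hbd}, HBA={hba}, "
      f"rule-of-five violations={violations}")
```

A candidate passing with zero or one violation is conventionally considered drug-like, which is the sense in which Lipinski's rule is applied to the proposed inhibitor.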
Procedia PDF Downloads 43
5079 The Diglossia and the Bilingualism: Concept, Problems, and Solutions
Authors: Abdou Mahmoud Abdou Hussein
Abstract:
In this paper, we attempt to shed light on the difference between the two concepts of diglossia and bilingualism. Thus, we will present the definitions of these two concepts from various perspectives. On the other hand, we will emphasize and highlight diglossia in the Arabic language historically. Furthermore, we will illustrate the factors behind diglossia, the impact of diglossia on learners of Arabic (native and non-native speakers), and finally the suggested solutions for this issue.
Keywords: Arabic linguistics, diglossia, bilingualism, native and non-native speakers
Procedia PDF Downloads 412
5078 Reaching Students Who “Don’t Like Writing” through Scenario Based Learning
Authors: Shahira Mahmoud Yacout
Abstract:
Writing is an essential skill in many vocational and academic environments, and notably in workplaces, yet many students perceive writing as something tiring and boring, or maybe a “waste of time”. Studies in the field of foreign languages relate this to the lack of connection between what is learned at university and what students come to encounter in real-life situations. Arabic learners felt they needed more language exposure to the context of their future professions. With this idea in mind, scenario-based learning (SBL) is reported to be an educational approach that motivates, engages, and stimulates students' interest and achieves the desired writing learning outcomes. In addition, researchers have suggested SBL as an instructional approach that develops and enhances students' skills through higher-order thinking and active learning. It is a subset of problem-based learning and case-based learning. The approach focuses on authentic rhetorical framing reflecting writing tasks in real-life situations. It works successfully when used to simulate real-world practices, providing context that reflects the types of situations professionals respond to in writing. It has been claimed that using realistic scenarios customized to the course's learning objectives bridges the gap for students between theory and application. Within this context, it is thought that scenario-based learning is an important approach for enhancing learners' writing skills and for reflecting meaningful learning within authentic contexts. As an Arabic foreign language instructor, it was noticed that students find difficulties in adapting writing styles to authentic writing contexts and in addressing different audiences and purposes. This idea is supported by studies which claimed that AFL students faced difficulties in transferring writing skills to situations outside of the classroom context. In addition, it was observed that some of the Arabic textbooks for teaching Arabic as a foreign language lacked topics that initiated higher-order thinking skills and stimulated the learners to understand the setting and create messages appropriate to different audiences, contexts, and purposes. The goals of this study are to 1) provide a rationale for using the scenario-based learning approach to improve AFL learners' writing skills, 2) demonstrate how to design and implement a scenario-based learning technique aligned with the writing course objectives, 3) demonstrate samples of the scenario-based approach implemented in an AFL writing class, and 4) emphasize the role of peer review, along with the instructor's feedback, in the process of developing the writing skill. Finally, this presentation highlights and emphasizes the importance of using the scenario-based learning approach in writing as a means to mirror students' real-life situations and engage them in planning, monitoring, and problem solving. This approach helped make writing an enjoyable experience and clearly useful to students' future professional careers.
Keywords: meaningful learning, real life contexts, scenario based learning, writing skill
Procedia PDF Downloads 101
5077 AI-Assisted Business Chinese Writing: Comparing the Textual Performances Between Independent Writing and Collaborative Writing
Authors: Stephanie Liu Lu
Abstract:
With the proliferation of artificial intelligence tools in the field of education, it is crucial to explore their impact on language learning outcomes. This paper examines the use of AI tools, such as ChatGPT, in practical writing within business Chinese teaching to investigate how AI can enhance practical writing skills and teaching effectiveness. The study involved third and fourth-year university students majoring in accounting and finance from a university in Hong Kong within the context of a business correspondence writing class. Students were randomly assigned to a control group, who completed business letter writing independently, and an experimental group, who completed the writing with the assistance of AI. In the latter, the AI-assisted business letters were initially drafted by the students issuing commands and interacting with the AI tool, followed by the students' revisions of the draft. The paper assesses the performance of both groups in terms of grammatical expression, communicative effect, and situational awareness. Additionally, the study collected dialogue texts from interactions between students and the AI tool to explore factors that affect text generation and the potential impact of AI on enhancing students' communicative and identity awareness. By collecting and comparing textual performances, it was found that students assisted by AI showed better situational awareness, as well as more skilled organization and grammar. However, the research also revealed that AI-generated articles frequently lacked a proper balance of identity and writing purpose due to limitations in students' communicative awareness and expression during the instruction and interaction process. Furthermore, the revision of drafts also tested the students' linguistic foundation, logical thinking abilities, and practical workplace experience. Therefore, integrating AI tools and related teaching into the curriculum is key to the future of business Chinese teaching.
Keywords: AI-assistance, business Chinese, textual analysis, language education
Procedia PDF Downloads 60
5076 Plasma Ion Implantation Study: A Comparison between Tungsten and Tantalum as Plasma Facing Components
Authors: Tahreem Yousaf, Michael P. Bradley, Jerzy A. Szpunar
Abstract:
Currently, nuclear fusion is considered one of the most favorable options for future energy generation, due both to its abundant fuel and to its lack of emissions. For fusion power reactors, a major problem will be a suitable material choice for the Plasma Facing Components (PFCs) which will constitute the reactor first wall. Tungsten (W) has advantages as a PFC material because of its high melting point, low vapour pressure, high thermal conductivity, and low retention of hydrogen isotopes. However, several adverse effects, such as embrittlement, melting, and morphological evolution, have been observed in W when it is bombarded by low-energy, high-fluence helium (He) and deuterium (D) ions under conditions simulating those adjacent to a fusion plasma. Recently, tantalum (Ta) has also been investigated as a PFC and shows better resistance to nanostructure fuzz formation than W under simulated fusion plasma conditions, but retention of D ions has been found to be higher in Ta than in W. Preparatory to plasma-based ion implantation studies, the effect of D and He ion impact on W and Ta is predicted by using the Stopping and Range of Ions in Matter (SRIM) code. SRIM provided theoretical results regarding projected range, ion concentration (at. %), and displacement damage (dpa) in W and Ta. The projected ranges in W under irradiation by He and D ions with an energy of 3 keV and a 1× fluence are determined to be 75 Å and 135 Å, and in Ta 85 Å and 155 Å, respectively. For both W and Ta samples, the maximum implanted peak for He is predicted at ~5.3 at. % at 12 nm, and the D ion concentration peak is located near 3.1 at. % at 25 nm. For the same parameters, the displacement damage for He ions is observed in W at ~0.65 dpa and in Ta at ~0.35 dpa at 5 nm. For D ions, the displacement damage is ~0.20 dpa at 8 nm for W and ~0.175 dpa at 7 nm for Ta. The mean implantation depth is the same for W and Ta, i.e., ~40 nm for He ions and ~70 nm for D ions. From these results, we conclude that the retention of D is higher than that of He ions, but the damage is lower in Ta than in W. Further investigation is still in progress regarding W and Ta.
Keywords: helium and deuterium ion impact, plasma facing components, SRIM simulation, tungsten, tantalum
Procedia PDF Downloads 133
5075 Modeling and Characterization of Organic LED
Authors: Bouanati Sidi Mohammed, N. E. Chabane Sari, Mostefa Kara Selma
Abstract:
It is well known that organic light-emitting diodes (OLEDs) are attracting great interest in the display technology industry due to their many advantages, such as low manufacturing cost, large-area electroluminescent displays, and various emission colors, including white light. Recently, there has been much progress in understanding the device physics of OLEDs and their basic operating principles. In OLEDs, light emission is the result of the recombination of electrons and holes in the light-emitting layer, which are injected from the cathode and anode. To improve luminescence efficiency, hole and electron pairs need to exist abundantly and in equal numbers and to recombine swiftly in the emitting layer. The aim of this paper is to model polymer LEDs and OLEDs made with small molecules in order to study their electrical and optical characteristics. The first simulated structure used in this paper is a monolayer device, typically consisting of the poly(2-methoxy-5-(2'-ethyl)hexoxy-phenylenevinylene) (MEH-PPV) polymer sandwiched between an anode, usually an indium tin oxide (ITO) substrate, and a cathode, such as Al. In the second structure, we replace MEH-PPV with tris(8-hydroxyquinolinato)aluminum (Alq3). We choose MEH-PPV because of its solubility in common organic solvents, in conjunction with a low operating voltage for light emission and relatively high conversion efficiency, and Alq3 because it is one of the most important host materials used in OLEDs. In this simulation, the Poole-Frenkel-like mobility model and the Langevin bimolecular recombination model have been used as the transport and recombination mechanisms. These models are enabled in the ATLAS-SILVACO software. The influence of doping and thickness on the I-V characteristics and luminescence is reported.
Keywords: organic light emitting diode, polymer light emitting diode, organic materials, hexoxy-phenylenevinylene
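For reference, the two models named above are commonly written in the following textbook forms; the symbols below follow the usual conventions and are not taken from the paper itself. The Poole-Frenkel-like field-dependent mobility and the Langevin bimolecular recombination rate read

```latex
\mu(E) = \mu_0 \exp\!\left(\frac{\sqrt{E}}{E_0}\right), \qquad
R_{\mathrm{L}} = \frac{q\,(\mu_n + \mu_p)}{\varepsilon_0 \varepsilon_r}\left(np - n_i^2\right),
```

where μ0 is the zero-field mobility, E0 a characteristic field, q the elementary charge, μn and μp the electron and hole mobilities, ε0εr the permittivity, and n, p, ni the electron, hole, and intrinsic carrier densities. The enhancement of mobility with field and the mobility-limited recombination rate are what couple the transport and emission behavior in the simulated devices.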
Procedia PDF Downloads 555
5074 Mathematical Model to Quantify the Phenomenon of Democracy
Authors: Mechlouch Ridha Fethi
Abstract:
This paper presents a recent mathematical model in political science concerning democracy. The model is represented by a logarithmic equation linking the Relative Index of Democracy (RID) to the Participation Ratio (PR). First, the meanings of the different parameters of the model are presented, and the variation curve of the RID as a function of PR, with its different critical areas, is discussed. Second, the model is applied to a virtual group, where we show that the model can be applied separately by gender. Third, it is observed that the model can be extended to different models of democracy and that it has so far been of little use in assessing the state of democracy for some international organizations such as the UN.
Keywords: democracy, mathematics, modeling, quantification
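The abstract does not give the equation itself, but a logarithmic relation of the kind described would typically take a form such as the following, where the coefficients a and b are illustrative placeholders rather than values from the paper:

```latex
\mathrm{RID} = a \,\ln(\mathrm{PR}) + b, \qquad 0 < \mathrm{PR} \le 1,
```

so that the index rises steeply at low participation ratios and flattens as PR approaches full participation, which is consistent with the "critical areas" of the variation curve discussed above.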
Procedia PDF Downloads 370
5073 The Direct Deconvolutional Model in the Large-Eddy Simulation of Turbulence
Authors: Ning Chang, Zelong Yuan, Yunpeng Wang, Jianchun Wang
Abstract:
The utilization of Large Eddy Simulation (LES) has been extensive in turbulence research. LES concentrates on resolving the significant grid-scale motions while representing smaller scales through subfilter-scale (SFS) models. The deconvolution model, among the available SFS models, has proven successful in LES of engineering and geophysical flows. Nevertheless, a thorough investigation of how sub-filter scale dynamics and filter anisotropy affect SFS modeling accuracy remains lacking. The outcomes of LES are significantly influenced by filter selection and grid anisotropy, factors that have not been adequately addressed in earlier studies. This study examines two crucial aspects of LES. Firstly, the accuracy of direct deconvolution models (DDM) is evaluated concerning sub-filter scale (SFS) dynamics across varying filter-to-grid ratios (FGR) in isotropic turbulence. Various invertible filters are employed, including Gaussian, Helmholtz I and II, Butterworth, Chebyshev I and II, Cauchy, Pao, and rapidly decaying filters. The importance of FGR becomes evident as it plays a critical role in controlling errors for precise SFS stress prediction. When FGR is set to 1, the DDM models struggle to faithfully reconstruct SFS stress due to inadequate resolution of SFS dynamics. Notably, prediction accuracy improves when FGR is set to 2, leading to accurate reconstruction of SFS stress, except for cases involving Helmholtz I and II filters. Remarkably high precision, nearly 100%, is achieved at an FGR of 4 for all DDM models. Furthermore, the study extends to filter anisotropy and its impact on SFS dynamics and LES accuracy. By utilizing the dynamic Smagorinsky model (DSM), dynamic mixed model (DMM), and direct deconvolution model (DDM) with anisotropic filters, aspect ratios (AR) ranging from 1 to 16 are examined in LES filters. The results emphasize the DDM's proficiency in accurately predicting SFS stresses under highly anisotropic filtering conditions. Notably high correlation coefficients exceeding 90% are observed in the a priori study for the DDM's reconstructed SFS stresses, surpassing those of the DSM and DMM models. However, these correlations tend to decrease as filter anisotropy increases. In the a posteriori analysis, the DDM model consistently outperforms the DSM and DMM models across various turbulence statistics, including velocity spectra, probability density functions related to vorticity, SFS energy flux, velocity increments, strain-rate tensors, and SFS stress. It is evident that as filter anisotropy intensifies, the results of DSM and DMM deteriorate, while the DDM consistently delivers satisfactory outcomes across all filter-anisotropy scenarios. These findings underscore the potential of the DDM framework as a valuable tool for advancing the development of sophisticated SFS models for LES in turbulence research.
Keywords: deconvolution model, large eddy simulation, subfilter scale modeling, turbulence
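To make the deconvolution idea concrete, the sketch below applies a Gaussian filter to a 1D signal and then approximately inverts it with a few van Cittert iterations. This is a generic illustration of reconstructing sub-filter content from an invertible low-pass filter, not the authors' specific DDM formulation, filters, or flow data:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 512)
u = np.sin(3 * x) + 0.5 * np.sin(17 * x) + 0.1 * rng.standard_normal(x.size)

sigma = 2.0                                         # filter width in grid points (illustrative)
u_bar = gaussian_filter1d(u, sigma, mode="wrap")    # "resolved" (filtered) field

# Van Cittert iterations: u_{k+1} = u_k + (u_bar - G * u_k), an approximate inverse of G.
u_rec = u_bar.copy()
for _ in range(10):
    u_rec = u_rec + (u_bar - gaussian_filter1d(u_rec, sigma, mode="wrap"))

err_filtered = np.linalg.norm(u - u_bar) / np.linalg.norm(u)
err_deconv = np.linalg.norm(u - u_rec) / np.linalg.norm(u)
print(f"relative error, filtered field:    {err_filtered:.3f}")
print(f"relative error, deconvolved field: {err_deconv:.3f}")
```

The reconstructed field can then be used to evaluate SFS stresses directly, which is the role the deconvolution step plays inside DDM-type closures.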
Procedia PDF Downloads 78
5072 Study of Structural Behavior and Proton Conductivity of Inorganic Gel Paste Electrolyte at Various Phosphorous to Silicon Ratio by Multiscale Modelling
Authors: P. Haldar, P. Ghosh, S. Ghoshdastidar, K. Kargupta
Abstract:
In polymer electrolyte membrane fuel cells (PEMFC), the membrane electrode assembly (MEA) consists of two platinum-coated carbon electrodes sandwiching a proton-conducting, phosphoric-acid-doped polymeric membrane. Due to low mechanical stability, flooding, and fuel crossover, the application of phosphoric acid in a polymeric membrane is very critical. Phosphorus- and silica-based 3D inorganic gels have gained attention in the field of supercapacitors, fuel cells, and metal hydride batteries due to their thermally stable, highly proton-conductive behavior. Also, since a large amount of water molecules and phosphoric acid can easily get trapped in the Si-O-Si network cavities, leaching out is prevented. In this study, we have performed molecular dynamics (MD) simulations and first-principles calculations to understand the structural, electronic, electrochemical, and morphological behavior of this inorganic gel at various P to Si ratios. We have used dipole-dipole interactions, H bonding, and van der Waals forces to study the main interactions between the molecules. A 'structure-property-performance' mapping is initiated to determine the optimum P to Si ratio for the best proton conductivity. We have performed the MD simulations at various temperatures to understand the temperature dependence of proton conductivity. The observed results will propose a model which fits well with experimental data and other literature values. We have also studied the mechanism behind proton conductivity. Finally, we have proposed a structure for the gel paste with the optimum P to Si ratio.
Keywords: first principle calculation, molecular dynamics simulation, phosphorous and silica based 3D inorganic gel, polymer electrolyte membrane fuel cells, proton conductivity
Procedia PDF Downloads 131
5071 An Event-Related Potential Investigation of Speech-in-Noise Recognition in Native and Nonnative Speakers of English
Authors: Zahra Fotovatnia, Jeffery A. Jones, Alexandra Gottardo
Abstract:
Speech communication often occurs in environments where noise conceals part of a message. Listeners should compensate for the lack of auditory information by picking up distinct acoustic cues and using semantic and sentential context to recreate the speaker's intended message. This situation seems to be more challenging in a nonnative than in a native language. On the other hand, early bilinguals are expected to show an advantage over late bilingual and monolingual speakers of a language due to their better executive functioning components. In this study, English monolingual speakers were compared with early and late nonnative speakers of English to understand speech-in-noise (SIN) processing and the underlying neurobiological features of this phenomenon. Auditory mismatch negativities (MMNs) were recorded using a double-oddball paradigm in response to a minimal pair that differed in its middle vowel (beat/bit) at Wilfrid Laurier University in Ontario, Canada. The results did not show any significant structural or electroneural differences across groups. However, vocabulary knowledge correlated positively with performance on tests that measured SIN processing in participants who learned English after age 6. Moreover, their performance on the test correlated negatively with the integral area amplitudes in the left superior temporal gyrus (STG). In addition, the STG was engaged before the inferior frontal gyrus (IFG) in noise-free and low-noise test conditions in all groups. We infer that the pre-attentive processing of words engages the temporal lobes earlier than the fronto-central areas and that vocabulary knowledge helps the nonnative perception of degraded speech.
Keywords: degraded speech perception, event-related brain potentials, mismatch negativities, brain regions
Procedia PDF Downloads 109
5070 Energy Performance Gaps in Residences: An Analysis of the Variables That Cause Energy Gaps and Their Impact
Authors: Amrutha Kishor
Abstract:
Today, with rising global warming and the depletion of resources, every industry is moving toward sustainability and energy efficiency. As part of this movement, it is nowadays obligatory for architects to play their part by creating energy predictions for their designs. But in a lot of cases, these predictions do not reflect the real quantities of energy that newly built buildings use when operating. These discrepancies can be described as 'energy performance gaps'. This study aims to determine the underlying reasons for these gaps. Seven houses designed by Allan Joyce Architects, UK, from 1998 until 2019 were considered for this study. The data from the residents' energy bills were cross-referenced with the predictions made with the software SefairaPro and from energy reports. Results indicated that the predictions did not match the actual energy usage. An account of how energy was used in these seven houses was made by means of personal interviews. The main factors considered in the study were occupancy patterns, heating systems and usage, lighting profile and usage, and appliances' profile and usage. The study found that the main reasons for the creation of energy gaps were the discrepancies between predicted and actual occupant usage and patterns of energy consumption. This study is particularly useful for energy-conscious architectural firms to fine-tune their approach to designing houses and analysing their energy performance. As the findings reveal that energy usage in homes varies based on the way residents use the space, it helps deduce the most efficient technological combinations. This information can be used to set guidelines for future policies and regulations related to energy consumption in homes. This study can also be used by the developers of simulation software to understand how architects use their product and drive improvements in its future versions.
Keywords: architectural simulation, energy efficient design, energy performance gaps, environmental design
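A common way to express such a gap quantitatively, given here as one reasonable convention rather than the metric necessarily used in this study, is the relative difference between metered and predicted annual energy use:

```latex
\mathrm{Gap}\,(\%) = \frac{E_{\mathrm{measured}} - E_{\mathrm{predicted}}}{E_{\mathrm{predicted}}} \times 100,
```

so a positive gap means the house consumes more energy in operation than the design-stage simulation predicted.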
Procedia PDF Downloads 120
5069 Collaboration During Planning and Reviewing in Writing: Effects on L2 Writing
Authors: Amal Sellami, Ahlem Ammar
Abstract:
Writing is acknowledged to be a cognitively demanding and complex task. Indeed, the writing process is composed of three iterative sub-processes, namely planning, translating (writing), and reviewing. Not only do second or foreign language learners need to write according to this process, but they also need to respect the norms and rules of language and writing in the text to be produced. Accordingly, researchers have suggested approaching writing as a collaborative task in order to alleviate its complexity. Consequently, collaboration has been implemented during the whole writing process or only during planning or reviewing. Researchers report that implementing collaboration during the whole process might be demanding in terms of time in comparison to individual writing tasks. Consequently, because of time constraints, teachers may avoid it. For this reason, it might be pedagogically more realistic to limit collaboration to one of the writing sub-processes (i.e., planning or reviewing). However, previous research implementing collaboration in planning or reviewing is limited and fails to explore the effects of these conditions on the written text. Consequently, the present study examines the effects of collaboration in planning and collaboration in reviewing on the written text. To reach this objective, quantitative as well as qualitative methods are deployed to examine the written texts holistically and in terms of fluency, complexity, and accuracy. Participants of the study include 4 pairs in each group (n=8). They participated in two experimental conditions, which are: (1) collaborative planning followed by individual writing and individual reviewing, and (2) individual planning followed by individual writing and collaborative reviewing. The comparative research findings indicate that while collaborative planning resulted in better overall text quality (specifically, better content and organization ratings), better fluency, better complexity, and fewer lexical errors, collaborative reviewing produced better accuracy and fewer syntactical and mechanical errors. The discussion of the findings suggests the need to conduct more comparative research in order to further explore the effects of collaboration in planning or in reviewing. Pedagogical implications of the current study include advising teachers to choose between implementing collaboration in planning or in reviewing depending on their students' needs and what they need to improve.
Keywords: collaboration, writing, collaborative planning, collaborative reviewing
Procedia PDF Downloads 101
5068 Krembo Wings Youth Movement for Children with and without Disabilities: An Inclusive Model from an Educational Perspective to a Professional Approach
Authors: Claudia Koby, Merav Boaz, Meirav Zaiger Kober
Abstract:
Krembo Wings is an all-inclusive youth movement which brings children and youth with any disability together with their able-bodied peers (counselors) for weekly fun and educational social activities. Krembo Wings utilizes a socio-educational framework to create and lead social change through members with and without disabilities. All the work that Krembo Wings engages in stems from its central goal of promoting inclusion and integration, using social and psychological theories to develop its unique model and approach. The key to Krembo Wings' approach in promoting inclusion is active participation – each member, with and without disabilities, is enabled to participate to their fullest capacity in the youth movement and its activities. In order for this to be achieved, all activities are adjustable and are modified to fit the abilities of each member. Additionally, youth counselors – most of whom are members without disabilities – go through extensive training in order to act as 'intermediaries' for their partner with disabilities, enabling and facilitating their partner's participation in a way that allows them to be as independent and active as possible. The relationship is one of friendship and not of caretaking. There is always a nurse on hand to tend to any caretaking needs. Two essential elements of Krembo Wings' model are the broadening of concepts – shifting and changing the understanding of certain concepts such as what it means to be 'independent' or 'able' – and the development of a unique language – creating a language which both reflects and shapes reality. These elements of Krembo Wings' model foster the development of the values of acceptance and appreciation of those who are 'different'. It instills in members and counselors a new way of perceiving the world, one in which inclusion and integration are achievable and natural. Krembo Wings is certain that implementation of this model will promote the participation and inclusion of individuals with disabilities in society while promoting diversity. This model can serve as a platform which can be replicated and adjusted to suit any environment.
Keywords: innovative model for inclusion, socio-educational movement, youth leadership, youth with and without disabilities
Procedia PDF Downloads 128
5067 Alternative Robust Estimators for the Shape Parameters of the Burr XII Distribution
Authors: Fatma Zehra Doğru, Olcay Arslan
Abstract:
In this paper, we propose alternative robust estimators for the shape parameters of the Burr XII distribution. We provide a small simulation study and a real data example to illustrate the performance of the proposed estimators over the ML and the LS estimators.
Keywords: Burr XII distribution, robust estimator, M-estimator, least squares
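As background, the shape parameters of a Burr XII sample are usually estimated by maximum likelihood, which is sensitive to outliers; the sketch below illustrates that sensitivity with SciPy on synthetic data. It does not reproduce the robust estimators proposed in the paper:

```python
import numpy as np
from scipy.stats import burr12

rng = np.random.default_rng(42)
c_true, d_true = 2.0, 3.0                                    # true shape parameters
clean = burr12.rvs(c_true, d_true, size=500, random_state=rng)
contaminated = np.concatenate([clean, np.full(10, 50.0)])    # add a few gross outliers

# Maximum-likelihood fits of the two shape parameters (location and scale held fixed).
c_ml, d_ml, _, _ = burr12.fit(clean, floc=0, fscale=1)
c_cont, d_cont, _, _ = burr12.fit(contaminated, floc=0, fscale=1)

print(f"true shapes:           c={c_true:.2f}, d={d_true:.2f}")
print(f"ML fit, clean data:    c={c_ml:.2f}, d={d_ml:.2f}")
print(f"ML fit, with outliers: c={c_cont:.2f}, d={d_cont:.2f}")
```

The drift of the contaminated fit away from the true values is the kind of behavior robust M-type alternatives are designed to resist.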
Procedia PDF Downloads 431
5066 Medical Workforce Knowledge of Adrenaline (Epinephrine) Administration in Anaphylaxis in Adults Considerably Improved with Training in a UK Hospital from 2010 to 2017
Authors: Jan C. Droste, Justine Burns, Nithin Narayan
Abstract:
Introduction: Life-threatening detrimental effects of inappropriate adrenaline (epinephrine) administration, e.g., by giving the wrong dose, in the context of anaphylaxis management are well documented in the medical literature. Half of the fatal anaphylactic reactions in the UK are iatrogenic, and the median time to cardio-respiratory arrest can be as short as 5 minutes. It is therefore imperative that hospital doctors of all grades have active and accurate knowledge of the correct route, site, and dosage of administration of adrenaline. Given this time constraint and the potentially fatal outcome of inappropriate management of anaphylaxis, it is alarming that surveys over the last 15 years have repeatedly shown only a minority of doctors to have accurate knowledge of adrenaline administration as recommended by the UK Resuscitation Council guidelines (2008, updated 2012). This comparison of survey results of the medical workforce over several years in a small NHS District General Hospital was conducted in order to establish the effect of the employment of multiple educational methods regarding adrenaline administration in anaphylaxis in adults. Methods: Between 2010 and 2017, several education methods and tools were used to repeatedly inform the medical workforce (doctors and advanced clinical practitioners) in a single district general hospital regarding the treatment of anaphylaxis in adults. Whilst the senior staff remained largely the same cohort, junior staff had changed fully in every survey. Examples included: (i) formal teaching in Grand Rounds, during the junior doctors' induction process, and on advanced life support courses; (ii) in-situ simulation training performed by the clinical skills simulation team, comprising several ad hoc sessions and one 3-day event in 2017 visiting 16 separate clinical areas and performing an acute anaphylaxis scenario using actors, with around 100 individuals from multi-disciplinary teams involved; (iii) hospital-wide distribution of the simulation event via the Trust's Simulation Newsletter; (iv) laminated algorithms attached to the 'crash trolleys'; (v) a short email 'alert' sent to all medical staff 3 weeks prior to the survey detailing the emergency treatment of anaphylaxis; (vi) in addition, the performance of the surveys themselves represented a teaching opportunity where gaps in knowledge could be addressed. Face-to-face surveys were carried out in 2010 ('pre-intervention'), 2015, and 2017, on the latter two occasions including advanced clinical practitioners (ACPs). All surveys consisted of convenience samples. If verbal consent to conduct the survey was obtained, the medical practitioners' answers were recorded immediately on a data collection sheet. Results: There was a sustained improvement in the knowledge of the medical workforce from 2010 to 2017: answers improved regarding the correct drug by 11% (84%, 95%, and 95%); the correct route by 20% (76%, 90%, and 96%); the correct site by 40% (43%, 83%, and 83%); and the correct dose by 45% (27%, 54%, and 72%). Overall, knowledge of all components (correct drug, route, site, and dose) improved from 13% in 2010 to 62% in 2017. Conclusion: This survey comparison shows that knowledge of the medical workforce regarding adrenaline administration for the treatment of anaphylaxis in adults can be considerably improved by employing a variety of educational methods.
Keywords: adrenaline, anaphylaxis, epinephrine, medical education, patient safety
Procedia PDF Downloads 130
5065 Field-observed Thermal Fractures during Reinjection and Its Numerical Simulation
Authors: Wen Luo, Phil J. Vardon, Anne-Catherine Dieudonne
Abstract:
One key process that partly controls the success of geothermal projects is fluid reinjection, which helps in dealing with waste water, maintaining reservoir pressure, supplying heat-exchange media, etc. Thus, sustaining the injectivity is of great importance for the efficiency and sustainability of geothermal production. However, the injectivity is sensitive to the reinjection process. Field experience has illustrated that the injectivity can be damaged or improved. In this paper, the focus is on how the injectivity is improved. Since the injection pressure is far below the formation fracture pressure, hydraulic fracturing cannot be the mechanism contributing to the increase in injectivity. Instead, thermal stimulation has been identified as the main contributor to improving the injectivity. For low-enthalpy geothermal reservoirs, which are not fracture-controlled, thermal fracturing, rather than thermal shearing, is expected to be the mechanism for increasing injectivity. In this paper, field data from the sedimentary low-enthalpy geothermal reservoirs in the Netherlands were analysed to show the occurrence of thermal fracturing due to the cooling shock during reinjection. Injection data were collected and compared to show the effects of the thermal fractures on injectivity. Then, a thermo-hydro-mechanical (THM) model for the near-field formation was developed and solved by the finite element method to simulate the observed thermal fractures. It was then compared with the HM model, decomposed from the THM model, to illustrate the thermal effects on thermal fracturing. Finally, the effects of operational parameters, i.e., injection temperature and pressure, on the changes in injectivity were studied on the basis of the THM model. The field data analysis and simulation results illustrate that thermal fracturing occurred during reinjection and contributed to the increase in injectivity. The injection temperature was identified as a key parameter that contributes to thermal fracturing.
Keywords: injectivity, reinjection, thermal fracturing, thermo-hydro-mechanical model
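For orientation, the magnitude of the cooling-induced tensile stress that drives such thermal fracturing is often estimated with the standard thermo-elastic expression for a laterally constrained formation; this is a first-order estimate, not the full THM model used in the paper:

```latex
\sigma_{\mathrm{th}} = \frac{E\,\alpha\,\Delta T}{1-\nu},
```

where E is Young's modulus, α the linear thermal expansion coefficient, ν Poisson's ratio, and ΔT the temperature drop between the reservoir and the injected fluid. Fracturing becomes possible once this thermal stress, superposed on the injection pressure, overcomes the local minimum effective stress plus the tensile strength of the rock, which is why fractures can open even when the injection pressure alone is far below the formation fracture pressure.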
Procedia PDF Downloads 219
5064 Ethanol Chlorobenzene Dosimeter Usage for Measuring Dose of the Intraoperative Linear Electron Accelerator System
Authors: Mojtaba Barzegar, Alireza Shirazi, Saied Rabi Mahdavi
Abstract:
Intraoperative radiation therapy (IORT) is an innovative treatment modality involving the delivery of a large single dose of radiation to the tumor bed during surgery. The success of radiotherapy depends on the absorbed dose delivered to the tumor. Achieving better accuracy in patient treatment depends upon the dose measured by a standard dosimeter such as an ionization chamber, but because of the high density of electric charge per pulse produced by the accelerator in the ionization chamber volume, the standard correction factor for ion recombination, Ksat, calculated with the classic two-voltage method is overestimated, so the use of dose-per-pulse-independent dosimeters such as the chemical Fricke and ethanol chlorobenzene (ECB) dosimeters has been suggested. Dose measurement is usually calculated and calibrated at Zmax. Ksat was calculated by comparison of the ion chamber response and the ECB dosimeter at each applicator angle, size, and dose. The relative output factors for IORT applicators have been calculated and compared with experimentally determined values and with the results simulated by Monte Carlo software. The absorbed doses have been calculated and measured with statistical uncertainties of less than 0.7% and 2.5%, respectively. The relative differences between calculated and measured OFs were up to 2.5%; for major OFs the agreement was better. Under these conditions, together with the relative absorbed dose calculations, the OFs could be considered an indication that the IORT electron beams have been well simulated. These investigations demonstrate that the full Monte Carlo simulation of the accelerator head, together with the ECB dosimeter, allows us to obtain detailed information on clinical IORT beams.
Keywords: intraoperative radiotherapy, ethanol chlorobenzene, Ksat, output factor, Monte Carlo simulation
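For reference, the relative output factor discussed above is conventionally defined at the depth of dose maximum as the ratio of the dose delivered by a given applicator to that of the reference applicator (standard definition, with symbols chosen here for illustration):

```latex
\mathrm{OF}(\text{applicator}) = \frac{D_{\text{applicator}}(z_{\max})}{D_{\text{reference}}(z_{\max})}.
```

The saturation correction Ksat multiplies the chamber reading to account for charge lost to ion recombination; in very high dose-per-pulse beams such as IORT electrons, the assumptions behind the classic two-voltage estimate break down, which is why a dose-per-pulse-independent chemical dosimeter is used here as the benchmark.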
Procedia PDF Downloads 480
5063 Quantitative, Preservative Methodology for Review of Interview Transcripts Using Natural Language Processing
Authors: Rowan P. Martnishn
Abstract:
During the execution of a National Endowment for the Arts grant, approximately 55 interviews were collected from professionals across various fields. These interviews were used to create deliverables – historical connections for creations that began as art and evolved entirely into computing technology. With dozens of hours' worth of transcripts to be analyzed by qualitative coders, a quantitative methodology was created to sift through the documents. The initial step was to both clean and format all the data. First, a basic spelling and grammar check was applied, as well as a Python script for normalized formatting which used an open-source grammatical formatter to make the data as coherent as possible. Ten documents were randomly selected for manual review, where words often incorrectly translated during the transcription were recorded and replaced throughout all other documents. Then, to remove all banter and side comments, the transcripts were spliced into paragraphs (separated by change in speaker) and all paragraphs with fewer than 300 characters were removed. Secondly, a keyword extractor, a form of natural language processing where significant words in a document are selected, was run on each paragraph for all interviews. Every proper noun was put into a data structure corresponding to that respective interview. From there, a Bidirectional and Auto-Regressive Transformer (B.A.R.T.) summary model was then applied to each paragraph that included any of the proper nouns selected from the interview. At this stage, the information to review had been reduced from about 60 hours' worth of data to 20. The data was further processed through light, manual observation – any summaries which proved to fit the criteria of the proposed deliverable were selected, as well as their locations within the document. This narrowed the data down to about 5 hours' worth of processing. The qualitative researchers were then able to find 8 more connections in addition to our previous 4, exceeding our minimum quota of 3 to satisfy the grant. Major findings of the study and subsequent curation of this methodology raised a conceptual finding crucial to working with qualitative data of this magnitude. In the use of artificial intelligence there is a general trade-off in a model between breadth of knowledge and specificity. If the model has too much knowledge, the user risks leaving out important data (too general). If the tool is too specific, it has not seen enough data to be useful. Thus, this methodology proposes a solution to this trade-off. The data is never altered outside of grammatical and spelling checks. Instead, the important information is marked, creating an indicator of where the significant data is without compromising the purity of it. Secondly, the data is chunked into smaller paragraphs, giving specificity, and then cross-referenced with the keywords (allowing generalization over the whole document). This way, no data is harmed, and qualitative experts can go over the raw data instead of using highly manipulated results. Given the success in deliverable creation as well as the circumvention of this trade-off, this methodology should stand as a model for synthesizing qualitative data while maintaining its original form.
Keywords: B.A.R.T. model, keyword extractor, natural language processing, qualitative coding
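A minimal sketch of the two NLP steps described (proper-noun extraction followed by BART summarization of paragraphs that mention them) is shown below. The specific libraries and model checkpoint, spaCy's en_core_web_sm and facebook/bart-large-cnn, are common choices assumed here for illustration, not necessarily the ones used in the grant work:

```python
import spacy
from transformers import pipeline

nlp = spacy.load("en_core_web_sm")
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

paragraph = ("Ada Lovelace worked with Charles Babbage on the Analytical Engine, "
             "and her notes are often cited as an early link between art, "
             "mathematics, and computing.")

# Step 1: collect proper nouns from the paragraph.
doc = nlp(paragraph)
proper_nouns = {tok.text for tok in doc if tok.pos_ == "PROPN"}
print("proper nouns:", proper_nouns)

# Step 2: summarize only paragraphs that contain at least one proper noun.
if proper_nouns:
    summary = summarizer(paragraph, max_length=40, min_length=5, do_sample=False)
    print("summary:", summary[0]["summary_text"])
```

Marking paragraphs by their proper nouns and summarizing only those keeps the original transcripts intact, which is the "preservative" aspect of the methodology.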
Procedia PDF Downloads 32
5062 Assessing the Structure of Non-Verbal Semantic Knowledge: The Evaluation and First Results of the Hungarian Semantic Association Test
Authors: Alinka Molnár-Tóth, Tímea Tánczos, Regina Barna, Katalin Jakab, Péter Klivényi
Abstract:
Supported by neuroscientific findings, the so-called Hub-and-Spoke model of the human semantic system is based on two subcomponents of semantic cognition, namely the semantic control process and semantic representation. Our semantic knowledge is multimodal in nature, as the knowledge system stored in relation to a concept is extensive and broad, while different aspects of the concept may be relevant depending on the purpose. The motivation of our research is to develop a new diagnostic measurement procedure based on the preservation of semantic representation, which is appropriate to the specificities of the Hungarian language and which can be used to compare the non-verbal semantic knowledge of healthy and aphasic persons. The development of the test will broaden the Hungarian clinical diagnostic toolkit, which will allow for more specific therapy planning. The sample of healthy persons (n=480) was determined from the latest census data to ensure the representativeness of the sample. Based on the concept of the Pyramids and Palm Trees Test, and according to the characteristics of the Hungarian language, we have elaborated a test based on different types of semantic information, in which the subjects are presented with three pictures: they have to choose, from the two lower options, the one that best fits the target word above, based on the semantic relation defined. We have measured 5 types of semantic knowledge representations: associative relations, taxonomy, motional representations, and concrete as well as abstract verbs. As the first step in our data analysis, we examined the normal distribution of our results, and since they were not normally distributed (p < 0.05), we used nonparametric statistics further into the analysis. Using descriptive statistics, we could determine the frequency of the correct and incorrect responses, and with this knowledge, we could later adjust and remove the items of questionable reliability. The reliability was tested using Cronbach's α, and it can be safely said that all the results were in an acceptable range of reliability (α = 0.6-0.8). We then tested for potential gender differences using the Mann-Whitney U test; however, we found no difference between the two (p > 0.05). Likewise, we did not see that age had any effect on the results using one-way ANOVA (p > 0.05); however, the level of education did influence the results (p < 0.05). The relationships between the subtests were observed by the nonparametric Spearman's rho correlation matrix, showing statistically significant correlations between the subtests (p < 0.05), signifying a linear relationship between the measured semantic functions. A margin of error of 5% was used in all cases. The research will contribute to the expansion of the clinical diagnostic toolkit and will be relevant for the individualised therapeutic design of treatment procedures. The use of a non-verbal test procedure will allow an early assessment of the most severe language conditions, which is a priority in the differential diagnosis. The measurement of reaction time is expected to advance prodrome research, as the tests can be easily conducted in the subclinical phase.
Keywords: communication disorders, diagnostic toolkit, neurorehabilitation, semantic knowledge
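The nonparametric workflow described (Cronbach's α for reliability, the Mann-Whitney U test for group differences, and Spearman's ρ for subtest correlations) can be sketched as follows on synthetic data; the numbers are placeholders, not the study's data:

```python
import numpy as np
from scipy.stats import mannwhitneyu, spearmanr

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2D array, rows = participants, columns = test items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
scores = rng.binomial(1, 0.75, size=(480, 20)).astype(float)   # 480 subjects, 20 binary items
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")

# Gender difference on total score (two synthetic groups).
totals = scores.sum(axis=1)
group = rng.integers(0, 2, size=480)
u_stat, p_val = mannwhitneyu(totals[group == 0], totals[group == 1])
print(f"Mann-Whitney U: U={u_stat:.0f}, p={p_val:.3f}")

# Correlation between two subtest scores.
rho, p_rho = spearmanr(scores[:, :10].sum(axis=1), scores[:, 10:].sum(axis=1))
print(f"Spearman rho: {rho:.2f}, p={p_rho:.3f}")
```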
Procedia PDF Downloads 105
5061 Detection of Clipped Fragments in Speech Signals
Authors: Sergei Aleinik, Yuri Matveev
Abstract:
In this paper a novel method for the detection of clipping in speech signals is described. It is shown that the new method has better performance than known clipping detection methods, is easy to implement, and is robust to changes in signal amplitude, size of data, etc. Statistical simulation results are presented.
Keywords: clipping, clipped signal, speech signal processing, digital signal processing
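As a point of reference, a very simple baseline clipping detector looks for an abnormal concentration of samples at the extremes of the amplitude histogram; the sketch below implements that baseline (it is not the novel method proposed in the paper):

```python
import numpy as np

def clipping_ratio(signal: np.ndarray, threshold: float = 0.99) -> float:
    """Fraction of samples lying within `threshold` of the peak amplitude.

    Clean speech places very little probability mass at the extremes of its
    amplitude distribution, so a large ratio suggests clipping.
    """
    peak = np.max(np.abs(signal))
    if peak == 0:
        return 0.0
    return float(np.mean(np.abs(signal) >= threshold * peak))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 16000)
speechlike = 0.5 * np.sin(2 * np.pi * 220 * t) + 0.1 * rng.standard_normal(t.size)

clean = speechlike
clipped = np.clip(speechlike, -0.4, 0.4)   # hard clipping at an amplitude of 0.4

print(f"clean   : ratio = {clipping_ratio(clean):.4f}")
print(f"clipped : ratio = {clipping_ratio(clipped):.4f}")
```

Such histogram-based baselines are what a new detector would typically be benchmarked against in the statistical simulations mentioned above.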
Procedia PDF Downloads 395
5060 Exploring Academic Writing Challenges of First Year English as an Additional Language Students at an ODeL Institution in South Africa
Authors: Tumelo Jaquiline Ntsopi
Abstract:
This study explored the academic writing challenges of first-year students who use English as an Additional Language (EAL) registered in the EAW101 module at an ODeL institution. Research shows that academic writing is a challenge in EAL teaching and learning contexts at higher education institutions (HEIs) across the globe. Academic writing is an important aspect of academic literacy in any institution of higher learning, and more so in an ODeL institution. This has prompted research showing that academic writing continues to pose challenges in EAL teaching and learning contexts in higher education institutions. This study stems from the researcher's experience in teaching academic writing to first-year students in the EAW101 module. The motivation for this study emerged from the fact that EAW101 is a writing module with a high number of students in the Department of English Studies and an average pass rate of between 50 and 80 percent. These statistics support the argument that most students registered in this module struggle with academic writing and need intervention to assist and support them in achieving competence in the module. This study is underpinned by the Community of Inquiry (CoI) framework and transactional distance theory. This study adopted a qualitative research methodology and utilised a case study approach as a research design. Furthermore, the study gathered data from first-year students and the EAW101 module's student support initiatives. To collect data, focus group discussions, structured open-ended evaluation questions, and an observation schedule were used. The study is vital for exploring the academic writing challenges that first-year students in EAW101 encounter, so that lecturers in the module may consider re-evaluating their methods of teaching to improve EAL students' academic writing skills. This study may help lecturers enhance academic writing in an ODeL context by assisting first-year students through student support interventions.
Keywords: academic writing, academic writing challenge, ODeL, EAL
Procedia PDF Downloads 108
5059 Simulation Of A Renal Phantom Using the MAG 3
Authors: Ati Moncef
Abstract:
We describe in this paper the results of a dynamic renal phantom study with MAG3. Our phantom consisted of two kidney shapes and one liver. These phantoms were scanned with static and dynamic protocols and compared with clinical data. Under normal conditions, using our phantoms, it is possible to acquire renal images that can be compared with clinical scintigraphy. In conclusion, the renal phantom can also be used in the quality control of renal scintigraphy.
Keywords: renal scintigraphy, MAG3, nuclear medicine, gamma camera
Procedia PDF Downloads 402
5058 Modeling Revolution Shell Structures by MATLAB Programming-Axisymmetric and Nonaxisymmetric Shells
Authors: Hamadi Djamal, Labiodh Bachir, Ounis Abdelhafid, Chaalane Mourad
Abstract:
The objective of this work is to make the finite element CAXI_L numerically operational for axisymmetric and nonaxisymmetric shells. This element is based on the Reissner-Mindlin theory and a mixed model formulation. The MATLAB language is used for the programming. In order to test the elaborated program, some applications are carried out.
Keywords: axisymmetric shells, nonaxisymmetric behaviour, finite element, MATLAB programming
Procedia PDF Downloads 321
5057 The Impact of Sedimentary Heterogeneity on Oil Recovery in Basin-plain Turbidite: An Outcrop Analogue Simulation Case Study
Authors: Bayonle Abiola Omoniyi
Abstract:
In turbidite reservoirs with volumetrically significant thin-bedded turbidites (TBTs), thin-pay intervals may be underestimated during the calculation of reserve volume due to the poor vertical resolution of conventional well logs. This paper demonstrates the strong control of bed-scale sedimentary heterogeneity on oil recovery using six facies distribution scenarios that were generated from outcrop data from the Eocene Itzurun Formation, Basque Basin (northern Spain). The variable net sand volume in these scenarios serves as a primary source of sedimentary heterogeneity impacting the sandstone-mudstone ratio, sand and shale geometry and dimensions, lateral and vertical variations in bed thickness, and attribute indices. The attributes provided input parameters for modeling the scenarios. The models are 20 m (65.6 ft) thick. Simulation of the scenarios reveals that oil production is markedly enhanced where the degree of sedimentary heterogeneity and the resultant permeability contrast are low, as exemplified by Scenarios 1, 2, and 3. In these scenarios, bed architecture encourages better apparent vertical connectivity across intervals of laterally continuous beds. By contrast, the low net-to-gross Scenarios 4, 5, and 6 have rapidly declining oil production rates and higher water cut, with more oil effectively trapped in low-permeability layers. These scenarios may possess enough lateral connectivity to enable injected water to sweep oil to the production well; such sweep is achieved at the cost of high water production. It is therefore imperative to consider not only the net-to-gross threshold but also the facies stacking pattern and related attribute indices to better understand how to effectively manage water production for optimum oil recovery from basin-plain reservoirs.
Keywords: architecture, connectivity, modeling, turbidites
Procedia PDF Downloads 28
5056 Experimental Study and Numerical Simulation of the Reaction and Flow on the Membrane Wall of Entrained Flow Gasifier
Authors: Jianliang Xu, Zhenghua Dai, Zhongjie Shen, Haifeng Liu, Fuchen Wang
Abstract:
In an entrained flow gasifier, the combustible components are converted into the gas phase, and the mineral content is converted into ash. Most of the ash particles or droplets are deposited on the refractory or membrane wall and form a slag layer that flows down to the quenching system. The reaction of captured particles and the flow and phase transformation of the slag play an important role in gasifier performance and in safe and stable operation. The reaction characteristics of captured char particles on molten slag were studied by applying a high-temperature stage microscope. The gasification of captured chars with CO2 on the slag surface was observed and recorded, and compared to the gasification of the original char. The particle size evolution and heat transfer process are discussed, and the gasification reaction index of the captured char particles is modeled. Analysis of the reaction index showed that the molten slag layer promoted char reactivity. Coupled with a heat transfer analysis, a shrinking particle model (SPM) was applied and modified to predict the gasification time at a carbon conversion of 0.9, and the results showed agreement with the experimental data. A comprehensive model combining gas-particle-slag flow and reaction models was used to model different industrial gasifiers. The carbon conversion distribution in space and on the slag layer surface is investigated. Slag flow characteristics, such as slag velocity, molten slag thickness, and slag temperature distribution on the membrane wall and refractory brick, are discussed.
Keywords: char, slag, numerical simulation, gasification, wall reaction, membrane wall
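For context, the classical shrinking particle model referred to above gives, for a reaction controlled by the surface step on a nonporous shrinking particle, the standard textbook conversion-time relation (the paper's modified SPM is not reproduced here):

```latex
\frac{t}{\tau} = 1 - (1 - X)^{1/3}, \qquad
\tau = \frac{\rho_{\mathrm{B}}\, R_0}{b\, k_{\mathrm{s}}\, C_{\mathrm{A}}},
```

where X is the carbon conversion, R0 the initial particle radius, ρB the molar density of the solid, b the stoichiometric coefficient, ks the surface rate constant, and CA the bulk CO2 concentration. Setting X = 0.9 gives the gasification time targeted in the comparison with the experimental data.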
Procedia PDF Downloads 310