Search results for: grammatical units
1160 Understanding the Processwise Entropy Framework in a Heat-powered Cooling Cycle
Authors: P. R. Chauhan, S. K. Tyagi
Abstract:
Adsorption refrigeration technology offers a sustainable and energy-efficient cooling alternative to traditional refrigeration technologies for meeting fast-growing cooling demands. With its ability to utilize natural refrigerants, low-grade heat sources, and modular configurations, it has the potential to revolutionize the cooling industry. Despite these benefits, the commercial viability of this technology is hampered by several fundamental constraints, including its large size, low uptake capacity, and poor performance resulting from deficient heat and mass transfer characteristics. The primary causes of these deficient heat and mass transfer characteristics, and the magnitude of exergy loss in the various real processes of an adsorption cooling system, can be assessed by entropy generation rate analysis, i.e., the second law of thermodynamics. Therefore, this article presents a second-law-based investigation in terms of entropy generation rate (EGR) to identify the energy losses in the various processes of the HPCC-based adsorption system, using MATLAB R2021b software. The adsorption cooling system consists of two beds made of silica gel and arranged in a single stage, while water is employed as the refrigerant, coolant, and hot fluid. The variation in process-wise EGR is examined against cycle time, and a comparative analysis is also presented. Moreover, the EGR is also evaluated in the external units, such as the heat source and heat sink units used for regeneration and heat rejection, respectively. The research findings revealed that the combination of adsorber and desorber, which operates across heat reservoirs with a higher temperature gradient, accounts for more than half of the total EGR. Moreover, the EGR caused by the heat transfer process is found to be the highest, followed by the heat sink, heat source, and mass transfer, respectively. 
In the case of the heat transfer process, the operation of the valve is found to be responsible for more than half (54.9%) of the overall EGR during heat transfer. The combined contribution of the external units, the source (18.03%) and the sink (21.55%), to the total EGR is 39.58%. The analysis and findings of the present research are expected to pinpoint the sources of energy waste in HPCC-based adsorption cooling systems.
Keywords: adsorption cooling cycle, heat transfer, mass transfer, entropy generation, silica gel-water
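The process-wise EGR figures above follow from the standard second-law balance for heat transfer across a finite temperature difference. As a minimal sketch (the heat duty and temperatures below are illustrative placeholders, not values from the paper):

```python
def entropy_generation_rate(q_dot, t_hot, t_cold):
    """Entropy generation rate (W/K) for steady heat transfer q_dot (W)
    from a reservoir at t_hot (K) to one at t_cold (K):
    S_gen = q_dot * (1/T_cold - 1/T_hot), which is positive whenever
    heat flows down a finite temperature gradient."""
    return q_dot * (1.0 / t_cold - 1.0 / t_hot)

# Illustrative only: 5 kW transferred from a 358 K hot-water loop
# to a 338 K adsorber bed.
egr = entropy_generation_rate(5000.0, 358.0, 338.0)
print(round(egr, 3))  # ~0.826 W/K
```

The larger the temperature gradient a component operates across, the larger this term, which is consistent with the adsorber/desorber pair dominating the total EGR.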
Procedia PDF Downloads 106
1159 Utility of Thromboelastography to Reduce Coagulation-Related Mortality and Blood Component Rate in Neurosurgery ICU
Authors: Renu Saini, Deepak Agrawal
Abstract:
Background: Patients with head and spinal cord injury frequently have deranged coagulation profiles and require blood product transfusion perioperatively. Thromboelastography (TEG) is a 'bedside' global test of coagulation which may have a role in deciding the need for transfusion in such patients. Aim: To assess the usefulness of TEG in a department of neurosurgery in decreasing transfusion rates and coagulation-related mortality in traumatic head and spinal cord injury. Methods: A retrospective comparative study was carried out in the department of neurosurgery over a period of 1 year. There are two groups in this study. The 'control' group comprises patients for whom data was collected over 6 months (1/6/2009-31/12/2009) prior to installation of the TEG machine. The 'test' group comprises patients for whom data was collected over 6 months (1/1/2013-30/6/2013) after TEG installation. The total numbers of platelet, FFP, and cryoprecipitate transfusions were noted in both groups, along with in-hospital mortality and length of stay. Results: Both groups were matched in age and sex of patients, number of head and spinal cord injury cases, number of patients with thrombocytopenia, and number of patients who underwent operation. A total of 178 patients (135 head injury and 43 spinal cord injury patients) were admitted to the neurosurgery department during the period June 2009 to December 2009, i.e., prior to TEG installation; after TEG installation, a total of 243 patients (197 head injury and 46 spinal cord injury patients) were admitted. After the introduction of TEG, platelet transfusion was significantly reduced compared to the control group (67 units to 34 units, p=0.000). Mortality was significantly reduced after installation (77 patients to 57 patients, p=0.000). Length of stay was also significantly reduced (range 1-211 days prior to installation versus 1-115 days after, p=0.02). 
Conclusion: Bedside TEG can dramatically reduce platelet transfusion requirements in a department of neurosurgery. TEG also led to a drastic decrease in mortality and length of stay in patients with traumatic head and spinal cord injuries. We recommend its use as a standard of care in patients with traumatic head and spinal cord injuries.
Keywords: blood component transfusion, mortality, neurosurgery ICU, thromboelastography
Procedia PDF Downloads 324
1158 Flow Sheet Development and Simulation of a Bio-refinery Annexed to a Typical South African Sugar Mill
Authors: M. Ali Mandegari, S. Farzad, J. F. Görgens
Abstract:
Sugar is one of the main agricultural industries in South Africa, and the livelihoods of approximately one million South Africans depend indirectly on the sugar industry, which is struggling economically and must reinvent itself in order to ensure long-term sustainability. A second-generation bio-refinery is defined as a process that uses waste fibrous biomass for the production of bio-fuel, chemicals, animal feed, and electricity. Bio-ethanol is by far the most widely used bio-fuel for transportation worldwide, and many of the challenges facing bio-ethanol production have been solved. A bio-refinery annexed to an existing sugar mill for the production of bio-ethanol and electricity is proposed to the sugar industry and is addressed in this study. Since flow-sheet development is the key element of the bio-ethanol process, in this work a bio-refinery (bio-ethanol and electricity production) annexed to a typical South African sugar mill, considering 65 ton/h of dry sugarcane bagasse and tops/trash as feedstock, was simulated. Aspen Plus™ V8.6 was applied as the simulator, and a realistic simulation development approach was followed to reflect the practical behavior of the plant. The latest results of other researchers concerning pretreatment, hydrolysis, fermentation, enzyme production, bio-ethanol production, and other supplementary units such as evaporation, water treatment, boiler, and steam/electricity generation were adopted to establish a comprehensive bio-refinery simulation. Steam explosion with SO2 was selected for pretreatment due to minimal inhibitor production, and a simultaneous saccharification and fermentation (SSF) configuration was adopted for the enzymatic hydrolysis and fermentation of the cellulose and hydrolysate. Bio-ethanol purification was simulated by two distillation columns with side streams, and fuel-grade bio-ethanol (99.5%) was achieved using molecular sieves in order to minimize capital and operating costs. 
Boiler and steam/power generation units were also modeled using industrial design data. The results indicate that 256.6 kg of bio-ethanol per ton of feedstock and 31 MW of surplus power were attained from the bio-refinery, while the process consumes 3.5, 3.38, and 0.164 GJ per ton of feedstock of hot utility, cold utility, and electricity, respectively. The developed simulation provides a starting point for a variety of analyses and developments in further studies.
Keywords: bio-refinery, bagasse, tops, trash, bio-ethanol, electricity
Procedia PDF Downloads 531
1157 X-Ray Photoelectron Spectroscopy Analyses of Candidate Materials for Advanced Nuclear Reactors
Authors: Marie Kudrnová, Jana Rejková
Abstract:
Samples of supplied INCONEL 601, 617, 625, and HASTELLOY C-22 alloys and an experimental nickel alloy, MoNiCr, were examined by XPS (X-ray photoelectron spectroscopy) before and after exposure. The experiment was performed in a LiCl-KCl salt mixture (58.2-41.8 wt. %). The exposure conditions were 440°C, a pressure of 0.2 MPa, and 500 hours in an inert argon atmosphere. The XPS analysis shows that a thin oxide layer composed of metal oxides such as NiO, Cr₂O₃, and Nb₂O₅ was formed. After sputtering the exposed surface with Ar ions, metals were also detected in the elemental state, indicating a very thin protective oxide layer with a thickness ranging from a few nanometers up to tens of nanometers.
Keywords: XPS, MSR, nickel alloy, metal oxides
Procedia PDF Downloads 75
1156 Gas Flotation Unit in Kuwait Oil Company Operations
Authors: Homoud Bourisli, Haitham Safar
Abstract:
Oil is one of the main sources of energy in the world. As conventional oil reserves dry up, enhanced oil recovery is crucial to maintaining the same level of oil production. Since water injection is one of the most commonly used methods to increase and maintain pressure in oil wells, oil-water separation of the water associated with oil production is essential for water-injection oil recovery. Therefore, gas flotation units are used for oil-water separation so that the treated water can be re-injected into the wells to maintain pressure.
Keywords: Kuwait oil company, dissolved gas flotation unit, induced gas flotation unit, oil-water separation
Procedia PDF Downloads 573
1155 Territorialisation and Elections: Land and Politics in Benin
Authors: Kamal Donko
Abstract:
In the frontier zone of the Republic of Benin, land appears to be a fundamental political resource, used as a tool for socio-political mobilization, blackmail, inclusion and exclusion, and conquest and political control. This paper examines the complex and intriguing interlinks between land, identity, and politics in central Benin, investigating the roles that territorialisation and land ownership play in the electioneering process there. It employs a multi-sited ethnographic approach to data collection, including observations, interviews, and focus group discussions. The research findings reveal a complex and intriguing relationship between land ownership and politics in central Benin, with land playing a key role in the electioneering process in the region. The study also uncovered many emerging socio-spatial patterns of controlling and maintaining political power in the zone that are tied to land politics. These include identity reconstruction and integration mechanisms through intermarriage, socio-political initiatives, and the construction of infrastructure of sovereignty. 'Diaspora organizations' and identity issues, the strategic creation of administrative units, alliance-building strategies, and the gerrymandering of the local political field were also found to be affecting relationships between migrant and native communities. The study argues that territorialisation is not only about national boundaries and the demarcation between different nation states; more importantly, it serves as a powerful tool of domination and political control at the grassroots level. 
Furthermore, this study provides another perspective from which the political situation in Africa can be studied. By investigating how the dynamics of land ownership influence politics at the grassroots or micro level, this study is fundamental to understanding spatial issues in the frontier zone.
Keywords: land, migration, politics, territorialisation
Procedia PDF Downloads 360
1154 An Optimal Hybrid EMS System for a Hyperloop Prototype Vehicle
Authors: J. F. Gonzalez-Rojo, Federico Lluesma-Rodriguez, Temoatzin Gonzalez
Abstract:
Hyperloop, a new mode of transport, is gaining significance. It consists of a ground-based transport system with a levitation system, which avoids rolling friction forces, enclosed in a tube whose inner atmosphere is controlled to lower aerodynamic drag forces. Hyperloop is thus proposed as a solution to the current limitations of ground transportation: the rolling and aerodynamic problems that limit the speed of traditional high-speed rail, or even maglev systems, are overcome by a hyperloop solution. Zeleros is one of the companies developing technology for hyperloop applications worldwide. It is working on a concept that reduces infrastructure cost and minimizes power consumption, as well as the losses associated with magnetic drag forces. For this purpose, Zeleros proposes a Hybrid ElectroMagnetic Suspension (EMS) for its prototype. In the present manuscript, an active and optimal electromagnetic suspension levitation method based on nearly-zero-power-consumption individual modules is presented. This system consists of several hybrid permanent magnet-coil levitation units that can be arranged along the vehicle. The proposed unit redirects the magnetic field along a defined direction, forming a magnetic circuit and minimizing the losses due to field dispersion. This is achieved using an electrical steel core. Each module can stabilize the gap distance using the coil current and either linear or non-linear control methods. The ratio between weight and levitation force for each unit is 1/10. In addition, the quotient between the lifted weight and power consumption at the target gap distance is 1/3 [kg/W]. One degree of freedom (DoF), along the gap direction, is controlled by a single unit. However, when several units are present, 5-DoF control (2 translational and 3 rotational) can be achieved, leading to full attitude control of the vehicle. 
The proposed system has been successfully tested, reaching TRL-4 on a laboratory test bench, and is currently at TRL-5 if the association of modules to control 5 DoF is considered.
Keywords: active optimal control, electromagnetic levitation, HEMS, high-speed transport, hyperloop
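The gap-stabilization idea described above can be sketched with a simple linear control law; the gains and gap values below are illustrative placeholders, not the prototype's actual controller:

```python
def pd_gap_current(gap_m, gap_rate, target_m, kp=800.0, kd=40.0):
    """Linear PD law for one hybrid permanent-magnet/coil module.
    The permanent magnet carries the static load, so the coil only
    corrects deviations from the target air gap; at the setpoint the
    commanded current is ~0 A, the near-zero-power operating point.
    Gains kp, kd are illustrative, not taken from the prototype."""
    error = gap_m - target_m           # positive: gap too wide, pull harder
    return kp * error + kd * gap_rate  # commanded coil current (A)

# At the target gap (10 mm here) with no vertical velocity,
# no corrective current is needed.
print(pd_gap_current(0.010, 0.0, 0.010))  # 0.0
```

A non-linear controller, as mentioned in the abstract, would replace this law but keep the same structure: measure the gap, command a coil current that restores the setpoint.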
Procedia PDF Downloads 144
1153 Portable and Parallel Accelerated Development Method for Field-Programmable Gate Array (FPGA)-Central Processing Unit (CPU)-Graphics Processing Unit (GPU) Heterogeneous Computing
Authors: Nan Hu, Chao Wang, Xi Li, Xuehai Zhou
Abstract:
The field-programmable gate array (FPGA) has been widely adopted in the high-performance computing domain. In recent years, embedded systems-on-a-chip (SoCs) have come to contain coarse-granularity multi-core CPUs (central processing units) and mobile GPUs (graphics processing units) that can be used as general-purpose accelerators. The motivation is that algorithms with various parallel characteristics can be efficiently mapped to a heterogeneous architecture coupling these three processors. The CPU and GPU offload some computationally intensive tasks from the FPGA to reduce resource consumption and lower the overall cost of the system. However, in common present-day scenarios, applications utilize only one type of accelerator, because development approaches supporting the collaboration of heterogeneous processors face challenges. A systematic approach is therefore needed that offers write-once-run-anywhere portability and high execution performance for the modules mapped to the various architectures, and that facilitates the exploration of the design space. In this paper, a servant-execution-flow model is proposed for abstracting the cooperation of the heterogeneous processors; it supports task partitioning, communication, and synchronization. At its first run, the intermediate language, represented by a data flow diagram, can generate executable code for the target processor or be converted into high-level programming languages. Instantiation parameters efficiently control the relationship between the modules and the computational units, including the mapping of the two hierarchical processing units and the adjustment of data-level parallelism. An embedded system for a three-dimensional waveform oscilloscope is selected as a case study. The performance of algorithms such as contrast stretching is analyzed with implementations on various combinations of these processors. 
The experimental results show that the heterogeneous computing system achieves, with less than 35% of the resources, performance similar to the pure-FPGA implementation and approximately the same energy efficiency.
Keywords: FPGA-CPU-GPU collaboration, design space exploration, heterogeneous computing, intermediate language, parameterized instantiation
Procedia PDF Downloads 116
1152 Q-Map: Clinical Concept Mining from Clinical Documents
Authors: Sheikh Shams Azam, Manoj Raju, Venkatesh Pagidimarri, Vamsi Kasivajjala
Abstract:
Over the past decade, there has been a steep rise in data-driven analysis in major areas of medicine, such as clinical decision support systems, survival analysis, patient similarity analysis, image analytics, etc. Most of the data in the field are well structured and available in numerical or categorical formats that can be used for experiments directly. But on the opposite end of the spectrum, there exists a wide expanse of data that is intractable for direct analysis owing to its unstructured nature; it is found in the form of discharge summaries, clinical notes, and procedural notes, which are written as human narratives and have neither a relational model nor any standard grammatical structure. An important step in utilizing these texts for such studies is to transform and process the data to retrieve structured information from the haystack of irrelevant data, using information retrieval and data mining techniques. To address this problem, the authors present Q-Map, a simple yet robust system that can sift through massive datasets with unregulated formats to retrieve structured information aggressively and efficiently. It is backed by an effective mining technique based on a string matching algorithm indexed on curated knowledge sources, which is both fast and configurable. The authors also briefly examine its comparative performance with MetaMap, one of the most reputed tools for medical concept retrieval, and present the advantages the former displays over the latter.
Keywords: information retrieval, unified medical language system, syntax based analysis, natural language processing, medical informatics
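A minimal sketch of the kind of dictionary-indexed string matching Q-Map builds on is shown below; the concept strings and codes are toy placeholders, not UMLS content, and the real system's indexing and configuration are far richer:

```python
import re

def build_index(knowledge_source):
    """Index curated concept strings by their first token, so matching
    a document only probes candidate concepts that can start at each
    token (a simplified stand-in for Q-Map's indexed string matching)."""
    index = {}
    for concept, code in knowledge_source.items():
        words = concept.split()
        index.setdefault(words[0], []).append((words, code))
    return index

def extract_concepts(text, index):
    """Scan tokenized text and emit (concept, code) pairs for every
    exact multi-word match against the indexed knowledge source."""
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    hits = []
    for i, tok in enumerate(tokens):
        for words, code in index.get(tok, []):
            if tokens[i:i + len(words)] == words:
                hits.append((" ".join(words), code))
    return hits

# Toy knowledge source with made-up codes (not real UMLS identifiers).
ks = {"myocardial infarction": "C001", "diabetes mellitus": "C002"}
idx = build_index(ks)
note = "History of diabetes mellitus; rule out myocardial infarction."
print(extract_concepts(note, idx))
```

First-token indexing keeps the scan linear in the document length, which is one way a string-matching miner stays fast on massive datasets.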
Procedia PDF Downloads 133
1151 The Evolution and Driving Forces Analysis of Urban Spatial Pattern in Tibet Based on Archetype Theory
Authors: Qiuyu Chen, Bin Long, Junxi Yang
Abstract:
Located in the southwest of the "roof of the world", Tibet is the center of origin of Tibetan culture. Lhasa, Shigatse, and Gyantse are three famous historical and cultural cities in Tibet. They have always been prominent political, economic, and cultural cities, and they have accumulated the unique aesthetic orientation and value consciousness of Tibet's urban construction. "Archetype" usually refers to the theoretical origin of things, a collective unconscious precipitation. Archetype theory fundamentally explores the dialectical relationship between image expression, original form, and behavior mode. By abstracting and describing the typical phenomena or imagery of an archetype object, one can observe the essence of objects and explore the ways in which their phenomena arise. Applying archetype theory to the field of urban planning helps to gain insight into, evaluate, and restructure the complex and ever-changing internal structural units of cities. Existing field investigations have found that the Dzong, temple, Linka, and traditional residential systems are important structural units constituting the urban space of Lhasa, Shigatse, and Gyantse. This article applies the thinking of archetype theory: starting from the imagery expression of the urban spatial pattern, it uses technologies such as ArcGIS, Depthmap, and computer vision to identify the spatial representation and plan relationships of the three cities from remote sensing images and historical maps. Based on historical records, the spatial characteristics of the cities in different historical periods are interpreted hierarchically, in an attempt to clarify the origin of the formation and evolution of urban pattern imagery from the perspectives of geopolitical environment, social structure, religious theory, etc., and to expose the growth laws and key driving forces of the cities. 
The research results can provide technical and material support for important activities such as urban restoration, spatial intervention, and the promotion of transformation in the region.
Keywords: archetype theory, urban spatial imagery, original form and pattern, behavioral driving force, Tibet
Procedia PDF Downloads 64
1150 Developing Oral Communication Competence in a Second Language: The Communicative Approach
Authors: Ikechi Gilbert
Abstract:
Oral communication is the transmission of ideas or messages through the speech process. Acquiring competence in this area, which by its volatile nature is prone to errors and inaccuracies, requires the adoption of a well-suited teaching methodology. Efficient oral communication facilitates the exchange of ideas and the easy accomplishment of day-to-day tasks through a demonstrated mastery of oral expression and the delivery of fine presentations to audiences or individuals, while recognizing the verbal signals and body language of others and interpreting them correctly. In Anglophone states such as Nigeria and Ghana, the French language, for instance, is studied as a foreign language by learners whose mother tongue is different from French. The same applies in Francophone states, where English is studied as a foreign language by people whose official language or mother tongue is different from English. The ideal approach is to teach these languages in these environments through a pedagogy that properly takes care of the oral perspective, for effective understanding and application by the learners. In this article, we examine the communicative approach as a methodology for teaching oral communication in a foreign language. This method is a direct response to the communicative needs of the learner, involving the use of appropriate materials and teaching techniques that meet those needs. It is also a vivid improvement on the traditional grammatical and audio-visual adaptations. Our contribution focuses on the pedagogical component of oral communication improvement, highlighting its merits and proposing diverse techniques, including aspects of information and communication technology, that would help the second language learner communicate better orally.
Keywords: communication, competence, methodology, pedagogical component
Procedia PDF Downloads 264
1149 Classifying and Predicting Efficiencies Using Interval DEA Grid Setting
Authors: Yiannis G. Smirlis
Abstract:
The classification and prediction of efficiencies in Data Envelopment Analysis (DEA) is an important issue, especially in large-scale problems or when new units frequently enter the set under assessment. In this paper, we contribute to the subject by proposing a grid structure based on interval segmentations of the range of values for the inputs and outputs. Such intervals, combined, define hyper-rectangles that partition the space of the problem. This structure, exploited by interval DEA models and a dominance relation, acts as a DEA pre-processor, enabling the classification and prediction of efficiency scores without applying any DEA models.
Keywords: data envelopment analysis, interval DEA, efficiency classification, efficiency prediction
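The pre-processing idea can be sketched with the dominance relation alone; the input/output vectors below are made up for illustration, and the full method additionally uses the interval grid and interval DEA models:

```python
def dominates(a, b):
    """Unit a = (inputs, outputs) dominates unit b if it uses no more
    of any input and produces no less of any output, and is strictly
    better in at least one dimension. This is the pre-processing
    relation, not a full DEA model."""
    ai, ao = a
    bi, bo = b
    no_worse = (all(x <= y for x, y in zip(ai, bi))
                and all(x >= y for x, y in zip(ao, bo)))
    strictly = (any(x < y for x, y in zip(ai, bi))
                or any(x > y for x, y in zip(ao, bo)))
    return no_worse and strictly

def classify(new_unit, units):
    """Predict 'inefficient' without running DEA if an existing unit
    dominates the new one; otherwise it remains an efficiency candidate."""
    return "inefficient" if any(dominates(u, new_unit) for u in units) else "candidate"

# Two illustrative units: (inputs, outputs).
units = [((2.0, 3.0), (10.0,)), ((4.0, 5.0), (8.0,))]
print(classify(((3.0, 4.0), (9.0,)), units))  # dominated by the first unit
print(classify(((1.0, 6.0), (9.0,)), units))  # no unit dominates it
```

Because dominated units can never be DEA-efficient, this check lets many newly entering units be classified immediately, which is the pre-processor's payoff in large-scale problems.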
Procedia PDF Downloads 163
1148 Inertial Motion Capture System for Biomechanical Analysis in Rehabilitation and Sports
Authors: Mario Sandro F. Rocha, Carlos S. Ande, Anderson A. Oliveira, Felipe M. Bersotti, Lucas O. Venzel
Abstract:
Inertial motion capture (mocap) systems are among the most suitable tools for quantitative clinical analysis in rehabilitation and sports medicine. The inertial measurement units (IMUs), composed of accelerometers, gyroscopes, and magnetometers, are able to measure spatial orientations and calculate displacements with sufficient precision for applications in the biomechanical analysis of movement. Furthermore, this type of system is relatively affordable and has the advantages of portability and independence from external references. In this work, we present the latest version of our inertial motion capture system, based on the foregoing technology, with a Unity interface designed for rehabilitation and sports. In our hardware architecture, only one serial port is required. First, the board client must be connected to the computer by a USB cable. Next, an available serial port is configured and opened to establish communication between the client and the application, and the client then starts scanning for the active MOCAP_S servers around it. The servers play the role of the inertial measurement units that capture the movements of the body and send the data to the client, which in turn creates a package composed of the ID of the server, the current timestamp, and the motion capture data defined in the client's pre-configuration of the capture session. In the current version, we can measure the game rotation vector (grv) and linear acceleration (lacc), and we also have a step detector that can be enabled or disabled. The grv data are processed and directly linked to the bones of the 3D model and, along with the lacc and step detector data, are also used to perform the calculations of displacements and other variables shown on the graphical user interface. 
Our user interface was designed to calculate and present variables that are important for rehabilitation and sports, such as cadence, speed, total gait cycle, gait cycle length, obliquity and rotation, and center of gravity displacement. Our goal is to present a low-cost, portable, wearable system with a friendly interface for applications in biomechanics and sports that also performs as a product of high precision and low energy consumption.
Keywords: biomechanics, inertial sensors, motion capture, rehabilitation
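The displacement and cadence calculations described above can be sketched as follows; this is a simplified illustration that ignores the drift correction and sensor fusion a production IMU pipeline needs:

```python
def displacement_from_lacc(lacc, dt):
    """Double trapezoidal integration of linear-acceleration samples
    (m/s^2, already gravity-compensated, sampled every dt seconds)
    into displacement (m). A real pipeline must also correct sensor
    bias and integration drift; this shows only the core calculation."""
    vel = [0.0]
    for a0, a1 in zip(lacc, lacc[1:]):
        vel.append(vel[-1] + 0.5 * (a0 + a1) * dt)   # integrate accel -> velocity
    disp = 0.0
    for v0, v1 in zip(vel, vel[1:]):
        disp += 0.5 * (v0 + v1) * dt                 # integrate velocity -> displacement
    return disp

def cadence_spm(steps, duration_s):
    """Cadence in steps per minute from the step-detector count."""
    return 60.0 * steps / duration_s

# Constant 1 m/s^2 for 1 s (11 samples at 100 ms) gives ~0.5 m,
# matching d = a*t^2/2.
print(round(displacement_from_lacc([1.0] * 11, 0.1), 3))  # 0.5
print(cadence_spm(55, 30.0))                              # 110.0
```

Double integration is exactly why drift handling matters in practice: any constant bias in `lacc` grows quadratically in the displacement estimate.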
Procedia PDF Downloads 139
1147 Subsidizing Local Health Policy Programs as a Public Management Tool in the Polish Health Care System
Authors: T. Holecki, J. Wozniak-Holecka, P. Romaniuk
Abstract:
Due to the highly centralized model of financing health care in Poland, local self-governments rarely undertook their own initiatives in the field of public health, particularly health promotion. Since 2017, however, they have been allowed to apply for a subsidy for health policy programs, with the additional resources retrieved from the National Health Fund, the dominant payer in the health system. The amount of the subsidy depends on the number of inhabitants in a given unit and amounts to about 40% of the total cost of the program. The aim of this paper is to assess the impact of the newly implemented solutions in financing health policy on the management of public finances, as well as on the health promotion activity of local self-governments. An effort was made to estimate the amounts that both local governments and the National Health Fund spent on local health policy programs while implementing the new solutions. The research method is the analysis of financial data obtained from the National Health Fund and from local government units, as well as of reports published by the Agency for Health Technology Assessment and Pricing, which holds substantive control over the health policy programs and issues permission for their implementation. The study was based on a comparative analysis of expenditures on the implementation of health programs in Poland in the years 2010-2018. The presentation of the results includes the average annual expenditure of local government units per inhabitant, the total number of positively evaluated applications, and the percentage share in the total expenditures of local governments (16 voivodeship areas). The most essential purpose is to determine whether the assumptions of the subsidy program work correctly in practice, and what the real effects of introducing the legislative changes are at the local government level in the context of public health tasks. 
The assumption of the study was that the use of this new motivation tool in the field of public management would multiply the resources invested in the provision of health policy programs. Preliminary conclusions show that financial expenditures changed significantly after the introduction of public funding at the level of 40%, with funding from local governments' own funds increasing by 80 to 90%.
Keywords: health care system, health policy programs, local self-governments, public health management
Procedia PDF Downloads 155
1146 Food Processing Technology and Packaging: A Case Study of Indian Cashew-Nut Industry
Authors: Parashram Jakappa Patil
Abstract:
India is the global leader in the world cashew business, and the cashew-nut industry is one of the most important food processing industries in the world. India is the largest producer, processor, exporter, and importer of cashew in the world; it supplies cashew to the rest of the world and meets world demand. India has tremendous potential for cashew production and export to other countries. Every year India earns more than 2,000 crore rupees through the cashew trade. The cashew industry is one of the important small-scale industries in the country, playing a significant role in rural development. It generates more than 400,000 jobs in remote areas, and 95% of cashew workers are women; it provides income to poor cashew farmers; the majority of cashew processing units are small or cottage-scale; it helps stem the migration of young farmers in search of employment opportunities; it motivates rural entrepreneurship development; and it also contributes to environmental protection. Hence, the Indian cashew business is a very important agribusiness with the potential to drive inclusive development. The World Bank and IMF have recognized the cashew-nut industry as an important tool for poverty eradication at the global level, which shows the importance of the cashew business and its strong presence in India. In spite of such huge potential, the cashew processing industry faces various problems, such as weak infrastructure, an insufficient supply and difficult collection of raw cashew, lack of available finance, unavailability of warehouses, marketing of cashew kernels, lack of technical knowledge, and, especially, processing technology and packaging of finished products. 
The industry has great prospects, including scope for more cashew cultivation and production, employment generation, the formation of cashew processing units, alcohol production from cashew apples, shell oil production, rural development, poverty elimination, the development of socially and economically backward classes, and environmental protection. The industry has domestic as well as foreign markets, and India has tremendous potential in this regard. Cashew is a poor man's crop but a rich man's food; it is a source of income and livelihood for poor farmers, and the cashew-nut industry may play a very important role in the development of hilly regions. The objectives of this paper are to identify the problems of cashew processing and the use of processing technology, the problems of cashew kernel packaging, the evolution of cashew processing technology over the years and its impact on the final product, and the impact of good processing, achieved by adopting appropriate technology and packaging, on the international trade in cashew nuts. The most important problems of the cashew processing industry are processing and packaging. Bad processing greatly reduces the quality of cashew kernels, in particular by breaking kernels, which fetch a much lower price than whole kernels and are not eligible for export. On the other hand, without good packaging, cashew kernels absorb moisture, which destroys their taste. International trade in cashew nuts depends on two things: cashew processing and packaging. This study has strong relevance because the cashew-nut industry is labour-oriented; processing technology has so far played a minor role, since 95% of the processing work is manual. Processing has therefore depended on the physical performance of workers, making the presence of a large workforce inevitable, and many cashew processing units have closed because they cannot secure a sufficient workforce. 
However, due to advancement in technology, this picture is slowly changing and processing work is improving. It is therefore interesting to explore all these aspects in the context of cashew processing and packaging in the cashew business.
Keywords: cashew, processing technology, packaging, international trade, change
Procedia PDF Downloads 420
1145 Multi-Objective Optimization of Combined System Reliability and Redundancy Allocation Problem
Authors: Vijaya K. Srivastava, Davide Spinello
Abstract:
This paper presents an established 3ⁿ enumeration procedure for mixed-integer optimization problems, applied to a multi-objective reliability and redundancy allocation problem subject to design constraints. The formulated problem is to find the optimum level of unit reliability and the number of units for each subsystem. A number of illustrative examples are provided and compared to demonstrate the applicability and superiority of the proposed method.
Keywords: integer programming, mixed integer programming, multi-objective optimization, reliability redundancy allocation
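As an illustration of the enumeration idea described in this abstract, the sketch below exhaustively enumerates redundancy levels for a small series-parallel system under a cost budget. All reliabilities, costs, the budget, and the 1-to-3 unit range are invented example values; the paper's actual 3ⁿ procedure and multi-objective formulation are not reproduced here.

```python
from itertools import product

# Hypothetical series-parallel system: 3 subsystems in series, each with
# redundant parallel units. Unit reliabilities, unit costs, the budget and
# the 1..3 unit range are assumed values for illustration only.
r = [0.90, 0.85, 0.80]   # unit reliability per subsystem
c = [3.0, 4.0, 2.0]      # unit cost per subsystem
budget = 20.0

def system_reliability(n):
    # Units within a subsystem are redundant (parallel); subsystems are in series.
    rel = 1.0
    for ri, ni in zip(r, n):
        rel *= 1.0 - (1.0 - ri) ** ni
    return rel

best = None
for n in product(range(1, 4), repeat=len(r)):  # exhaustive 3^k enumeration
    cost = sum(ci * ni for ci, ni in zip(c, n))
    if cost <= budget:
        rel = system_reliability(n)
        if best is None or rel > best[1]:
            best = (n, rel)

print(best)  # optimal unit counts and the resulting system reliability
```

With these assumed numbers, the search spends the budget on the least reliable subsystems first, which is the qualitative behavior redundancy allocation methods aim to formalize.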
Procedia PDF Downloads 170
1144 Efficient Field-Oriented Motor Control on Resource-Constrained Microcontrollers for Optimal Performance without Specialized Hardware
Authors: Nishita Jaiswal, Apoorv Mohan Satpute
Abstract:
The increasing demand for efficient, cost-effective motor control systems in the automotive industry has driven the need for advanced, highly optimized control algorithms. Field-Oriented Control (FOC) has established itself as the leading approach for motor control, offering precise and dynamic regulation of torque, speed, and position. However, as energy efficiency becomes more critical in modern applications, implementing FOC on low-power, cost-sensitive microcontrollers poses significant challenges due to the limited availability of computational and hardware resources. Currently, most solutions rely on high-performance 32-bit microcontrollers or Application-Specific Integrated Circuits (ASICs) equipped with Floating Point Units (FPUs) and Hardware Accelerated Units (HAUs). These advanced platforms enable rapid computation and simplify the execution of complex control algorithms like FOC. However, these benefits come at the expense of higher costs, increased power consumption, and added system complexity. These drawbacks limit their suitability for embedded systems with strict power and budget constraints, where achieving energy and execution efficiency without compromising performance is essential. In this paper, we present an alternative approach that utilizes optimized data representation and computation techniques on a 16-bit microcontroller without FPUs or HAUs. By carefully optimizing data formats and employing fixed-point arithmetic, we demonstrate how the precision and computational efficiency required for FOC can be maintained in resource-constrained environments. This approach eliminates the performance overhead associated with floating-point operations and hardware acceleration, providing a more practical solution in terms of cost, scalability and execution-time efficiency, allowing faster response in motor control applications.
Furthermore, it enhances system design flexibility, making it particularly well-suited for applications that demand stringent control over power consumption and costs.
Keywords: field-oriented control, fixed-point arithmetic, floating point unit, hardware accelerator unit, motor control systems
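To make the fixed-point idea concrete, here is a minimal Q15 sketch in Python, standing in for the 16-bit integer math an FPU-less microcontroller would run. The format choice, function names, and the Park-transform-style example are illustrative assumptions, not the paper's implementation.

```python
# Minimal Q15 fixed-point sketch (assumed format: 1 sign bit, 15 fraction
# bits), illustrating how FOC math can avoid an FPU. Names are illustrative.
Q = 15
ONE = 1 << Q  # 32768 represents 1.0

def to_q15(x):
    v = int(round(x * ONE))
    return max(-ONE, min(ONE - 1, v))  # saturate to the int16 range

def q15_mul(a, b):
    # 16x16 -> 32-bit multiply, then shift back down with rounding,
    # as a 16-bit MCU's hardware multiplier would.
    return (a * b + (1 << (Q - 1))) >> Q

def from_q15(v):
    return v / ONE

# Example: scale a measured current by cos(theta), as in one Park-transform
# term; cos values would come from a precomputed lookup table on the MCU.
i_alpha = to_q15(0.5)
cos_t = to_q15(0.866)  # cos(30 degrees)
print(from_q15(q15_mul(i_alpha, cos_t)))  # close to 0.5 * 0.866 = 0.433
```

The rounding term in `q15_mul` is one of the small details that preserve precision over chained operations, which is the crux of keeping FOC stable without floating point.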
Procedia PDF Downloads 13
1143 Air Handling Units Power Consumption Using Generalized Additive Model for Anomaly Detection: A Case Study in a Singapore Campus
Authors: Ju Peng Poh, Jun Yu Charles Lee, Jonathan Chew Hoe Khoo
Abstract:
The emergence of digital twin technology, a digital replica of the physical world, has improved real-time access to data from sensors about the performance of buildings. This digital transformation has opened up many opportunities to improve the management of a building by using the collected data to help monitor consumption patterns and energy leakages. One example is the integration of predictive models for anomaly detection. In this paper, we use the GAM (Generalised Additive Model) for anomaly detection in the power consumption pattern of Air Handling Units (AHU). There is ample research on the use of GAM for the prediction of power consumption at the office building and nation-wide level. However, there is limited illustration of its anomaly detection capabilities, of prescriptive analytics case studies, and of its integration with the latest developments in digital twin technology. In this paper, we applied the general GAM modelling framework to the historical data of AHU power consumption and cooling load of a building on an education campus in Singapore between Jan 2018 and Aug 2019 to train prediction models that, in turn, yield predicted values and ranges. The historical data are seamlessly extracted from the digital twin for modelling purposes. We enhanced the utility of the GAM model by using it to power a real-time anomaly detection system based on the forward predicted ranges. The magnitude of deviation from the upper and lower bounds of the uncertainty intervals is used to inform and identify anomalous data points, all based on historical data, without explicit intervention from domain experts. Notwithstanding, the domain expert fits in through an optional feedback loop through which iterative data cleansing is performed. After an anomalously high or low level of power consumption is detected, a set of rule-based conditions is evaluated in real time to help determine the next course of action for the facilities manager.
The performance of GAM is then compared with other approaches to evaluate its effectiveness. Lastly, we discuss the successful deployment of this approach for the detection of anomalous power consumption patterns and illustrate it with real-world use cases.
Keywords: anomaly detection, digital twin, generalised additive model, GAM, power consumption, supervised learning
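As a rough, standard-library-only illustration of interval-based flagging in the spirit of the approach above, the sketch below fits a trivial baseline to historical readings and flags new points that fall outside a residual-derived band. A moving average stands in for the GAM's smooth terms purely for illustration; all numbers are invented.

```python
import statistics

# Simplified stdlib-only sketch of interval-based anomaly flagging: fit a
# baseline to historical power readings, derive an uncertainty band from
# the residual spread, and flag points outside the band. (The paper fits a
# GAM; a windowed mean stands in here purely for illustration.)
def flag_anomalies(history, new_points, window=24, k=3.0):
    baseline = statistics.fmean(history[-window:])
    resid_sd = statistics.pstdev(history[-window:])
    lower, upper = baseline - k * resid_sd, baseline + k * resid_sd
    # The magnitude of deviation from the band can inform alert severity.
    return [(x, x < lower or x > upper) for x in new_points]

history = [50.0, 52.0, 49.0, 51.0, 50.5, 48.5, 51.5, 50.0] * 3  # kW readings
print(flag_anomalies(history, [50.8, 75.0, 12.0]))
```

A real GAM yields per-timestamp bands that follow daily and seasonal shape rather than one flat band, which is why it suits AHU load profiles.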
Procedia PDF Downloads 152
1142 Recommendations for Teaching Word Formation for Students of Linguistics Using Computer Terminology as an Example
Authors: Svetlana Kostrubina, Anastasia Prokopeva
Abstract:
This research presents a comprehensive study of word formation processes in computer terminology in the English and Russian languages and provides learners with a system of exercises for training these skills. The originality of the study lies in its comparative approach, which shows both general patterns and specific features of English and Russian computer term formation. The key contribution is the development of a system of exercises for training computer terminology based on Bloom's taxonomy. The data contain 486 units (228 English terms from the Glossary of Computer Terms and 258 Russian terms from the Terminological Dictionary-Reference Book). The objective is to identify the main affixation models in English and Russian computer term formation and to develop exercises. To achieve this goal, the authors employed Bloom's taxonomy as a methodological framework to create a systematic exercise program aimed at enhancing students' cognitive skills in analyzing, applying, and evaluating computer terms. The exercises are appropriate for various levels of learning, from basic recall of definitions to higher-order thinking skills, such as synthesizing new terms and critically assessing their usage in different contexts. The methodology also includes: a method of scientific and theoretical analysis for the systematization of linguistic concepts and the clarification of the conceptual and terminological apparatus; a method of nominative and derivative analysis for identifying word-formation types; a method of word-formation analysis for organizing linguistic units; a classification method for determining the structural types of abbreviations applicable to the field of computer communication; a quantitative analysis technique for determining the productivity of methods of forming abbreviations of computer vocabulary in English and Russian computer terms; and a technique of tabular data processing for the visual presentation of the results obtained.
The methodology further includes a technique of interlingual comparison for identifying common and differing features of abbreviations of computer terms in the Russian and English languages. The research shows that affixation retains its productivity in English and Russian computer term formation. Bloom's taxonomy allows us to plan a training program and predict the effectiveness of the compiled program based on an assessment of the teaching methods used.
Keywords: word formation, affixation, computer terms, Bloom's taxonomy
Procedia PDF Downloads 9
1141 Specific Language Impairment in Kannada: Evidence from a Morphologically Complex Language
Authors: Shivani Tiwari, Prathibha Karanth, B. Rajashekhar
Abstract:
Impairments of syntactic morphology are often considered central in children with Specific Language Impairment (SLI). In English and related languages, deficits of tense-related grammatical morphology can serve as a clinical marker of SLI. Yet, cross-linguistic studies of SLI in the recent past suggest that the nature and severity of morphosyntactic deficits in children with SLI vary with the language being investigated. Therefore, in the present study we investigated the morphosyntactic deficits in a group of children with SLI who speak Kannada, a morphologically complex Dravidian language spoken in the Indian subcontinent. A group of 15 children with SLI participated in this study. Two further groups of typically developing children (15 each), matched for language and age to the children with SLI, were included as control participants. All participants were assessed for morphosyntactic comprehension and expression using a standardized language test and a spontaneous speech task. The results showed that children with SLI differed significantly from the age-matched but not the language-matched control group on tasks of both comprehension and expression of morphosyntax. This finding is, however, in contrast with reports on English-speaking children with SLI, who are reported to be poorer than younger MLU-matched children on tasks of morphosyntax. The observed difference between the morphosyntactic impairments of Kannada-speaking and English-speaking children with SLI is explained on the basis of the morphological richness theory. The theory predicts that children with SLI perform relatively better in a morphologically rich language due to the frequent and consistent occurrence of its morphological markers. The authors therefore conclude that language-specific features do influence the manifestation of the disorder in children with SLI.
Keywords: specific language impairment, morphosyntax, Kannada, manifestation
Procedia PDF Downloads 241
1140 Evaluation of Teaching Team Stress Factors in Two Engineering Education Programs
Authors: Kari Bjorn
Abstract:
Team learning has been studied and modeled as a double-loop model and its variations. Metacognition has also been suggested as a concept to describe how team learning is more than a simple sum of the individual learning of the team members. Team learning has a positive correlation with both the individual motivation of its members and the collective factors within the team. Here, the team learning of previously very independent members of two teaching teams is analyzed. Applied science universities are training future professionals with ever more diversified and multidisciplinary skills, and the units of teaching and learning are increasingly large for several reasons. First, multidisciplinary skill development requires more active learning and richer learning environments and experiences, which occurs in student teams. Secondly, teaching multidisciplinary skills requires multidisciplinary, team-based teaching from the teachers as well. Team formation phases have been identified and are widely accepted, and team role stress has been analyzed in project teams, which typically have a well-defined goal and organization. This paper explores the team stress of two teacher teams running two parallel course units in engineering education: the first in industrial automation technology and the second in the development of medical devices. The courses have separate student groups and are on different campuses. Both run in parallel within an eight-week period, and each is taught by a group of four teachers with several years of teaching experience, though previously working individually. The team role stress scale survey is administered to both teaching groups at the beginning and at the end of the course. The inventory of questions covers the factors of ambiguity, conflict, quantitative role overload and qualitative role overload. Some comparison to the study on project teams can be drawn.
The team development stages of the two teaching groups are different. Relating the team role stress factors to the development stage of the group can reveal the potential of management actions to promote team building and help gauge the maturity of functional, well-established teams. Mature teams indicate higher job satisfaction and deliver higher performance. In particular, teaching teams, which deliver the highly intangible results of learning outcomes, are sensitive to issues of job satisfaction and team conflict. Because team teaching is increasing, the paper provides a review of the relevant theories and initial comparative and longitudinal results of the team role stress factors applied to teaching teams.
Keywords: engineering education, stress, team role, team teaching
Procedia PDF Downloads 224
1139 ALEF: An Enhanced Approach to Arabic-English Bilingual Translation
Authors: Abdul Muqsit Abbasi, Ibrahim Chhipa, Asad Anwer, Saad Farooq, Hassan Berry, Sonu Kumar, Sundar Ali, Muhammad Owais Mahmood, Areeb Ur Rehman, Bahram Baloch
Abstract:
Accurate translation between structurally diverse languages, such as Arabic and English, presents a critical challenge in natural language processing due to significant linguistic and cultural differences. This paper investigates the effectiveness of Facebook's mBART model, fine-tuned specifically for sequence-to-sequence (seq2seq) translation tasks between Arabic and English, and enhanced through advanced refinement techniques. Our approach leverages the Alef Dataset, a meticulously curated parallel corpus spanning various domains to capture the linguistic richness, nuances, and contextual accuracy essential for high-quality translation. We further refine the model's output using advanced language models such as GPT-3.5 and GPT-4, which improve fluency and coherence and correct grammatical errors in the translated texts. The fine-tuned model demonstrates substantial improvements, achieving a BLEU score of 38.97, a METEOR score of 58.11, and a TER score of 56.33, surpassing widely used systems such as Google Translate. These results underscore the potential of mBART, combined with refinement strategies, to bridge the translation gap between Arabic and English, providing a reliable, context-aware machine translation solution that is robust across diverse linguistic contexts.
Keywords: natural language processing, machine translation, fine-tuning, Arabic-English translation, transformer models, seq2seq translation, translation evaluation metrics, cross-linguistic communication
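For readers unfamiliar with the BLEU metric reported above, a simplified sentence-level sketch follows. It uses uniform n-gram weights, crude smoothing, and a brevity penalty; real evaluations (including, presumably, the one in this paper) rely on established implementations such as sacreBLEU rather than code like this.

```python
import math
from collections import Counter

# Simplified sentence-level BLEU: modified n-gram precision with uniform
# weights, naive smoothing, and a brevity penalty. For illustration only.
def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    c, r = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand, ref = ngrams(c, n), ngrams(r, n)
        overlap = sum(min(cnt, ref[g]) for g, cnt in cand.items())
        total = max(sum(cand.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)  # crude smoothing
    # Brevity penalty discourages overly short candidates.
    bp = 1.0 if len(c) > len(r) else math.exp(1 - len(r) / max(len(c), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

print(round(bleu("the cat sat on the mat", "the cat sat on the mat"), 3))
```

Note that published BLEU scores are corpus-level, computed over pooled n-gram counts rather than averaged per sentence, so this sketch is not directly comparable to the 38.97 reported above.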
Procedia PDF Downloads 6
1138 Entropy in a Field of Emergence in an Aspect of Linguo-Culture
Authors: Nurvadi Albekov
Abstract:
The communicative situation is the basis that designates the potential models of 'constructed forms', the motivated basis of a text, for a text can be regarded as a product of the communicative situation. It is within the field of emergence that the models of text that can potentially be prognosticated in a certain communicative situation are designated. Every text can be regarded as a conceptual system structured on the basis of a certain communicative situation. However, in the process of structuring a certain model of a conceptual system, the consciousness of a recipient can act only within the borders of the field of emergence, for going beyond this border indicates a misunderstanding of the communicative situation. On the basis of the communicative situation we can witness the increment of meaning, where the synergizing of the informative model of communication, formed by using the invariant units of a language system, is a result of the verbalization of the communicative situation. The potential of the models of a text prognosticated within the field of emergence also depends on the communicative situation. The concept 'field of emergence' is interpreted as a unit of the language system with a poly-directed universal structure, implying the presence of a core, a center and a periphery, and including different levels of the means of the functioning system of language, both in terms of linguistic resources and in terms of extra-linguistic factors, whose interaction results in the increment of a text. The concept 'field of emergence' is considered the most promising in the analysis of texts: oral, written, printed and electronic. As a unit of the language system, the field of emergence has several properties that support its use in the study of a text at different levels. This work attempts an analysis of entropy in a text in the aspect of the linguo-cultural code, prognosticated within the model of the field of emergence.
The article describes the problem of entropy in the field of emergence caused by the influence of extra-linguistic factors. The increase in entropy is caused not only by the intrusion of language resources but also by the influence of an alien culture as a whole, and by the appearance in the field of emergence of symbols that are not typical of the given culture. The borrowing of alien linguo-cultural symbols into the linguo-culture of the author increases entropy when constructing a text, at both the semantic and the structural level; it amounts to an artificial formatting of lexical units that violates the stylistic unity of a phrase. It is noted that one important characteristic reducing entropy in the field of emergence is the typical similarity of the lexical and semantic resources of different linguo-cultures in terms of extra-linguistic factors.
Keywords: communicative situation, field of emergence, lingua-culture, entropy
Procedia PDF Downloads 360
1137 The Effects of Integrating Knowledge Management and e-Learning: Productive Work and Learning Coverage
Authors: Ashraf Ibrahim Awad
Abstract:
It is important to formulate suitable learning environments capable of being customized according to the value perceptions of the university. In this paper, light is shed on the concepts of integration between knowledge management (KM) and e-learning (EL) in the higher education sector in Abu Dhabi Emirate, United Arab Emirates (UAE). A discussion of how KM and EL can be integrated and leveraged for effective education and training is presented. The results are derived from the literature and from interviews with 16 academics in eight universities in the Emirate. The conclusion is that KM and EL have much to offer each other, but this is not yet reflected at the implementation level, and their boundaries are not always clear. The interviews have shown that the two concepts are perceived to be closely related, yet responsibilities for these initiatives are held by different departments or units.
Keywords: knowledge management, e-learning, learning integration, universities, UAE
Procedia PDF Downloads 505
1136 Research and Design of Functional Mixed Community: A Model Based on the Construction of New Districts in China
Authors: Wu Chao
Abstract:
The urban design of new districts in China differs at the city-planning level from that of existing cities, including Beijing, Shanghai, and Guangzhou, and the urban problems of these super-cities are shared by many big cities around the world. The goal of new district construction planning is to enable people to live comfortably, to improve the well-being of residents, and to create a way of life different from that of other urban communities. To avoid the emergence of a super community, the idea of 'decentralization' is taken as the overall planning idea, and the function and form of each community are set up with a homogeneous allocation of resources so that the community can grow naturally. Similar to the growth of vines in nature, the community groups are independent and connected through roads, with clear community boundaries that limit their unlimited expansion. Taking a community of 20,000 people as a case, the community is a mixture of living, production, office, entertainment, and other functions. Building on the development of the Internet, the aim is to create more space for public use and to use data to allocate resources in real time; this kind of shared space is the main activity space in the community. At the same time, the transformation of spatial functions can be determined from usage feedback on the various existing spaces, so the use of space can be changed in response to changing data. The residential unit is taken as the basic functional building mass, with the lower three to four floors of each building serving as the main flexible space, hosting functions such as entertainment, services, and offices; the upper floors are living space, with a small amount of indoor and outdoor activity space also used as shared space.
The transformable space of the bottom layers is evenly distributed and, combined with the walking space connecting the community, forms a service and entertainment network across the whole community that can be used in most of the community space. With the basic residential unit as a replicable module, the design of the other residential units follows the ideas of decentralization and the vine community, and the various units are combined appropriately. At the same time, a small number of office buildings are added to meet special office needs. The new functional mixed community can address many problems of the present city in future construction and, at the same time, can maintain its vitality through the adjustment function of the Internet.
Keywords: decentralization, mixed functional community, shared space, spatial usage data
Procedia PDF Downloads 122
1135 Housing Recovery in Heavily Damaged Communities in New Jersey after Hurricane Sandy
Authors: Chenyi Ma
Abstract:
Background: The second costliest hurricane in U.S. history, Sandy landed in southern New Jersey on October 29, 2012, and struck the entire state with high winds and torrential rains. The disaster killed more than 100 people, left more than 8.5 million households without power, and damaged or destroyed more than 200,000 homes across the state. Immediately after the disaster, public policy support was provided in nine coastal counties that constituted 98% of the major and severely damaged housing units in NJ overall. The programs include the Individuals and Households Assistance Program, the Small Business Loan Program, the National Flood Insurance Program, and the Federal Emergency Management Administration (FEMA) Public Assistance Grant Program. In the most severely affected counties, additional funding was provided through the Community Development Block Grant: Reconstruction, Rehabilitation, Elevation, and Mitigation Program, and the Homeowner Resettlement Program. How these policies, individually and as a whole, impacted housing recovery across communities with different socioeconomic and demographic profiles has not yet been studied, particularly in relation to damage levels. The concept of community social vulnerability has been widely used to explain many aspects of natural disasters. Nevertheless, how communities are vulnerable has been less fully examined. Community resilience has been conceptualized as a protective factor against negative impacts from disasters; however, how community resilience buffers the effects of vulnerability is not yet known. Because housing recovery is a dynamic social and economic process that varies according to context, this study examined the path from community vulnerability and resilience to housing recovery, looking at both community characteristics and policy interventions.
Sample/Methods: This retrospective longitudinal case study compared a literature-identified set of pre-disaster community characteristics, the effects of multiple public policy programs, and a set of time-variant community resilience indicators to changes in housing stock (operationally defined as the percent of building permits to total occupied housing units/households) between 2010 and 2014, two years before and after Hurricane Sandy. The sample consisted of 51 municipalities in the nine counties in which between 4% and 58% of housing units suffered either major or severe damage. Structural equation modeling (SEM) was used to determine the path from vulnerability to housing recovery, via the multiple public programs, separately and as a whole, and via the community resilience indicators. The spatial analytical tool ArcGIS 10.2 was used to show the spatial relations between housing recovery patterns and community vulnerability and resilience. Findings: Holding damage levels constant, communities with higher proportions of Hispanic households had significantly lower levels of housing recovery, while communities with higher proportions of households including an adult over age 65 had significantly higher levels of housing recovery. The contrast was partly due to the different levels of total public support the two types of community received. Further, while the public policy programs individually mediated the negative associations between African American and female-headed households and housing recovery, communities with larger proportions of African American, female-headed and Hispanic households were "vulnerable" to lower levels of housing recovery because they lacked sufficient public program support. Even so, higher employment rates and incomes buffered vulnerability to lower housing recovery.
Because housing is the "wobbly pillar" of the welfare state, the housing needs of these particular groups should be more fully addressed by disaster policy.
Keywords: community social vulnerability, community resilience, hurricane, public policy
Procedia PDF Downloads 372
1134 Optimization of Radiation Therapy with a Nanotechnology Based Enzymatic Therapy
Authors: R. D. Esposito, V. M. Barberá, P. García Morales, P. Dorado Rodríguez, J. Sanz, M. Fuentes, D. Planes Meseguer, M. Saceda, L. Fernández Fornos, M. P. Ventero
Abstract:
Results obtained by our group on glioblastoma multiforme (GBM) primary cultures show a dramatic potentiation of radiation effects when 2 units/ml of D-amino acid oxidase (DAO) enzyme are added, free or immobilized in magnetic nanoparticles, to irradiated samples just after irradiation. Cell cultures were exposed to radiation doses of 7 Gy and 15 Gy of 6 MV photons from a clinical linear accelerator. At both doses, we observed a clear enhancement of radiation-induced damage due to the addition of DAO.
Keywords: D-amino acid oxidase (DAO) enzyme, magnetic particles, nanotechnology, radiation therapy enhancement
Procedia PDF Downloads 520
1133 The Study of Formal and Semantic Errors of Lexis by Persian EFL Learners
Authors: Mohammad J. Rezai, Fereshteh Davarpanah
Abstract:
Producing a text in a language which is not one's mother tongue can be a demanding task for language learners. Examining the lexical errors committed by EFL learners is a challenging area of investigation which can shed light on the process of second language acquisition. Despite the considerable number of investigations into grammatical errors, few studies have tackled the formal and semantic errors of lexis committed by EFL learners. The current study aimed at examining Persian learners' formal and semantic errors of lexis in English. To this end, 60 students at three different proficiency levels were asked to write on 10 different topics in 10 separate sessions. In all, 600 essays written by Persian EFL learners were collected, constituting the corpus of the study. An error taxonomy comprising formal and semantic errors was selected to analyze the corpus. The formal category covered misselection and misformation errors, while the semantic errors were classified into lexical, collocational and lexicogrammatical categories. Each category was further classified into subcategories depending on the identified errors. The results showed that there were 2,583 errors in the corpus of 9,600 words, among which 2,030 formal errors and 553 semantic errors were identified. Formal errors were the most frequent in the corpus (78.6%) and were more prevalent at the advanced level (42.4%). The semantic errors (21.4%) were more frequent at the low intermediate level (40.5%). Among the formal errors of lexis, the highest number was devoted to misformation errors (98%), while misselection errors constituted 2% of the errors. Additionally, no significant differences were observed among the three semantic error subcategories, namely collocational, lexical choice and lexicogrammatical.
The results of the study can shed light on the challenges faced by EFL learners in the second language acquisition process.
Keywords: collocational errors, lexical errors, Persian EFL learners, semantic errors
Procedia PDF Downloads 140
1132 Teaching–Learning-Based Optimization: An Efficient Method for Chinese as a Second Language
Authors: Qi Wang
Abstract:
In the classroom, teachers are trained to complete the target task within the limited lecture time, while learners need to take in a great deal of new knowledge; however, most of the time the learners come without the proper pre-class preparation needed to efficiently absorb the content taught in class. Under these circumstances, teachers have no time to check whether the learners fully understand the content, or how the learners communicate in different contexts, until the learners are tested and the results are seen. In the past decade, the teaching of Chinese has shifted: teaching now focuses less on the use of proper grammatical terms and punctuation and places a heavier emphasis on materials from real-life contexts. As a result, it has become a greater challenge for teachers, as this requires them to fully understand and prepare what they teach and to explain the content in simple, understandable words. The same challenge also applies to the learners, who come from different countries, as they have to use what they have learnt, based on their personal understanding of the material, to communicate effectively with others in the classroom, and even in day-to-day communication. To reach this win-win stage, Feynman's Technique plays a very important role. This practical report presents how Feynman's Technique has been applied to Chinese courses, both written and oral, to motivate learners to practice more writing, reading and speaking over the past few years. Part 1 analyzes different teaching styles and different types of learners to find the most efficient approach for both teachers and learners. Part 2 shows, based on the theory of Feynman's Technique, how to let learners build knowledge from knowing the name of something to truly knowing something, via differently designed target tasks. Part 3.
The outcomes show that Feynman's Technique is the interaction of learning style and teaching style, a double-edged sword for teaching and learning Chinese as a second language.
Keywords: Chinese, Feynman's technique, learners, teachers
Procedia PDF Downloads 153
1131 CFD Modeling of Stripper Ash Cooler of Circulating Fluidized Bed
Authors: Ravi Inder Singh
Abstract:
Due to their high heat transfer rate, high carbon utilization efficiency, fuel flexibility, and other advantages, numerous circulating fluidized bed boilers have been built in India in the last decade. Companies such as BHEL, ISGEC, Thermax, Cethar Limited, and Enmas GB Power Systems Projects Limited manufacture CFBC boilers and install the units throughout India. Because of their complexity, many problems exist in CFBC units, and only a few have been reported. Agglomeration, i.e., clinker formation in the riser, loop seal leg, and stripper ash coolers, is one problem the industry is facing, and proper documentation is rarely found in the literature. Circulating fluidized bed (CFB) boiler bottom ash contains a large amount of physical heat. When the boiler burns a low-calorie fuel, the ash content is normally more than 40%, and the physical heat loss is approximately 3% if the bottom ash is discharged without cooling. In addition, red-hot bottom ash is unsuitable for mechanized handling and transportation, as the upper temperature limit of ash handling machinery is 200 °C. Therefore, a bottom ash cooler (BAC) is often used to treat the high-temperature bottom ash, reclaiming its heat and making the ash easy to handle and transport. As a key auxiliary device of CFB boilers, the BAC has a direct influence on the secure and economic operation of the boiler. With the continuous development and improvement of the CFB boiler, many kinds of BACs have been equipped on large-scale CFB boilers, such as the water-cooled ash cooling screw, the rolling-cylinder ash cooler (RAC), and the fluidized bed ash cooler (FBAC). In this study, a prototype of a novel stripper ash cooler is studied. The circulating fluidized bed ash cooler (CFBAC) combines the major technical features of the spouted bed and the bubbling bed and can achieve selective discharge of the bottom ash. The novel stripper ash cooler is a bubbling bed, studied on a visible cold test rig.
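The physical heat loss figure quoted above can be checked with a simple sensible-heat balance. The following sketch is illustrative only: the specific heat, discharge temperature, and fuel heating value are assumed values chosen to be plausible for a high-ash, low-calorie fuel, not parameters taken from the study.

```python
def bottom_ash_heat_loss_fraction(ash_fraction, lhv_fuel,
                                  cp_ash=0.84, t_ash=850.0, t_ref=25.0):
    """Fraction of fuel heat input carried away as sensible heat in bottom ash.

    ash_fraction : kg of ash per kg of fuel
    lhv_fuel     : lower heating value of the fuel, kJ/kg (assumed)
    cp_ash       : specific heat of ash, kJ/(kg K)        (assumed)
    t_ash, t_ref : ash discharge and reference temperatures, deg C (assumed)
    """
    q_ash = ash_fraction * cp_ash * (t_ash - t_ref)  # kJ per kg of fuel
    return q_ash / lhv_fuel

# A low-calorie fuel with ash content just above 40%, as in the abstract:
frac = bottom_ash_heat_loss_fraction(ash_fraction=0.42, lhv_fuel=12000.0)
print(f"{frac:.1%}")  # → 2.4%
```

With these assumed values the loss comes out in the low percent range, consistent with the roughly 3% figure cited for uncooled bottom ash discharge.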
Cold testing was chosen because high temperatures are difficult to create and maintain at the laboratory scale. The aim of the study is to determine the flow pattern inside the stripper ash cooler. The cold-rig prototype is similar to the stripper ash cooler used in industry and was made by scaling down some of its parameters. The performance of the fluidized bed ash cooler is studied using a cold experiment bench. The air flow rate, the particle size of the solids, and the air distributor type, considered the key operating parameters of a fluidized bed ash cooler (FBAC), are studied in this work.
Keywords: CFD, Eulerian-Eulerian, Eulerian-Lagrangian model, parallel simulations