Search results for: mapping methodologies
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2091

1731 Multi-Temporal Mapping of Built-up Areas Using Daytime and Nighttime Satellite Images Based on Google Earth Engine Platform

Authors: S. Hutasavi, D. Chen

Abstract:

The built-up area is a significant proxy for regional economic growth and reflects the Gross Provincial Product (GPP). However, an up-to-date and reliable database of built-up areas is not always available, especially in developing countries. Cloud-based geospatial analysis platforms such as Google Earth Engine (GEE) provide the accessibility and computational power for those countries to generate built-up data. Therefore, this study aims to extract the built-up areas in the Eastern Economic Corridor (EEC), Thailand, using daytime and nighttime satellite imagery on the GEE platform. Normalized indices were generated from the Landsat 8 surface reflectance dataset, including the Normalized Difference Built-up Index (NDBI), Built-up Index (BUI), and Modified Built-up Index (MBUI). These indices were applied to identify built-up areas in the EEC. The results show that MBUI performs better than BUI and NDBI, with the highest accuracy of 0.85 and a Kappa of 0.82. Moreover, after incorporating nighttime light data from the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB), the overall classification accuracy improved from 79% to 90%, and the error in total built-up area decreased from 29% to 0.7%. The results suggest that MBUI combined with nighttime light imagery is appropriate for built-up area extraction and can be utilized for further study of the socioeconomic impacts of regional development policy over the EEC region.
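The built-up indices above are simple band ratios; for instance, NDBI is the normalized difference of the SWIR and NIR surface-reflectance bands. A minimal NumPy sketch (not the authors' GEE code; the reflectance values and threshold below are illustrative assumptions):

```python
import numpy as np

# Normalized Difference Built-up Index, computed per pixel:
# NDBI = (SWIR - NIR) / (SWIR + NIR). Built-up pixels tend to have
# positive NDBI; the zero threshold below is illustrative only.
def ndbi(swir, nir):
    swir = np.asarray(swir, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (swir - nir) / (swir + nir)

# Toy 2x2 reflectance values (hypothetical, for illustration only)
swir = np.array([[0.30, 0.10], [0.25, 0.05]])
nir = np.array([[0.20, 0.30], [0.15, 0.40]])
built_up_mask = ndbi(swir, nir) > 0  # crude built-up classification
```

On the GEE platform the same computation is typically done server-side on the Landsat 8 surface reflectance bands rather than on local arrays.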

Keywords: built-up area extraction, google earth engine, adaptive thresholding method, rapid mapping

Procedia PDF Downloads 103
1730 Mapping Thermal Properties Using Resistivity, Lithology and Thermal Conductivity Measurements

Authors: Riccardo Pasquali, Keith Harlin, Mark Muller

Abstract:

The ShallowTherm project is focussed on developing and applying a methodology for extrapolating relatively sparsely sampled thermal conductivity measurements across Ireland using mapped Litho-Electrical (LE) units. The primary data used consist of electrical resistivities derived from the Geological Survey Ireland Tellus airborne electromagnetic dataset, GIS-based maps of Irish geology, and rock thermal conductivities derived from both the current Irish Ground Thermal Properties (IGTP) database and a new programme of sampling and laboratory measurement. The workflow has been developed across three case-study areas that sample a range of different calcareous, arenaceous, argillaceous, and volcanic lithologies. Statistical analysis of resistivity data from individual geological formations has been assessed and integrated with detailed lithological descriptions to define distinct LE units. Thermal conductivity measurements from core and hand samples have been acquired for every geological formation within each study area. The variability and consistency of thermal conductivity measurements within each LE unit is examined with the aim of defining a characteristic thermal conductivity (or range of thermal conductivities) for each LE unit. Mapping of LE units, coupled with characteristic thermal conductivities, provides a method of defining thermal conductivity properties at a regional scale and facilitating the design of ground source heat pump closed-loop collectors.

Keywords: thermal conductivity, ground source heat pumps, resistivity, heat exchange, shallow geothermal, Ireland

Procedia PDF Downloads 154
1729 Uncovering Hidden Bugs: An Exploratory Approach

Authors: Sagar Jitendra Mahendrakar

Abstract:

Exploratory testing is a dynamic and adaptable method of software quality assurance, frequently praised for its ability to find hidden flaws and improve overall product quality. Instead of using preset test cases, exploratory testing allows testers to explore the software application dynamically. This contrasts with scripted testing methodologies and relies primarily on tester intuition, creativity, and adaptability. Several tools and techniques can aid testers in the exploratory testing process, and these are discussed in this talk. Such tests are able to find bugs that are harder to surface during structured testing or that other testing methods may have overlooked. The purpose of this abstract is to examine the nature and importance of exploratory testing in modern software development practice. It explores the fundamental ideas of exploratory testing, highlighting the value of domain knowledge and tester experience in spotting problems that may escape the notice of traditional testing methodologies. Throughout the software development lifecycle, exploratory testing promotes quick feedback loops and continuous improvement by allowing testers to make decisions in real time based on their observations. This abstract also clarifies the distinctive features of exploratory testing, such as its non-linearity and its capacity to replicate user behavior in real-world settings. Through impromptu exploration, testers can find intricate bugs, usability problems, and edge cases in software that might otherwise go undetected. Exploratory testing's flexible and iterative structure fits well with agile and DevOps processes, allowing a quicker time to market without sacrificing the quality of the final product.

Keywords: exploratory, testing, automation, quality

Procedia PDF Downloads 16
1728 The Teaching and Learning Process and Information and Communication Technologies from the Remote Perspective

Authors: Rosiris Maturo Domingues, Patricia Luissa Masmo, Cibele Cavalheiro Neves, Juliana Dalla Martha Rodriguez

Abstract:

This article reports the experience of the pedagogical consultants responsible for curriculum development of Senac São Paulo courses when facing the emergency need to maintain the pedagogical process in their schools during the Covid-19 pandemic. The urgent adjustment to distance education resulted in the improvement of the process and the adoption of new teaching and learning strategies mediated by technologies. The processes for preparing and providing guidelines for professional education courses were also readjusted. Thus, a bank of teaching-learning strategies linked to digital resources was developed, categorized, and identified by didactic-pedagogical potential, with didactic planning grounded in learning objectives based on Bloom's (revised) taxonomy, given its convergence with the competency approach adopted by Senac. Methodologically, a relationship was established between connectivity, digital networks, and digital evolution in school environments, culminating in new paradigms and processes of educational communication and new trends in teaching and learning. As a result, teachers adopted digital tools in their practices, transposing face-to-face classroom methodologies to online media; the main criticism was the instrumental use of ICTs, which reduced methodologies and practices to merely transmissive teaching. There was also recognition of technology as a genuine, rather than palliative, facilitator of the educational process, and of the development of a web curriculum now fully carried out in contexts of ubiquity.

Keywords: technologies, education, teaching-learning strategies, Bloom taxonomy

Procedia PDF Downloads 58
1727 Applications of Space Technology in Flood Risk Mapping in Parts of Haryana State, India

Authors: B. S. Chaudhary

Abstract:

The severity and frequency of different disasters across the globe are increasing in recent years. India also faces disasters in the form of drought, cyclones, earthquakes, landslides, and floods. One of the major causes of disasters in northern India is flood. There are great losses and extensive damage to agricultural crops, property, and human and animal life, causing environmental imbalances in places. The annual global figure for losses due to floods runs to over 2 billion dollars. India is a vast country with wide variations in climate and topography. Due to widespread and heavy rainfall during the monsoon months, floods of varying magnitude occur all over the country from June to September. The magnitude depends upon the intensity of rainfall, its duration, and the ground conditions at the time of rainfall. Haryana, one of the agriculturally dominated northern states, also suffers from a number of disasters such as floods, desertification, soil erosion, and land degradation. Earthquakes also occur frequently but are of small magnitude, so they do not cause much concern or damage. Most of the damage in Haryana is due to floods, which have occurred in 1978, 1988, 1993, 1995, 1998, and 2010, to mention a few. The present paper deals with Remote Sensing and GIS applications in preparing flood risk maps in parts of Haryana State, India. Satellite data from various years have been used for mapping flood-affected areas. The flooded areas were interpreted both visually and digitally, and two classes (flooded, and receded water/wet areas) were identified for each year. These were analyzed in a GIS environment to prepare the risk maps, which show areas of high, moderate, and low risk depending on the frequency of flooding observed.
The floods leave a trail of suffering in the form of unhygienic conditions due to improper sanitation, waterlogging, filth littered in the area, degradation of materials, and unsafe drinking water, making people prone to many types of diseases in the short and long run. Attempts have also been made to enumerate the causes of the floods, and suggestions are given for mitigating their fury and for management issues related to evacuation and nearby safe places.
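The risk classes follow from how often each cell appears flooded across the yearly maps. A hypothetical NumPy sketch of this frequency-based overlay (the class breaks are assumptions, not taken from the paper):

```python
import numpy as np

# Frequency-based flood risk classification: stack one binary
# flooded/not-flooded raster per year, count how often each cell was
# flooded, and bin the counts into low/moderate/high risk classes.
# The class breaks below are illustrative assumptions.
def flood_risk(yearly_masks):
    freq = np.sum(yearly_masks, axis=0)  # times each cell was flooded
    risk = np.zeros_like(freq)           # 0 = no observed risk
    risk[freq >= 1] = 1                  # low
    risk[freq >= 3] = 2                  # moderate
    risk[freq >= 5] = 3                  # high
    return risk

# Six toy yearly masks for a 1x3 strip of cells
years = np.array([[[1, 1, 0]], [[1, 1, 0]], [[1, 0, 0]],
                  [[1, 0, 0]], [[1, 0, 0]], [[1, 0, 0]]])
risk_map = flood_risk(years)  # flooded 6x -> high, 2x -> low, 0x -> none
```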

Keywords: flood mapping, GIS, Haryana, India, remote sensing, space technology

Procedia PDF Downloads 189
1726 Applying Lean Six Sigma in an Emergency Department, of a Private Hospital

Authors: Sarah Al-Lumai, Fatima Al-Attar, Nour Jamal, Badria Al-Dabbous, Manal Abdulla

Abstract:

Today, many commonly used industrial engineering tools and techniques are applied in hospitals around the world with the goal of producing a more efficient and effective healthcare system. Lean Six Sigma, a common quality improvement methodology, has been successful in manufacturing industries and, more recently, in healthcare. The objective of our project is to use the Lean Six Sigma methodology to reduce waiting time in the Emergency Department (ED) of a local private hospital. A comprehensive literature review was conducted to evaluate the success of Lean Six Sigma in the ED. According to a study conducted at Ibn Sina Hospital in Morocco, the most common problem patients complain about is waiting time. To ensure patient satisfaction, hospitals such as North Shore University Hospital were able to reduce waiting time by up to 37% using Lean Six Sigma. Other hospitals, such as Johns Hopkins Medical Center, used Lean Six Sigma successfully to enhance overall patient flow, which ultimately decreased waiting time. Furthermore, it was found that capacity constraints, such as staff shortages and lack of beds, were among the main reasons behind long waiting times. With the use of Lean Six Sigma and bed management, hospitals like Memorial Hermann Southwest Hospital were able to reduce patient delays. Moreover, to implement Lean Six Sigma successfully in our project, two common methodologies were considered: DMAIC and DMADV. After assessing both, DMAIC was found to be the more suitable approach for our project because it is concerned with improving an already existing process. Despite its many successes, Lean Six Sigma has its limitations, especially in healthcare, but these can be minimized if properly approached.

Keywords: lean six sigma, DMAIC, hospital, methodology

Procedia PDF Downloads 470
1725 Geophysical Mapping of the Groundwater Aquifer System in Gode Area, Northeastern Hosanna, Ethiopia

Authors: Esubalew Yehualaw Melaku

Abstract:

In this study, two basic geophysical methods are applied to map the groundwater aquifer system in the Gode area along the Guder River, northeast of Hosanna town, near the western margin of the Central Main Ethiopian Rift. The main target of the study is to map the potential aquifer zone and investigate the groundwater potential for current and future development of the resource in the Gode area. The geophysical methods employed are Vertical Electrical Sounding (VES) and magnetic survey techniques. Electrical sounding was used to examine and map the depth to the potential aquifer zone of the groundwater and its distribution over the area. The magnetic survey, in turn, was used to delineate contacts between lithologic units and geological structures. The 2D magnetic models and geoelectric sections are used to identify weak zones, which control the groundwater flow and storage system. The geophysical survey comprises twelve VES readings collected using a Schlumberger array along six profile lines, and more than four hundred (400) magnetic readings at about 10 m station intervals along four profiles and 20 m intervals along three random profiles. The study revealed that the potential aquifer in the area lies at depths ranging from 45 m to 92 m. This is the response of the highly weathered/fractured ignimbrite and pumice layer with sandy soil, which is the main water-bearing horizon. Overall, the neighborhoods of four VES points (VES-2, VES-3, VES-10, and VES-11) show good water-bearing zones in the study area.
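For a Schlumberger array, each VES reading converts a measured voltage and injected current into an apparent resistivity via the standard geometric factor. A short Python sketch (the survey parameters in the example are hypothetical, not from this study):

```python
import math

# Apparent resistivity for a Schlumberger array (standard formula):
# K = pi * ((AB/2)^2 - (MN/2)^2) / MN, and rho_a = K * dV / I,
# where AB is the current-electrode spread and MN the potential dipole.
def schlumberger_rho_a(ab_half, mn, delta_v, current):
    k = math.pi * (ab_half**2 - (mn / 2) ** 2) / mn  # geometric factor (m)
    return k * delta_v / current                     # ohm-metres

# Hypothetical sounding reading: AB/2 = 100 m, MN = 10 m, dV = 5 mV, I = 200 mA
rho = schlumberger_rho_a(100.0, 10.0, 0.005, 0.2)
```

Repeating this for increasing AB/2 builds the sounding curve that is then inverted into the layered geoelectric section.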

Keywords: vertical electrical sounding, magnetic survey, aquifer, groundwater potential

Procedia PDF Downloads 97
1724 Medical Imaging Fusion: A Teaching-Learning Simulation Environment

Authors: Cristina Maria Ribeiro Martins Pereira Caridade, Ana Rita Ferreira Morais

Abstract:

The use of computational tools has become essential in the context of interactive learning, especially in engineering education. In the medical industry, teaching medical image processing techniques is a crucial part of training biomedical engineers, given its integrated applications in healthcare facilities and hospitals. The aim of this article is to present a teaching-learning simulation tool, developed in MATLAB with a graphical user interface, for medical image fusion that explores different image fusion methodologies and processes in combination with image pre-processing techniques. The application runs different algorithms and medical fusion techniques in real time, allowing users to view original and fused images, compare processed and original images, adjust parameters, and save images. The proposed tool offers an innovative teaching and learning environment: a dynamic and motivating simulation through which biomedical engineering students acquire knowledge of medical image fusion techniques and the skills necessary for training biomedical engineers. In conclusion, the developed simulation tool provides real-time visualization of the original and fused images and makes it possible to test, evaluate, and advance students' knowledge of medical image fusion. It also facilitates the exploration of medical imaging applications, specifically image fusion, which is critical in the medical industry. Teachers and students can make adjustments and/or create new functions, making the simulation environment adaptable to new techniques and methodologies.

Keywords: image fusion, image processing, teaching-learning simulation tool, biomedical engineering education

Procedia PDF Downloads 94
1723 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data

Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton

Abstract:

The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, its processes must first be digitized: the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing the 54 papers identified in this area, it was noted that 23 originated in Germany. This is unsurprising, as Industry 4.0 is originally a German strategy, supported by strong policy instruments in Germany. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: an unresolved gap between data science experts and manufacturing process experts in industry. Data analytics expertise is of little use unless the manufacturing process information is also utilized.
A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. A gap lies between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test the existence of this gap, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory; only three papers contributed a methodology. This provides further evidence of the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.
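The classification counts reported above can be tallied and expressed as shares of the 54 reviewed papers; a small sketch using the figures from the abstract:

```python
from collections import Counter

# Contribution-type counts as reported in the abstract: framework 12,
# case study 12, theory 11, methodology 3, out of 54 papers reviewed.
contributions = Counter(framework=12, case_study=12, theory=11, methodology=3)
total_papers = 54

# Share of each contribution type, as a percentage of all reviewed papers
share = {k: round(100 * v / total_papers, 1) for k, v in contributions.items()}

# 23 of the 54 papers originated in Germany
german_share = round(100 * 23 / total_papers, 1)
```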

Keywords: analytics, digitization, industry 4.0, manufacturing

Procedia PDF Downloads 85
1722 Cognitive Model of Analogy Based on Operation of the Brain Cells: Glial, Axons and Neurons

Authors: Ozgu Hafizoglu

Abstract:

Analogy is an essential tool of human cognition that enables connecting diffuse and diverse systems with the attributional, deep structural, and causal relations that are essential to learning, to innovation in artificial worlds, and to discovery in science. The Cognitive Model of Analogy (CMA) leads and creates information pattern transfer within and between domains and disciplines in science. This paper demonstrates the CMA as an evolutionary approach to scientific research. The model puts forward the challenges of deep uncertainty about the future, emphasizing the need for flexibility of the system so that the reasoning methodology can adapt to changing conditions. In this paper, the model of analogical reasoning is built on brain cells and their fractal and operational forms within the system itself. Visualization techniques are used to show correspondences. The distinct phases of the problem-solving process are encoding, mapping, inference, and response. The system is shown to be relevant to brain activation in each of these phases, with an emphasis on achieving a better visualization of the brain cells (glial cells, axons, axon terminals, and neurons) relative to the matching conditions of analogical reasoning and relational information. It is found that the encoding, mapping, inference, and response processes in four-term analogical reasoning correspond with the fractal and operational forms of brain cells: glia, axons, and neurons.

Keywords: analogy, analogical reasoning, cognitive model, brain and glia

Procedia PDF Downloads 163
1721 Localization of Frontal and Temporal Speech Areas in Brain Tumor Patients by Their Structural Connections with Probabilistic Tractography

Authors: B. Shukir, H. Woo, P. Barzo, D. Kis

Abstract:

Preoperative brain mapping in tumors involving the speech areas plays an important role in reducing surgical risk. Functional magnetic resonance imaging (fMRI) is the gold-standard method for localizing cortical speech areas preoperatively, but its availability in clinical routine is limited. Diffusion-MRI-based probabilistic tractography is available with head MRI and is used to segment cortical subregions by their structural connectivity. In our study, we used probabilistic tractography to localize the frontal and temporal cortical speech areas. Fifteen patients with left frontal tumors were enrolled in our study. Speech fMRI and diffusion MRI were acquired preoperatively. The standard automated anatomical labelling atlas 3 (AAL3) was used to define 76 left frontal and 118 left temporal potential speech areas. Four types of tractography were run according to the structural connection of these regions to the left arcuate fascicle (FA) to localize the cortical areas with speech function: 1, frontal through FA; 2, frontal with FA; 3, temporal to FA; and 4, temporal with FA connections were determined. Thresholds of 1%, 5%, 10%, and 15% were applied. At each level, the numbers of frontal and temporal regions identified by fMRI and by tractography were defined, and the sensitivity and specificity were calculated. The 1% threshold showed the best results: sensitivity was 61.6±31.4% and 67.15±23.12%, and specificity was 87.2±10.4% and 75.6±11.37%, for frontal and temporal regions, respectively. From our study, we conclude that probabilistic tractography is a reliable preoperative technique for localizing cortical speech areas. However, its results are not sufficient for the neurosurgeon to rely on alone during the operation.
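Sensitivity and specificity here compare tractography-derived regions against the fMRI reference in the usual way. A minimal sketch (the example counts are hypothetical; the paper reports per-patient means with standard deviations):

```python
# Standard definitions, with fMRI-positive regions as the reference:
# sensitivity = TP / (TP + FN), specificity = TN / (TN + FP).
def sensitivity(tp, fn):
    return tp / (tp + fn)  # fraction of fMRI speech regions found by tractography

def specificity(tn, fp):
    return tn / (tn + fp)  # fraction of non-speech regions correctly excluded

# Hypothetical example: of 76 frontal regions, fMRI marks 20 as speech areas;
# tractography recovers 13 of them and wrongly flags 7 of the other 56.
sens = sensitivity(tp=13, fn=7)
spec = specificity(tn=49, fp=7)
```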

Keywords: brain mapping, brain tumor, fMRI, probabilistic tractography

Procedia PDF Downloads 132
1720 Aging-Related Changes in Calf Muscle Function: Implications for Venous Hemodynamics and the Role of External Mechanical Activation

Authors: Bhavatharani S., Boopathy V., Kavin S., Naveethkumar R.

Abstract:

Context: Resistance training with blood flow restriction (BFR) has become increasingly common in clinical rehabilitation due to the substantial benefits observed in augmenting muscle mass and strength using low loads. However, there is great variability in training pressures for clinical populations, as well as in the methods used to estimate them. The aim of this study was to estimate the percentage of maximal BFR that could result from applying different methodologies based on arbitrary or individual occlusion levels, using a cuff width between 9 and 13 cm. Design: A secondary analysis was performed on the combined databases of two previous larger studies using BFR training. Methods: To estimate these percentages, the occlusion values needed to reach complete BFR (100% limb occlusion pressure [LOP]) were estimated by Doppler ultrasound. Seventy-five participants (age 24.32 [4.86] y; weight 78.51 [14.74] kg; height 1.77 [0.09] m) were enrolled in the laboratory study for measuring LOP in the thigh, arm, or calf. Results: When arbitrary values of restriction are applied, a supra-occlusive pressure between 120% and 190% LOP may result. Furthermore, the application of 130% of resting brachial systolic blood pressure creates an occlusive stimulus similar to 100% LOP. Conclusions: Methods using 100 mm Hg or the resting brachial systolic blood pressure could represent the safest application prescriptions, as they resulted in applied pressures between 60% and 80% LOP. One hundred thirty percent of the resting brachial systolic blood pressure could be used to indirectly estimate 100% LOP at cuff widths between 9 and 13 cm. Finally, methodologies that use standard values of 200 and 300 mm Hg far exceed LOP and may carry additional risk during BFR exercise.
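The indirect estimate described in the conclusions can be sketched as a couple of one-line calculations (the systolic blood pressure and target %LOP in the example are hypothetical):

```python
# Indirect limb-occlusion-pressure (LOP) estimate described above:
# 130% of resting brachial systolic blood pressure approximates 100% LOP
# (for cuff widths of 9-13 cm); a training pressure is then prescribed
# as a fraction of that estimate.
def estimated_lop(resting_sbp_mmhg):
    return 1.30 * resting_sbp_mmhg

def training_pressure(resting_sbp_mmhg, percent_lop):
    return estimated_lop(resting_sbp_mmhg) * percent_lop / 100.0

lop = estimated_lop(120)           # estimated 100% LOP for SBP of 120 mm Hg
cuff = training_pressure(120, 60)  # cuff pressure at 60% of estimated LOP
```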

Keywords: lower limb rehabilitation, ESP32, pneumatics for medical, programmed rehabilitation

Procedia PDF Downloads 57
1719 Top Skills That Build Cultures at Organizations

Authors: Priyanka Botny Srinath, Alessandro Suglia, Mel McKendrick

Abstract:

Background: Organizational cultural studies integrate sociology and anthropology, portraying man as a creator of symbols, languages, beliefs, and ideologies; essentially, a creator and manager of meaning. In our research, we leverage analytical measures to discern whether an organization embodies a singular culture or a myriad of subcultures. Fast-forward to 2023, our research thesis focuses on digitally measuring culture, coining it the "Work Culture Quotient." This entails conceptually mapping common experiential patterns to give executives insights into the digital organization journey, aiding them in understanding their current position and identifying future steps. Objectives: Find the new-age skills that help define culture; understand the implications of post-COVID effects; derive a digital framework for measuring skill sets. Method: We conducted two comprehensive Delphi studies to distill essential insights. Delphi 1: Through a thematic analysis of interviews with 20 high-level leaders representing companies across diverse regions (India, Japan, the US, Canada, Morocco, and Uganda), we identified 20 key skills critical for cultivating a robust organizational culture: influence, self-confidence, optimism, empathy, leadership, collaboration and cooperation, developing others, commitment, innovativeness, leveraging diversity, change management, team capabilities, self-control, digital communication, emotional awareness, team bonding, communication, problem solving, adaptability, and trustworthiness. Delphi 2: Subject matter experts were asked to complete a questionnaire derived from the thematic analysis in stage 1 to formalise themes and draw consensus among experts on the most important workplace skills. Results: The thematic analysis identified 20 workplace employee skills, all of which were included in the Delphi round 2 questionnaire.
From the outputs, we analysed the data using R Studio to arrive at agreement and consensus; we also used a sum-of-squares method to compare agreement levels and extract themes at a threshold of 80% agreement. This yielded three themes at over 80% agreement (leadership, collaboration and cooperation, communication) and three further themes at over 60% agreement (commitment, empathy, trustworthiness). From this, we selected five questionnaires to be included in the primary data collection phase; these will be paired with digital footprints to provide a workplace culture quotient. Implications: The findings from these studies bear profound implications for decision-makers, revolutionizing their comprehension of organizational culture. Mapping the digital organization journey calls for innovative methodologies that probe not only external landscapes but also internal cultural dynamics. This holistic approach furnishes decision-makers with a nuanced understanding of their organizational culture and makes visible the pivotal skills for employee growth, enabling informed choices that resonate with the organization's unique cultural fabric. The anticipated outcomes transcend individual cultural measurements, aligning with organizational goals to unveil a comprehensive view of culture, exposing its artifacts and depth. Armed with this understanding, decision-makers gain tangible evidence for informed decision-making, strategically leveraging cultural strengths to cultivate an environment conducive to growth, innovation, and enduring success, ultimately leading to measurable outcomes.

Keywords: leadership, cooperation, collaboration, teamwork, work culture

Procedia PDF Downloads 25
1718 Immigrant Women's Voices and Integrating Feminism into Migration Theory

Authors: Florence Nyemba, Rufaro Chitiyo

Abstract:

This work features the voices of women as they describe their experiences living in the diaspora, either with their families or alone. The contributing authors pursued this project to understand how the women's personal lives (and those of their families back home) changed, both positively and negatively. The work addresses the following questions: What is female migration? What factors cause women to migrate? What types of migration do women engage in? What is the influence of family relationships on migration? What are the challenges of migration? How do migrant women maintain ties with their home countries? What is the role of social networks in migration? How can feminist theories and methodologies be incorporated into migration studies? Women continue to contribute significantly to mass movements of people across the globe, yet their voices remain silent in the literature on migration. History shows that women have always been on the move trying to make a living, just like their male counterparts. Whether they migrate as spouses, daughters, or alone, women make up a sizeable portion of migration statistics around the world, and many migrate independently without the accompaniment of male relatives. This calls for expanded research on women as independent migrants that does not generalize their experiences, as was the case with early studies of international migration. The goal of this work is to offer a rich and detailed description of the lives of immigrant women across the globe, using theoretical frameworks that advance gender and migration research. Methodology: This work invited scholars and researchers from across the globe whose research interests lie in gender and migration. It incorporated a variety of methodologies for data collection and analysis, including oral narratives, interviews, and systematic literature reviews.
Conclusion: There is considerable interest in topics of gender, violence, and equality throughout the social science disciplines in higher education. Therefore, the three major topics covered in this work, Women's Immigration: Theories and Methodologies, Women as Migrant Workers, and Women as Refugees, Asylees, and Permanent Migrants, can be of interest across the social sciences. Feminist theories can expand the curriculum on identity and on gendered roles and norms in societies. The findings of this work advance knowledge of population movements across the globe. This work will also appeal to students and scholars wanting to expand their knowledge of women and migration, migration theories, gender violence, and women's empowerment. The topics and issues presented will also assist the international community and lawyers concerned with global migration.

Keywords: gender, feminism, identity formation, international migration

Procedia PDF Downloads 109
1717 The Utilization of Tea Extract within the Realm of the Food Industry

Authors: Raana Babadi Fathipour

Abstract:

Tea, a beverage widely cherished across the globe, has captured the interest of scholars with its recently acknowledged health advantages, particularly its proven ability to ward off ailments such as cancer and cardiovascular disease. Moreover, within the realm of culinary creations, lipid oxidation poses a significant challenge for food product development. In light of these concerns, the present discourse explores diverse methodologies for extracting polyphenols from various types of tea leaves and examines their utility within the ever-evolving food industry. Based on the discoveries unearthed in this comprehensive investigation, the fundamental constituents of tea are polyphenols possessing intrinsic health-enhancing properties. These include an assortment of catechins, namely epicatechin, epigallocatechin, epicatechin gallate, and epigallocatechin gallate. Moreover, gallic acid, flavonoids, flavonols, and theaflavins have also been detected in this aromatic beverage. Of the myriad components examined rigorously in this study's analysis, catechin emerges as particularly beneficial. Multiple techniques have emerged over time to extract key compounds from tea plants, including solvent-based extraction, microwave-assisted water extraction, and ultrasound-assisted extraction. Particular consideration is given to microwave-assisted water extraction as a viable scheme that effectively procures valuable polyphenols from tea extracts. This methodology appears adaptable for implementation in sectors such as dairy production, along with the meat and oil industries.

Keywords: camellia sinensis, extraction, food application, shelf life, tea

Procedia PDF Downloads 47
1716 Student's Difficulties with Classes That Involve Laboratory Education Approach

Authors: Kayondoamunmose Kamafrika

Abstract:

Experiment-based engineering education plays a vital role in developing students' deep understanding of both the social and physical sciences, and laboratory class activities prepare students to meet national demand for high-tech skilled individuals in the government and private sectors. However, students across the country face difficulties in classes that involve laboratory activities: poor exposure to experimental work early in their education, lack of engagement with practical scientific-method thinking, lack of communication between students and the instructor during class, large numbers of students in one classroom, lack of instruments, and improper equipment calibration. The purpose of this paper is to help students develop their own scientific knowledge and understanding, design experiments, collect and analyze data, write laboratory reports, and present and explain their findings. Experiment-based laboratory activities allow students to learn with high-level understanding and to engage in constructing knowledge through the practical doing of science. Such an approach can act as a catalyst for the development of practice-based educational methodologies across the social sciences, physical sciences, and engineering, converting laboratory classes into pilot industries and students into professional experts in solving complex problems and in the research and development of advanced high-tech systems.

Keywords: experimental, engineering, innovation, practicability

Procedia PDF Downloads 160
1715 Urban Land Use Type Analysis Based on Land Subsidence Areas Using X-Band Satellite Image of Jakarta Metropolitan City, Indonesia

Authors: Ratih Fitria Putri, Josaphat Tetuko Sri Sumantyo, Hiroaki Kuze

Abstract:

Jakarta Metropolitan City is located on the northwest coast of West Java province, between 106º33’00”-107º00’00”E longitude and 5º48’30”-6º24’00”S latitude. The Jakarta urban area has suffered land subsidence across several land use types, including trading, industry, and settlement areas. Land subsidence hazard is one of the consequences of urban development in Jakarta; it is caused by intensive human activities in groundwater extraction and by land use mismanagement. Geologically, the Jakarta urban area is mostly dominated by alluvial fan sediment. The objective of this research is to analyze Jakarta's urban land use types within land subsidence zones. Producing safer land use and settlements in land subsidence areas is very important, and maps of the spatial distribution of land subsidence are a necessary tool for land use management planning. For this purpose, the Differential Synthetic Aperture Radar Interferometry (DInSAR) method is used. DInSAR is complementary to ground-based methods such as leveling and global positioning system (GPS) measurements, yielding information over a wide coverage area even when the area is inaccessible. The results were refined using X-band satellite image data from 2010 to 2013 and land use mapping data. Our analysis shows that land subsidence occurred in the northern part of Jakarta Metropolitan City at rates varying from 7.5 to 17.5 cm/year, predominantly in industrial and settlement land use areas.
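As background for the method, the core DInSAR relation converts an unwrapped interferometric phase difference into line-of-sight displacement, d = -λΔφ/(4π), which can then be projected to the vertical. The following is a minimal sketch, not the authors' processing chain; the X-band wavelength of about 3.11 cm and the 30° incidence angle are illustrative assumptions, not values taken from the paper:

```python
import math

def los_displacement_cm(delta_phi_rad, wavelength_cm=3.11):
    """Convert an unwrapped DInSAR phase difference (radians) into
    line-of-sight displacement (cm): d = -lambda * dphi / (4*pi)."""
    return -wavelength_cm * delta_phi_rad / (4.0 * math.pi)

def vertical_rate_cm_per_year(los_cm, incidence_deg, years):
    """Project a line-of-sight displacement to the vertical, assuming
    pure subsidence, and normalize it to an annual rate."""
    return los_cm / math.cos(math.radians(incidence_deg)) / years

# One full 2*pi phase cycle corresponds to half a wavelength
# (~1.55 cm) of line-of-sight motion for an X-band sensor.
los = los_displacement_cm(2.0 * math.pi)
rate = vertical_rate_cm_per_year(abs(los), incidence_deg=30.0, years=0.1)
```

This half-wavelength sensitivity per phase cycle is why X-band interferometry can resolve the centimetre-scale subsidence rates reported in the abstract.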

Keywords: land use analysis, land subsidence mapping, urban area, X-band satellite image

Procedia PDF Downloads 253
1714 Capacity Building in Dietary Monitoring and Public Health Nutrition in the Eastern Mediterranean Region

Authors: Marisol Warthon-Medina, Jenny Plumb, Ayoub Aljawaldeh, Mark Roe, Ailsa Welch, Maria Glibetic, Paul M. Finglas

Abstract:

Like Western countries, the Eastern Mediterranean Region (EMR) presents major public health issues associated with increased consumption of sugar, fat, and salt. One of the policies of the World Health Organization's (WHO) EMR is therefore to reduce the intake of salt, sugar, and fat (saturated and trans fatty acids) to address the risk of non-communicable diseases (i.e., diabetes, cardiovascular disease, cancer) and obesity. The project objective is to assess status and provide training and capacity development in the use of improved standardized methodologies for updated food composition data, dietary intake methods, and suitable biomarkers of nutritional value, and to determine health outcomes in low- and middle-income countries (LMIC). Training exchanges have been developed with clusters of countries grouped by regional needs, including Sudan, Egypt, and Jordan; Tunisia, Morocco, and Mauritania; and other Middle Eastern countries. This capacity building will lead to the development and sustainability of up-to-date national and regional food composition databases in LMIC for use in the dietary monitoring of food and nutrient intakes. Workshops were organized to provide training and capacity development in the use of improved standardized methodologies for food composition and food intake. Training needs were identified and short-term scientific missions organized for LMIC researchers, including (1) training and knowledge-exchange workshops, (2) short-term exchange of researchers, (3) development and application of protocols, and (4) development of strategies to reduce sugar and fat intake. An initial training workshop in Morocco in 2018 was attended by 25 participants from 10 EMR countries to review status and support the development of regional food composition data. Four training exchanges are in progress.
The use of improved standardized methodologies for food composition and dietary intake will produce robust measurements that will reinforce dietary monitoring and policy in LMIC. The capacity building from this project will lead to the development and sustainability of up-to-date national and regional food composition databases in EMR countries. Supported by the UK Medical Research Council, Global Challenges Research Fund, (MR/R019576/1), and the World Health Organization’s Eastern Mediterranean Region.

Keywords: dietary intake, food composition, low and middle-income countries, status

Procedia PDF Downloads 131
1713 Cognitive Rehabilitation in Schizophrenia: A Review of the Indian Scenario

Authors: Garima Joshi, Pratap Sharan, V. Sreenivas, Nand Kumar, Kameshwar Prasad, Ashima N. Wadhawan

Abstract:

Schizophrenia is a debilitating disorder marked by cognitive impairment, which deleteriously impacts social and professional functioning along with the quality of life of patients and caregivers. Cognitive symptoms are often present in the prodromal state and worsen as the illness progresses; they have proven to be good predictors of the prognosis of the illness. Intensive cognitive rehabilitation (CR) has been shown to lead to improvements in healthy as well as cognitively impaired subjects. As the majority of the population in India falls into the lower to middle socio-economic strata and has low education levels, using existing packages, most of which were developed in the West, for cognitive rehabilitation is difficult. The use of technology is also restricted by high costs and by the limited availability of, and familiarity with, computers and other devices, which impedes continued therapy. Cognitive rehabilitation in India uses a plethora of retraining methods for patients with schizophrenia, targeting the functions of attention, information processing, executive functions, learning and memory, and comprehension, along with social cognition. Psychologists often follow an integrative therapy approach involving social skills training, family therapy, and psychoeducation in order to maintain the gains from cognitive rehabilitation in the long run. This paper reviews the methodologies and cognitive retraining programs used in India. It attempts to elucidate the evolution and development of the methodologies used, from traditional paper-and-pencil retraining to more sophisticated neuroscience-informed techniques, delivered as home-based or as supervised and guided programs for the rehabilitation of cognitive deficits in schizophrenia.

Keywords: schizophrenia, cognitive rehabilitation, neuropsychological interventions, integrated approaches to rehabilitation

Procedia PDF Downloads 343
1712 Stakeholder Mapping and Requirements Identification for Improving Traceability in the Halal Food Supply Chain

Authors: Laila A. H. F. Dashti, Tom Jackson, Andrew West, Lisa Jackson

Abstract:

Traceability systems are being developed and tested in the agri-food and halal food sectors because of their ability to monitor ingredient movements, track sources, and detect potential issues related to food integrity. Designing a traceability system for the halal food supply chain, however, poses significant challenges due to diverse stakeholder requirements and the complexity of their needs (varying food ingredients, different sources, destinations, supplier processes, certifications, etc.). Achieving a halal food traceability solution tailored to stakeholders' requirements necessitates prior knowledge of those needs, yet although attempts have been made to address design-related issues in traceability systems, literature on stakeholder mapping and on identifying requirements specific to halal food supply chains is scarce. To address this gap, a pilot study was conducted to identify the objectives, requirements, and recommendations of stakeholders in the Kuwaiti halal food industry.
The paper presents insights from this pilot study, which collected data through semi-structured interviews with an international halal food manufacturer based in Kuwait, with the aim of gaining an in-depth understanding of stakeholders' objectives, requirements, processes, and concerns pertaining to the design of a traceability system in Kuwait's halal food sector. The stakeholder mapping revealed that government entities, food manufacturers, retailers, and suppliers are the key stakeholders in Kuwait's halal food supply chain. Lessons learned regarding requirement capture for traceability systems include the need to streamline communication, to focus on communication at each level of the supply chain, to leverage innovative technologies to enhance process structuring and operations, and to reduce halal certification costs. The findings also highlighted the limitations of existing traceability solutions, such as limited cooperation and collaboration among stakeholders, the high cost of implementing traceability systems without government support, lack of clarity regarding product routes, and disrupted communication channels between stakeholders. These findings contribute to a broader research program aimed at developing a stakeholder requirements framework that uses business process modelling to establish a unified model for traceable stakeholder requirements.

Keywords: supply chain, traceability system, halal food, stakeholders’ requirements

Procedia PDF Downloads 81
1711 Applying the CA Systems in Education Process

Authors: A. Javorova, M. Matusova, K. Velisek

Abstract:

The article summarizes experience with teaching methodologies for laboratory-based technical subjects using a number of software products. The main aim is to modernize the teaching process in accordance with today's requirements, which are grounded in information technology. The attractiveness and effectiveness of study are increased by introducing CA technologies into the learning process. This paper discusses the areas in which individual CA systems are used; the environments using CA systems are briefly presented in each chapter.

Keywords: education, CA systems, simulation, technology

Procedia PDF Downloads 369
1710 Risk Assessment on New Bio-Composite Materials Made from Water Resource Recovery

Authors: Arianna Nativio, Zoran Kapelan, Jan Peter van der Hoek

Abstract:

Bio-composite materials are becoming increasingly popular in various applications, such as the automotive industry. They are usually made from natural resources recovered from plants, but a new type of bio-composite material has begun to be produced in the Netherlands from resources recovered from drinking water treatment (calcite), wastewater treatment (cellulose), and surface water management (aquatic plants). Because surface water, raw drinking water, and wastewater can be contaminated with pathogens and chemical compounds, it is valuable to develop a framework to assess, monitor, and control the potential risks. The goal is to define the major risks to human health, material quality, and the environment associated with the production and application of these new materials. This study describes the general risk assessment framework, starting with a qualitative risk assessment. The qualitative risk analysis used the HAZOP methodology for the hazard identification phase; HAZOP is logical and structured and able to identify hazards at the first stage of design, when hazards and associated risks are not yet well known. The identified hazards were analyzed to define the potential associated risks, which were then evaluated using qualitative Event Tree Analysis (ETA). ETA is a logical methodology for deriving the consequences of a specific hazardous incident by evaluating the failure modes of safety barriers and the dangerous intermediate events that lead to the final scenario (risk). This paper shows the effectiveness of combining the HAZOP and qualitative ETA methodologies for hazard identification and risk mapping. Key risks were then identified, and a quantitative framework was developed based on the types of risks identified, such as Quantitative Microbial Risk Assessment (QMRA) and Quantitative Chemical Risk Assessment (QCRA).
These two models were applied to assess human health risks due to the presence of pathogens and chemical compounds such as heavy metals in the bio-composite materials. Owing to such contamination, the bio-composite product might, during its application, release toxic substances into the environment, leading to a negative environmental impact. Leaching tests are therefore planned to simulate the application of these materials in the environment and to evaluate the potential leaching of inorganic substances, thereby assessing the environmental risk.
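A qualitative event tree of the kind described above can be enumerated mechanically: each safety barrier either works or fails, and every path through the tree ends in a scenario. The following is a minimal sketch, not the authors' risk model; the initiating event and the two barriers are hypothetical examples invented for illustration:

```python
from itertools import product

def event_tree(initiating_event, barriers):
    """Enumerate all branches of a qualitative event tree: every safety
    barrier either works or fails, and each path yields one scenario."""
    scenarios = []
    for outcome in product(("works", "fails"), repeat=len(barriers)):
        path = list(zip(barriers, outcome))
        failed = sum(1 for _, o in path if o == "fails")
        scenarios.append({"initiator": initiating_event,
                          "path": path,
                          "failed_barriers": failed})
    return scenarios

# Hypothetical incident: pathogen carry-over into recovered cellulose,
# with two made-up safety barriers.
tree = event_tree("pathogen contamination of recovered cellulose",
                  ["wastewater disinfection", "thermal treatment during moulding"])
worst = max(tree, key=lambda s: s["failed_barriers"])
```

With n barriers the tree has 2^n end scenarios; the branch with all barriers failed is the worst-case scenario that a subsequent quantitative step (e.g., QMRA) would examine first.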

Keywords: bio-composite, risk assessment, water reuse, resource recovery

Procedia PDF Downloads 78
1709 3D Human Face Reconstruction in Unstable Conditions

Authors: Xiaoyuan Suo

Abstract:

3D object reconstruction is a broad research area within the computer vision field, involving many stages and still-open problems. One of the existing challenges in this field lies with micromotion, such as facial expressions on the appearance of a human or animal face. Similar literature in this field focuses on 3D reconstruction under stable conditions, such as an existing image or photos taken in a rather static environment, while the purpose of this work is to discuss a flexible scan system using multiple cameras that can correctly reconstruct both stable and moving 3D objects, the human face with expressions in particular. A mathematical model is proposed at the end of this work to automate the 3D object reconstruction process. The reconstruction takes several stages. First, a set of simple 2D lines is projected onto the object, yielding a set of uneven curved lines that represent the 3D numerical data of the surface. The lines and their shapes help to identify the object's 3D construction in pixels. With the two recorded angles and the distance from the cameras, a simple mathematical calculation gives the resulting coordinate of each projected line in an absolute 3D space. This proposed research will benefit many practical areas, including but not limited to biometric identification, authentication, cybersecurity, preservation of cultural heritage, and drama acting, especially performances with rapid and complex facial gestures. Specifically, this work will (I) provide a brief survey of comparable techniques existing in this field, (II) discuss a set of specialized methodologies or algorithms for effective reconstruction of 3D objects, (III) implement and test the developed methodologies, (IV) verify findings with data collected from experiments, and (V) conclude with lessons learned and final thoughts.
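The "simple mathematical calculation" mentioned above is essentially a two-ray triangulation: two cameras a known baseline apart each record the angle their ray makes with the baseline, and the rays intersect at the surface point. A minimal sketch, with a made-up 1 m baseline and symmetric ray angles (not the authors' actual setup):

```python
import math

def triangulate(baseline, alpha_deg, beta_deg):
    """Locate a projected-line point seen by two cameras a known
    baseline apart. alpha/beta are the angles each camera's ray makes
    with the baseline; the two rays intersect at the surface point.
    Geometry: tan(alpha) = y/x and tan(beta) = y/(baseline - x)."""
    ta = math.tan(math.radians(alpha_deg))
    tb = math.tan(math.radians(beta_deg))
    x = baseline * tb / (ta + tb)   # position along the baseline
    y = x * ta                      # depth of the surface point
    return x, y

# Symmetric hypothetical setup: cameras 1 m apart, both rays at 60 degrees.
x, depth = triangulate(1.0, 60.0, 60.0)
```

Repeating this per pixel along each projected curve gives the absolute 3D coordinates of the whole line, which is the per-line output the reconstruction pipeline accumulates.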

Keywords: 3D photogrammetry, 3D object reconstruction, facial expression recognition, facial recognition

Procedia PDF Downloads 126
1708 Climate Indices: A Key Element for Climate Change Adaptation and Ecosystem Forecasting - A Case Study for Alberta, Canada

Authors: Stefan W. Kienzle

Abstract:

The increasing occurrence of extreme weather and climate events has significant impacts on society and causes continued and growing loss of human and animal lives, loss of or damage to property (houses, cars), and associated stress on the public in coping with a changing climate. A climate index breaks a daily climate time series down into meaningful derivatives, such as the annual number of frost days. Climate indices allow the spatially consistent analysis of a wide range of climate-dependent variables, which enables the quantification and mapping of historical and future climate change across regions. As trends in phenomena such as the length of the growing season differ between hydro-climatological regions, mapping needs to be carried out at high spatial resolution, such as the 10 km by 10 km Canadian Climate Grid, which has interpolated daily values from 1950 to 2017 for minimum and maximum temperature and precipitation. Climate indices form the basis for the analysis and comparison of means, extremes, and trends, and for the quantification of changes and their respective confidence levels. A total of 39 temperature indices and 16 precipitation indices were computed for the period 1951 to 2017 for the Province of Alberta. Temperature indices include the annual number of days with temperatures above or below certain thresholds (0, +-10, +-20, +25, +30 ºC), frost days and their timing, freeze-thaw days, growing degree days, and energy demands for air conditioning and heating. Precipitation indices include daily and accumulated 3- and 5-day extremes, days with precipitation, periods of days without precipitation, snow, and potential evapotranspiration. The rank-based nonparametric Mann-Kendall statistical test was used to determine the existence and significance levels of all associated trends, and the slope of the trends was determined using the nonparametric Sen's slope test.
A Google Maps-based interface was developed to create the website albertaclimaterecords.com, from which each of the 55 climate indices can be queried for any of the 6,833 grid cells that make up Alberta. In addition to the climate indices, climate normals were calculated and mapped for four historical 30-year periods and one future period (1951-1980, 1961-1990, 1971-2000, 1981-2017, 2041-2070). While winters have warmed since the 1950s by between 4-5 °C in the south and 6-7 °C in the north, summers show the weakest warming over the same period, at about 0.5-1.5 °C. New agricultural opportunities exist in central regions, where the numbers of heat units and growing degree days are increasing and the number of frost days is decreasing. While the number of days below -20 ºC has roughly halved across Alberta, the growing season has expanded by between two and five weeks since the 1950s. Interestingly, the numbers of days with heat waves and with cold spells have both increased two- to four-fold during the same period. This research demonstrates the enormous potential of using climate indices at the best regional spatial resolution possible to enable society to understand the historical and future climate changes of their region.
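The indices and trend tests named above are straightforward to compute for a single grid cell. The following is an illustrative sketch, not the authors' code, showing a frost-day index, the Mann-Kendall S statistic, and Sen's slope; the annual frost-day series is invented for the example:

```python
def frost_days(daily_tmin):
    """Climate index: number of days with minimum temperature below 0 C."""
    return sum(1 for t in daily_tmin if t < 0.0)

def mann_kendall_s(series):
    """Mann-Kendall S statistic: the sum of sign(x_j - x_i) over all
    pairs i < j. Positive S suggests an upward trend, negative a downward one."""
    n = len(series)
    return sum((series[j] > series[i]) - (series[j] < series[i])
               for i in range(n - 1) for j in range(i + 1, n))

def sens_slope(series):
    """Sen's slope: the median of all pairwise slopes, robust to outliers."""
    slopes = sorted((series[j] - series[i]) / (j - i)
                    for i in range(len(series) - 1)
                    for j in range(i + 1, len(series)))
    m = len(slopes)
    return slopes[m // 2] if m % 2 else 0.5 * (slopes[m // 2 - 1] + slopes[m // 2])

# Hypothetical annual frost-day counts for one warming grid cell.
annual_frost_days = [210, 205, 207, 198, 200, 193, 190, 188]
s = mann_kendall_s(annual_frost_days)
slope = sens_slope(annual_frost_days)
```

In practice the S statistic is normalized and compared against a significance threshold before a trend is reported; the sketch stops at the raw statistic for brevity.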

Keywords: climate change, climate indices, habitat risk, regional, mapping, extremes

Procedia PDF Downloads 72
1707 Impact of Climate Change on Sea Level Rise along the Coastline of Mumbai City, India

Authors: Chakraborty Sudipta, A. R. Kambekar, Sarma Arnab

Abstract:

Sea-level rise is one of the most important impacts of anthropogenically induced climate change resulting from global warming and the melting of ice at the Arctic and Antarctic. This paper reviews the investigations carried out over the last decade by various researchers, both on the Indian coast and elsewhere. It aims to ascertain how consistently the different suggested methods predict near-accurate future sea-level rise along the coast of Mumbai. Case studies on the east coast, the southern tip, and the west and south-west coasts of India have been reviewed. The Coastal Vulnerability Index (CVI) of several important international locations has been compared and found to match Intergovernmental Panel on Climate Change forecasts. The reviewed work applies Geographic Information System mapping and remote sensing, using both Multispectral Scanner and Thematic Mapper data from Landsat classified with the Iterative Self-Organizing Data Analysis Technique to assign high, moderate, and low CVI values to various important coastal cities. Rather than purely data-driven, hindcast-based forecasts of significant wave height, including the additional impact of sea-level rise has been suggested. The efficacy and limitations of numerical methods relative to Artificial Neural Networks are assessed, and the importance of root mean square error in judging numerical results is noted. Among the computerized methods compared, forecasts obtained from MIKE 21 were considered more reliable than those from the Delft3D model.
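The abstract notes the importance of root mean square error when ranking model forecasts. A minimal sketch of such a comparison; the significant-wave-height series below are invented for illustration and are not data from the reviewed studies:

```python
import math

def rmse(predicted, observed):
    """Root mean square error between a forecast series and observations."""
    n = len(predicted)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

# Hypothetical significant-wave-height forecasts (m) from two models
# against the same observations; the lower RMSE indicates the better fit.
observed = [1.2, 1.5, 1.1, 1.8, 2.0]
model_a  = [1.1, 1.6, 1.0, 1.9, 2.1]
model_b  = [1.5, 1.1, 1.4, 1.4, 1.6]
better = "A" if rmse(model_a, observed) < rmse(model_b, observed) else "B"
```

A comparison of this shape, with real hindcast series in place of the toy lists, is how a MIKE 21 versus Delft3D reliability judgement would typically be quantified.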

Keywords: climate change, Coastal Vulnerability Index, global warming, sea level rise

Procedia PDF Downloads 112
1706 On the Existence of Homotopic Mapping Between Knowledge Graphs and Graph Embeddings

Authors: Jude K. Safo

Abstract:

Knowledge Graphs (KG) and their relation to Graph Embeddings (GE) represent a unique data structure in the landscape of machine learning (relative to image, text, and acoustic data). Unlike the latter, GEs are the only data structure sufficient for representing the hierarchically dense, semantic information needed for use-cases like supply chain data and protein folding, where the search space exceeds the limits of traditional search methods (e.g., PageRank, Dijkstra, etc.). While GEs are effective for compressing low-rank tensor data, at scale they begin to introduce a new problem of 'data retrieval', which we also observe in Large Language Models. Notable attempts by TransE, TransR, and other prominent industry standards have shown peak performance just north of 57% on the WN18 and FB15K benchmarks, insufficient for practical industry applications. They are also limited, in scope, to next-node/link predictions. Traditional linear methods like Tucker, CP, PARAFAC, and CANDECOMP quickly hit memory limits on tensors exceeding 6.4 million nodes. This paper outlines a topological framework for linear mapping between concepts in KG space and GE space that preserves cardinality. Most importantly, we introduce a traceable framework for composing dense linguistic structures, and we report the performance this model achieves on the WN18 benchmark. This model does not rely on Large Language Models (LLM), though the applications are certainly relevant there as well.
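For context, the TransE baseline referenced above scores the plausibility of a triple (head, relation, tail) as the distance ||h + r - t|| between embedding vectors, with smaller distances meaning more plausible triples. A minimal sketch with toy three-dimensional embeddings (all values invented for illustration):

```python
def transe_score(head, relation, tail):
    """TransE plausibility score: the L2 norm of h + r - t.
    A perfect triple has h + r == t and therefore a score of zero."""
    return sum((h + r - t) ** 2 for h, r, t in zip(head, relation, tail)) ** 0.5

# Toy embeddings, invented for the example.
h = [0.1, 0.4, -0.2]
r = [0.3, -0.1, 0.5]
t_good = [0.4, 0.3, 0.3]   # equals h + r, so the triple scores ~0
t_bad  = [-0.6, 0.9, 0.0]  # far from h + r, so the triple scores high
```

Link prediction then amounts to ranking every candidate tail by this score, which is exactly the next-node/link-prediction scope the abstract identifies as a limitation of the translation-based family.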

Keywords: representation theory, large language models, graph embeddings, applied algebraic topology, applied knot theory, combinatorics

Procedia PDF Downloads 47
1705 Flood Devastation Assessment Through Mapping in Nigeria-2022 using Geospatial Techniques

Authors: Hafiz Muhammad Tayyab Bhatti, Munazza Usmani

Abstract:

Floods, among nature's most destructive occurrences, do immense damage to communities and cause large economic losses. Nigeria, and southern Nigeria in particular, is known to be prone to flooding. Although periodic flooding occurs frequently in Nigeria, the floods of 2022 were the worst since those of 2012. Flood vulnerability analysis and mapping are still lacking in this region because historical hydrological measurements and surveys of flood effects are very limited, which makes it difficult to develop and implement efficient flood protection measures. Remote sensing and Geographic Information Systems (GIS) are useful approaches for detecting, determining, and estimating flood extent and its impacts. In this study, NOAA VIIRS flood-water fraction data were used to extract the flood extent and were then fused with GIS data for zonal statistical analysis. The estimated flooded areas were validated using satellite imagery from the Moderate Resolution Imaging Spectroradiometer (MODIS). The goal is to map and study flood extent, flood hazards, and their effects on the population, schools, and health facilities for each state of Nigeria. The resulting flood hazard maps clearly show areas with high risk levels and serve as an important reference for planning and implementing future flood mitigation and control strategies. Overall, the study demonstrates the viability of the chosen GIS and remote sensing approaches for detecting at-risk regions, securing local populations, and enhancing disaster response capabilities during natural disasters.
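The zonal statistical analysis described above can be sketched as follows: for each administrative zone, count the grid cells whose flood-water fraction exceeds a threshold. The zone labels, fraction values, and 0.5 threshold below are invented for illustration and are not the study's data:

```python
def zonal_flood_stats(zone_grid, flood_fraction_grid, threshold=0.5):
    """Zonal statistics over two aligned grids: per zone, count the
    cells whose flood-water fraction meets the threshold and report
    the flooded share of that zone."""
    counts = {}
    for zone_row, flood_row in zip(zone_grid, flood_fraction_grid):
        for zone, frac in zip(zone_row, flood_row):
            total, flooded = counts.get(zone, (0, 0))
            counts[zone] = (total + 1, flooded + (frac >= threshold))
    return {z: {"cells": t, "flooded": f, "share": f / t}
            for z, (t, f) in counts.items()}

# Tiny hypothetical grids: zone labels and VIIRS-style water fractions.
zones = [["A", "A", "B"],
         ["A", "B", "B"]]
fracs = [[0.9, 0.2, 0.7],
         [0.6, 0.1, 0.8]]
stats = zonal_flood_stats(zones, fracs)
```

Replacing the zone grid with state boundaries and the toy fractions with the VIIRS flood-water fraction raster gives per-state flooded shares, which can then be intersected with population, school, and health-facility layers.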

Keywords: flood hazards, remote sensing, damage assessment, GIS, geospatial analysis

Procedia PDF Downloads 97
1704 Effective Energy Saving of a Large Building through Multiple Approaches

Authors: Choo Hong Ang

Abstract:

The most popular approach to saving energy in large commercial buildings in Malaysia is to replace the existing chiller plant of high kW/ton with one of lower kW/ton. This approach, however, entails a large capital outlay with a long payback period of up to 7 years. This paper shows that by using multiple approaches, other than replacing the existing chiller plant, an energy saving of up to 20% is possible. The main methodology adopted was to identify and then plug all heat ingress paths into the building, including putting up glass structures to prevent mixing of internal air-conditioned air with the ambient environment and replacing air curtains with glass doors; this alone could save up to 10% of the energy bill. Another methodology was to convert the fixed-speed motors of air handling units (AHU) to variable speed drives (VSD) and to change escalators to the motion-sensor type. Other methodologies included reducing heat load by blocking air supply to unoccupied parcels, rescheduling chiller plant operation, changing fluorescent lights to LED lights, and converting from tariff B to C1. A case example of Komtar, the tallest building in Penang, is given here. The total energy bill for Komtar was USD 2,303,341 in 2016 but was reduced to USD 1,842,927.39 in 2018, a significant saving of USD 460,413.86 or 20%. In terms of kWh, consumption fell from 18,302,204 kWh in 2016 to 14,877,105 kWh in 2018, a reduction of 3,425,099 kWh or 18.71%. The methodologies used were relatively low cost, and the payback period was merely 24 months. With this achievement, the Komtar building was awarded champion of the Malaysian National Energy Award 2019 and second runner-up of the ASEAN Energy Award. This experience shows that a strong commitment to energy saving is the key to effective energy saving.
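The quoted kWh figures can be checked with a one-line calculation; this sketch simply reproduces the arithmetic behind the reported 18.71% reduction:

```python
def saving(before, after):
    """Absolute and percentage reduction between two annual totals."""
    diff = before - after
    return diff, 100.0 * diff / before

# kWh consumption figures reported for the Komtar building (2016 vs 2018).
kwh_diff, kwh_pct = saving(18_302_204, 14_877_105)
```

The same function applied to the USD figures gives a reduction very close to 20%, consistent with the savings percentage claimed in the abstract.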

Keywords: chiller plant, energy saving measures, heat ingress, large building

Procedia PDF Downloads 82
1703 Research Progress of the Relationship between Urban Rail Transit and Residents' Travel Behavior during 1999-2019: A Scientific Knowledge Mapping Based on Citespace and Vosviewer

Authors: Zheng Yi

Abstract:

Among the attempts made worldwide to foster urban and transport sustainability, transit-oriented development is certainly one of the most successful, and residents' travel behavior is a central concern in research on its impacts. This study takes 620 English-language journal papers in the Web of Science core collection database as its objects and maps out the scientific knowledge of the field using co-citation analysis, co-word analysis, citation network analysis, and visualization techniques. It teases out the research hotspots and the evolution of work on the relationship between urban rail transit and residents' travel behavior from 1999 to 2019. Based on the time-zone view and burst-detection analyses, the paper discusses the likely direction of the next stage of international study. The results show that over the past 20 years the research has focused on these keywords: land use, behavior, model, built environment, impact, travel behavior, walking, physical activity, smart card, big data, simulation, and perception. By content, the key literature divides into the topics of built-environment attributes, land use, transportation networks, and transportation policies. These results can help readers understand the related research and achievements systematically and provide a reference for identifying the main challenges that future research needs to address.

Keywords: urban rail transit, travel behavior, knowledge map, evolution of researches

Procedia PDF Downloads 94
1702 Structural Characterization of the 3D Printed Silicon Carbon/Carbon Fibers Nanocomposites

Authors: Saja M. Nabat Al-Ajrash, Charles Browning, Rose Eckerle, Li Cao

Abstract:

A process that utilizes a combination of additive manufacturing (AM), a preceramic polymer, and a chopped carbon fiber precursor to fabricate silicon carbide/carbon fiber (SiC/C) composites has been developed. The study has shown a promising, cost-effective, and efficient route to fabricating complex SiC/C composites using additive manufacturing. A key part of this effort was mapping the material's microstructure through the thickness of the composite. Microstructural features in the pyrolyzed composites through the successive AM layers, such as defects, crystal size and distribution, interatomic spacing, and chemical bonds, were investigated using high-resolution scanning and transmission electron microscopy. As a result, the microstructure developed in the SiC/C composites after printing, cure, and pyrolysis has been successfully mapped through the thickness of the derived composites. Dense and nearly defect-free parts were observed after polymer-to-ceramic conversion. The ceramic matrix composite displayed three coexisting phases: silicon carbide, silicon oxycarbide, and turbostratic carbon. Lattice-fringe imaging and X-ray diffraction analysis showed well-defined SiC and turbostratic carbon features. The cross-sectional mapping of the printed-then-pyrolyzed structures confirmed consistent structural and chemical features within the internal layers of the AM parts. Noteworthy, however, is that a crust-like area with high crystallinity was observed in the first and last external layers; these crust-like regions have structural characteristics and elemental distributions distinct from those of the internal layers.

Keywords: SiC, preceramic polymer, additive manufacturing, ceramic

Procedia PDF Downloads 53