Search results for: modern analytical methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18979

8929 Advances and Challenges in Assessing Students’ Learning Competencies in 21st Century Higher Education

Authors: O. Zlatkin-Troitschanskaia, J. Fischer, C. Lautenbach, H. A. Pant

Abstract:

In 21st century higher education (HE), the diversity among students has increased in recent years due to internationalization and greater student mobility. Offering and providing equal and fair opportunities based on students’ individual skills and abilities instead of their social or cultural background is one of the major aims of HE. In this context, valid, objective and transparent assessments of students’ preconditions and academic competencies in HE are required. However, as analyses of the current state of research and practice show, a substantial research gap on assessment practices in HE still exists, calling for the development of effective solutions. These demands lead to significant conceptual and methodological challenges. Funded by the German Federal Ministry of Education and Research, the research program 'Modeling and Measuring Competencies in Higher Education – Validation and Methodological Challenges' (KoKoHs) focuses on addressing these challenges in HE assessment practice by modeling and validating objective test instruments. Comprising 16 cross-university collaborative projects, the Germany-wide research program contributes to bridging the research gap in current assessment research and practice by concentrating on practical and policy-related challenges of assessment in HE. In this paper, we present a differentiated overview of existing assessments in HE at the national and international level. Based on the state of research, we describe the theoretical and conceptual framework of the KoKoHs program as well as the results of the validation studies, including their key outcomes. More precisely, this includes an insight into more than 40 developed assessments covering a broad range of transparent and objective methods for validly measuring domain-specific and generic knowledge and skills in five major study areas (Economics, Social Science, Teacher Education, Medicine and Psychology). Computer-, video- and simulation-based instruments have been applied and validated to assess over 20,000 students at the beginning, middle and end of their (bachelor and master) studies at more than 300 HE institutions throughout Germany or during their practical training phase, traineeship or occupation. Focusing on the validity of the assessments, all test instruments have been analyzed comprehensively, using a broad range of methods and observing the validity criteria of the Standards for Educational and Psychological Testing developed by the American Educational Research Association, the American Psychological Association and the National Council on Measurement in Education. The results of the developed assessments presented in this paper provide a valuable basis for predicting students’ skills and abilities at the beginning and the end of their studies as well as their learning development and performance. This allows for a differentiated view of the diversity among students. Based on these research results, practical implications and recommendations are formulated. In particular, appropriate and effective learning opportunities can be created to support students’ learning development, promote their individual potential and reduce knowledge and skill gaps. Overall, the presented research on competency assessment is highly relevant to national and international HE practice.

Keywords: 21st century skills, academic competencies, innovative assessments, KoKoHs

Procedia PDF Downloads 136
8928 A Grounded Theory on Marist Spirituality/Charism from the Perspective of the Lay Marists in the Philippines

Authors: Nino M. Pizarro

Abstract:

To the author’s knowledge, despite the written documents about Marist spirituality/charism, nothing has been done concerning a clear theoretical framework that highlights Marist spirituality/charism from the perspective or lived experience of the lay Marists of St. Marcellin Champagnat. The participants of the study are lay Marist educators from Marist schools in the Philippines. Since the study seeks to capture the respondents’ own concepts and meanings of Marist spirituality/charism, a qualitative methodology is adopted. In particular, the study will use the grounded theory methods of Barney Glaser. The theory will be generated systematically from data collection, coding and analysis, using memoing, theoretical sampling, sorting, writing and the constant comparative method. The data collection method that will be employed in this grounded theory research is the semi-structured, participant-driven in-depth interview. Participants will be recruited through purposive snowball sampling. The study aims to produce a theoretical framework that will help the lay Marists to deepen their understanding of the Marist spirituality/charism and their vocation as lay partners of the Marist Brothers of the Schools.

Keywords: grounded theory, Lay Marists, lived experience, Marist spirituality/charism

Procedia PDF Downloads 307
8927 Applying Massively Parallel Sequencing to Forensic Soil Bacterial Profiling

Authors: Hui Li, Xueying Zhao, Ke Ma, Yu Cao, Fan Yang, Qingwen Xu, Wenbin Liu

Abstract:

Soil can often link a person or item to a crime scene, which makes it valuable evidence in forensic casework. Several techniques have been utilized for forensic soil discrimination in previous studies. Because soil harbours vast microbial communities, analysis of the soil microbiome is expected to be a potential way to characterise soil evidence. In this study, we applied massively parallel sequencing (MPS) to soil bacterial profiling on the Ion Torrent Personal Genome Machine (PGM). Soils from different regions were collected repeatedly. Variable regions 3 and 4 of the bacterial 16S rRNA gene were sequenced by MPS. Operational taxonomic units (OTUs, 97% similarity) were used to analyse the soil bacteria. Several bioinformatics methods (PCoA, NMDS, Metastats, LEfSe, and heatmaps) were applied to the bacterial profiles. Our results demonstrate that MPS can provide a more detailed picture of the soil microbiome and that the bacterial composition of soils from different regions was distinctive. In conclusion, soil bacterial profiling via MPS of the 16S rRNA gene has potential value for characterising soil evidence and associating it with its place of origin, and it can play an important role in forensic science in the future.
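
As an illustration (not taken from the paper), the sketch below shows the kind of ordination analysis the abstract mentions: Bray-Curtis dissimilarities computed from a hypothetical OTU count table, followed by a classical PCoA. The table values and sample labels are placeholders.

```python
# Illustrative sketch only: PCoA ordination of a hypothetical OTU table,
# mirroring the Bray-Curtis + PCoA analysis described in the abstract.
import numpy as np
from scipy.spatial.distance import pdist, squareform

# rows = soil samples, columns = OTUs (97% clusters), values = read counts
otu_counts = np.array([
    [120,  30,  0,  5],   # sample A, region 1
    [110,  25,  2,  8],   # sample B, region 1
    [ 10, 200, 90,  0],   # sample C, region 2
    [  5, 180, 95,  3],   # sample D, region 2
])

# relative abundances, then Bray-Curtis dissimilarity between samples
rel = otu_counts / otu_counts.sum(axis=1, keepdims=True)
d = squareform(pdist(rel, metric="braycurtis"))

# classical PCoA: double-centre the squared distance matrix and eigendecompose
n = d.shape[0]
j = np.eye(n) - np.ones((n, n)) / n            # centring matrix
b = -0.5 * j @ (d ** 2) @ j
eigvals, eigvecs = np.linalg.eigh(b)
order = np.argsort(eigvals)[::-1]              # largest eigenvalues first
coords = eigvecs[:, order] * np.sqrt(np.clip(eigvals[order], 0, None))

print(coords[:, :2])   # first two PCoA axes; samples from one region should cluster
```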

Keywords: bacterial profiling, forensic, massively parallel sequencing, soil evidence

Procedia PDF Downloads 557
8926 Avian Ecological Status in the Gadaïne Eco-Complex (Batna, NE Algeria)

Authors: Marref Cherine, Bezzala Adel, Houhamdi Moussa

Abstract:

Wetlands represent ecosystems of great importance through their ecological and socio-economic functions and biological diversity, even though they are among the ecosystems most threatened by anthropization. This study aimed to contribute to an inventory of bird species in the Gadaïne eco-complex (Batna, Algeria) from 2019 to 2021. Counts were carried out from 8:00 to 19:00 using a telescope (20 × 60) and a pair of binoculars (10 × 50), employing absolute and relative methods. Birds were categorized by phenology, habitat, biogeography, and diet. A total of 80 species in 58 genera and 19 families were observed. Migratory birds were dominant (38%) phenologically, and birds of Palearctic origin dominated (26.25%) biogeographically. Invertivorous and carnivorous species were the most common (35%). Ecologically, the majority of species were waterbirds (73.75%), which are protected in Algeria. This study highlights the need to preserve ecosystem components and to enhance the biological resources of protected, rare, and key species. We observed 43,797 individuals of Marmaronetta angustirostris during our study and recorded the nesting of Podiceps nigricollis, Porphyrio porphyrio, and Tadorna ferruginea. For these reasons, it is recommended to propose the area as a Ramsar site.

Keywords: biodiversity, avifauna, ecological status, wetlands

Procedia PDF Downloads 58
8925 OpenFOAM Based Simulation of High Reynolds Number Separated Flows Using Bridging Method of Turbulence

Authors: Sagar Saroha, Sawan S. Sinha, Sunil Lakshmipathy

Abstract:

The Reynolds-averaged Navier-Stokes (RANS) model is the popular computational tool for the prediction of turbulent flows. Being computationally less expensive than direct numerical simulation (DNS), RANS has received wide acceptance in industry and in the research community. However, for high Reynolds number flows, the traditional RANS approach based on the Boussinesq hypothesis is unable to capture all the essential flow characteristics, and thus its performance is limited in high Reynolds number flows of practical interest. RANS performance turns out to be inadequate in regimes like flow over curved surfaces, flows with rapid changes in the mean strain rate, duct flows involving secondary streamlines and three-dimensional separated flows. In the recent decade, the partially averaged Navier-Stokes (PANS) methodology has gained acceptance as a seamless bridging method of turbulence, placed between DNS and RANS. The PANS methodology, being a scale-resolving bridging method, is inherently more suitable than RANS for simulating turbulent flows. The superior ability of the PANS method has been demonstrated for some cases like swirling flows, high-speed mixing environments, and high Reynolds number turbulent flows. In our work, we intend to evaluate PANS for separated turbulent flows past bluff bodies, which are of broad interest in aerodynamic research and industrial applications. The PANS equations, being derived from base RANS, inherit the inadequacies of the parent RANS model based on the linear eddy-viscosity model (LEVM) closure. To enhance PANS’ capabilities for simulating separated flows, the shortcomings of the LEVM closure need to be addressed. The inabilities of LEVMs have inspired the development of non-linear eddy-viscosity models (NLEVM). To explore the potential improvement in PANS performance, in our study we evaluate PANS behavior in conjunction with an NLEVM. Our work can be categorized into three significant steps: (i) extraction of the PANS version of the NLEVM from the RANS model, (ii) testing the model in a homogeneous turbulence environment and (iii) application and evaluation of the model in the canonical case of a separated non-homogeneous flow field (flow past prismatic bodies and bodies of revolution at high Reynolds number). The PANS version of the NLEVM will be derived and implemented in OpenFOAM, an open-source solver. The homogeneous-flow evaluation will comprise a study of the influence of the PANS filter-width control parameter on the turbulent stresses, analysis over typical velocity fields and asymptotic analysis of the Reynolds stress tensor. The non-homogeneous flow case will include the study of mean integrated quantities and various instantaneous flow field features including wake structures. The performance of PANS + NLEVM will be compared against LEVM-based PANS and LEVM-based RANS. This assessment will contribute to a significant improvement of the predictive ability of computational fluid dynamics (CFD) tools in massively separated turbulent flows past bluff bodies.
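
For context, the commonly cited form of the PANS filter-control parameters (Girimaji-type formulation) is reproduced below; these relations are standard in the PANS literature and are not quoted from the paper itself.

```latex
% Commonly cited PANS resolution-control parameters and unresolved eddy viscosity
% (Girimaji-type formulation); shown for context, not reproduced from the paper.
f_k = \frac{k_u}{k}, \qquad
f_\varepsilon = \frac{\varepsilon_u}{\varepsilon}, \qquad
\nu_u = C_\mu \,\frac{k_u^2}{\varepsilon_u}.
% f_k = f_\varepsilon = 1 recovers RANS; decreasing f_k resolves progressively
% more of the turbulent kinetic energy, approaching DNS in the limit f_k -> 0.
```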

Keywords: bridging methods of turbulence, high Re-CFD, non-linear PANS, separated turbulent flows

Procedia PDF Downloads 144
8924 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing

Authors: Arjun Kumar Rath, Titus Dhanasingh

Abstract:

Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing in larger quantities. As the components get integrated, the devices are tested for their full functionality using advanced software tools. Benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, these tests are custom built for every product and remain unusable for other variants. A majority of the tests go undocumented, are not updated, and become unusable when the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks or tools (Fuego, LAVA, Autotest, KernelCI, etc.) designed for testing embedded system devices, each having several unique good features, but no single tool and framework can satisfy all of the testing needs for embedded systems; hence the need for an extensible framework with a multitude of tools. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional test methods include developing test libraries and support components for every new hardware platform that belongs to the same domain with identical hardware architecture. This approach has drawbacks such as non-reusability, where platform-specific libraries cannot be reused, the need to maintain source infrastructure for individual hardware platforms, and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges in test environment setup, scalability, and maintenance. A desirable strategy is certainly one that is focused on maximizing reusability, continuous integration, and leveraging artifacts across the complete development cycle during phases of testing and across a family of products. To overcome the stated challenges of the conventional method and offer the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, is designed, which can be deployed in embedded system-related products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing of different hardware, including microprocessors and microcontrollers. It offers benefits such as (1) time-to-market: it accelerates board bring-up time with prepackaged test suites supporting all necessary peripherals, which can speed up the design and development stages (board bring-up, manufacturing and device drivers); (2) reusability: framework components isolated from the platform-specific hardware initialization and configuration make the adaptation of test cases across various platforms quick and simple; (3) an effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework; (4) continuous integration: pre-integrated with Jenkins, which enables continuous testing and an automated software update feature. Applying the embedded test framework accelerator throughout the design and development phase enables the development of well-tested systems before functional verification and improves time to market to a large extent.
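
A hypothetical sketch of the reusability principle described above is shown below: platform-specific bring-up is isolated behind a small interface so the same test cases run unchanged on different boards. The class and test names are invented for illustration; this is not the actual ETF code.

```python
# Hypothetical sketch (not the actual ETF code): platform-specific initialization
# is isolated behind a small interface, so registered test cases stay reusable
# across hardware platforms and produce CI-friendly pass/fail output.
from dataclasses import dataclass
from typing import Callable, Dict, List


class Platform:
    """Board-specific bring-up and access; subclass once per hardware platform."""
    def init_board(self) -> None: ...
    def read_gpio(self, pin: int) -> int: ...


class DevBoardA(Platform):
    def init_board(self) -> None:
        print("init clocks/pinmux for board A")   # placeholder bring-up
    def read_gpio(self, pin: int) -> int:
        return 1                                   # placeholder read


@dataclass
class TestResult:
    name: str
    passed: bool


TESTS: Dict[str, Callable[[Platform], bool]] = {}

def test(name: str):
    """Register a platform-agnostic test case."""
    def wrap(fn):
        TESTS[name] = fn
        return fn
    return wrap


@test("gpio_loopback")
def gpio_loopback(p: Platform) -> bool:
    return p.read_gpio(7) == 1        # same test body for every platform


def run_all(platform: Platform) -> List[TestResult]:
    platform.init_board()
    return [TestResult(n, fn(platform)) for n, fn in TESTS.items()]


if __name__ == "__main__":
    for r in run_all(DevBoardA()):
        print(f"{r.name}: {'PASS' if r.passed else 'FAIL'}")
```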

Keywords: board diagnostics software, embedded system, hardware testing, test frameworks

Procedia PDF Downloads 135
8923 Incorporating Multiple Supervised Learning Algorithms for Effective Intrusion Detection

Authors: Umar Albalawi, Sang C. Suh, Jinoh Kim

Abstract:

As the Internet continues to expand its usage with an enormous number of applications, cyber-threats have increased significantly. Thus, accurate detection of malicious traffic in a timely manner is a critical security concern in today’s Internet. One approach to intrusion detection is to use machine learning (ML) techniques. Several methods based on ML algorithms have been introduced over the past years, but they are largely limited in terms of detection accuracy and/or the time and space complexity required to run them. In this work, we present a novel method for intrusion detection that incorporates a set of supervised learning algorithms. The proposed technique provides high accuracy and outperforms existing techniques that simply utilize a single learning method. In addition, our technique relies on partial flow information (rather than full information) for detection, and thus it is lightweight and desirable for online operation with the property of early identification. Using the publicly available mid-Atlantic CCDC intrusion dataset, we show that our proposed technique yields a detection rate of over 99% with a very low false alarm rate (0.4%).
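
To illustrate the general idea of combining several supervised learners (not the authors’ exact method, feature set, or dataset), the sketch below trains a soft-voting ensemble on synthetic partial-flow features.

```python
# Illustrative sketch only: combining several supervised learners on partial
# flow features with a soft-voting ensemble. The features and labels here are
# synthetic placeholders, not the CCDC dataset or the authors' exact method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# partial flow features: e.g. duration, packets, bytes of the first few packets
X = rng.random((2000, 3))
y = (X[:, 1] + 0.3 * X[:, 2] > 0.8).astype(int)   # synthetic "malicious" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    voting="soft",          # average predicted probabilities across learners
)
ensemble.fit(X_tr, y_tr)
print(classification_report(y_te, ensemble.predict(X_te)))
```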

Keywords: intrusion detection, supervised learning, traffic classification, computer networks

Procedia PDF Downloads 345
8922 Harnessing the Potential of Renewable Energy Sources to Reduce Fossil Energy Consumption in the Wastewater Treatment Process

Authors: Hen Friman

Abstract:

Various categories of aqueous solutions are discharged within residential, institutional, commercial, and industrial structures. To safeguard public health and preserve the environment, it is imperative to subject wastewater to treatment processes that eliminate pathogens (such as bacteria and viruses), nutrients (such as nitrogen and phosphorus), and other compounds. Failure to address untreated sewage accumulation can result in an array of adverse consequences. Israel exemplifies a special case in wastewater management. Appropriate wastewater treatment significantly benefits sectors such as agriculture, tourism, horticulture, and industry. Nevertheless, untreated sewage in settlements lacking proper sewage collection or transportation networks remains an ongoing and substantial threat. Notably, the process of wastewater treatment entails substantial energy consumption. Consequently, this study explores the integration of solar energy as a renewable power source within the wastewater treatment framework. By incorporating renewable energy sources into the process, costs can be minimized, and decentralized facilities can be established even in areas lacking adequate infrastructure for traditional treatment methods.

Keywords: renewable energy, solar energy, innovative, wastewater treatment

Procedia PDF Downloads 104
8921 Unspoken Delights: Creative Strategies for Bypassing the Censorship System and Depicting Male-Female Relationships in Iranian Cinema

Authors: Parsa Naji

Abstract:

Following Iran’s Islamic Revolution in 1979 and the subsequent formation of a theocratic regime, the new regime implemented stringent regulations and a complicated censorship system in the film industry. Thereupon, the screening of films showing relationships between males and females encountered numerous limitations. These limits encompassed not only the physical portrayal of male-female relationships but also dialogue containing explicit sexual or even passionately romantic themes, which could result in a film being permanently consigned to archival storage. However, despite these limitations, Iranian filmmakers persevered in creating their cinematic works. Throughout the years after the revolution, Iranian directors have navigated a series of challenges and obstacles, employing innovative and unconventional methods to bypass the rigorous censorship system imposed by the government and ensure the screening of their films. This study aims to analyze the creative approaches employed by Iranian filmmakers to circumvent governmental censorship regulations.

Keywords: censorship, Iranian cinema, Islamic revolution, male-female relationship

Procedia PDF Downloads 38
8920 Effect of Powder Shape on Physical Properties of Porous Coatings

Authors: M. Moayeri, A. Kaflou

Abstract:

Decreasing the size of heat exchangers in industry is favorable due to the reduction in initial and maintenance costs. This can generally be achieved by increasing the heat transfer coefficient, which can be done by increasing the tube surface area through passive methods known as porous coatings. Since these coatings are often in contact with the fluid, the mechanical strength of the coatings should be considered a main design criterion besides permeability and porosity, especially in high-velocity services. Powder shape affects mechanical properties more than other factors. Therefore, in this study, copper powder with three different shapes (spherical, dendritic and irregular) was coated onto a Cu-Ni base metal at a thickness of ~300 µm in a reducing atmosphere (5% H2-N2) using a programmable furnace. The morphology and physical properties of the coatings, such as porosity, permeability and mechanical strength, were investigated. The results show that although the irregular particles have the maximum porosity and permeability, their strength level is close to that of the spherical powder; in addition, these particles have a low production cost, so this powder is recommended for creating porous coatings in high-velocity services.

Keywords: porous coat, permeability, mechanical strength, porosity

Procedia PDF Downloads 352
8919 Tuning Fractional Order Proportional-Integral-Derivative Controller Using Hybrid Genetic Algorithm Particle Swarm and Differential Evolution Optimization Methods for Automatic Voltage Regulator System

Authors: Fouzi Aboura

Abstract:

The fractional-order proportional-integral-derivative (FOPID) controller, or fractional-order PIλDµ, is a proportional-integral-derivative (PID) controller in which the integral order (λ) and derivative order (µ) are fractional. One important application of the classical PID controller is the automatic voltage regulator (AVR). The FOPID controller requires the optimization of five parameters, while the design of a conventional PID controller requires only three parameters to be optimized. In this paper, we propose a comparison between the Differential Evolution (DE) algorithm and Hybrid Genetic Algorithm Particle Swarm Optimization (HGAPSO); we study their characteristics and performance in finding optimal parameters of the FOPID controller. A new objective function is also proposed to take into account the relationship between the performance criteria.
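
For reference, the standard FOPID controller form (not a result of the paper, simply the textbook definition behind the five tuning parameters) is:

```latex
% Standard FOPID (PI^\lambda D^\mu) controller transfer function, for context:
C(s) = K_p + \frac{K_i}{s^{\lambda}} + K_d\, s^{\mu},
\qquad \lambda, \mu \in \mathbb{R}^{+}.
% Setting \lambda = \mu = 1 recovers the classical PID controller, which is why
% tuning involves the five parameters (K_p, K_i, K_d, \lambda, \mu) instead of three.
```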

Keywords: FOPID controller, fractional order, AVR system, objective function, optimization, GA, PSO, HGAPSO

Procedia PDF Downloads 87
8918 Wave Propagation In Functionally Graded Lattice Structures Under Impact Loads

Authors: Mahmood Heshmati, Farhang Daneshmand

Abstract:

Material scientists and engineers have introduced novel materials with complex geometries owing to recent technological advances and improvements in manufacturing methods. Among them, lattice structures with graded architectures, denoted functionally graded porous materials (FGPMs), have been developed to optimize the structural response. FGPMs are achieved by tailoring the size and density of the internal pores in one or more directions, leading to the desired mechanical properties and structural responses. FGPMs also provide a more flexible transition and the possibility of designing and fabricating structural elements with complex and variable properties. In this paper, wave propagation in lattice structures with functionally graded (FG) porosity is investigated in order to examine their shock-absorbing ability. The behavior of FG porous beams with different porosity distributions under impact load and the effects of porosity distribution and porosity content on the wave speed are studied. Important conclusions are drawn, along with a discussion of the future scope of studies on FGPM structures.

Keywords: functionally graded, porous materials, wave propagation, impact load, finite element

Procedia PDF Downloads 79
8917 Discrete Group Search Optimizer for the Travelling Salesman Problem

Authors: Raed Alnajjar, Mohd Zakree Ahmad Nazri

Abstract:

In this study, we apply the Discrete Group Search Optimizer (DGSO) to solve the Traveling Salesman Problem (TSP). The DGSO is a nature-inspired optimization algorithm that imitates animal behavior, especially animal searching behavior. The proposed DGSO uses a vector representation and some discrete operators, such as destruction, construction, differential evolution, swap and insert. The TSP is a well-known hard combinatorial optimization problem, which seeks the shortest tour through a number of cities. The performance of the proposed DGSO is evaluated and tested on benchmark instances listed in the TSPLIB dataset. The experimental results show that the performance of the proposed DGSO is comparable with other state-of-the-art methods for some instances. The results show that DGSO outperforms the Ant Colony System (ACS) in some instances and outperforms other metaheuristics in most instances. In addition, DGSO obtained a number of optimal solutions and some best-known results. DGSO was able to obtain feasible and good-quality solutions across the whole dataset.
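
As an illustration of the kind of discrete tour operators mentioned above (swap and insert) together with a tour-length evaluation, a minimal sketch is given below; it is not the authors’ DGSO implementation, and the city coordinates are random placeholders.

```python
# Illustrative sketch only: tour-length evaluation plus the swap and insert
# operators named in the abstract, wrapped in a simple local-search loop.
import random
from math import hypot
from typing import List, Tuple

City = Tuple[float, float]

def tour_length(tour: List[int], cities: List[City]) -> float:
    """Total length of a closed tour visiting cities in the given order."""
    return sum(
        hypot(cities[tour[i]][0] - cities[tour[(i + 1) % len(tour)]][0],
              cities[tour[i]][1] - cities[tour[(i + 1) % len(tour)]][1])
        for i in range(len(tour))
    )

def swap_move(tour: List[int]) -> List[int]:
    """Exchange two randomly chosen cities."""
    i, j = random.sample(range(len(tour)), 2)
    new = tour[:]
    new[i], new[j] = new[j], new[i]
    return new

def insert_move(tour: List[int]) -> List[int]:
    """Remove one city and reinsert it at another position."""
    i, j = random.sample(range(len(tour)), 2)
    new = tour[:]
    city = new.pop(i)
    new.insert(j, city)
    return new

if __name__ == "__main__":
    random.seed(1)
    cities = [(random.random(), random.random()) for _ in range(20)]
    tour = list(range(20))
    best = tour_length(tour, cities)
    for _ in range(5000):                       # simple improvement loop
        cand = random.choice([swap_move, insert_move])(tour)
        if (c := tour_length(cand, cities)) < best:
            tour, best = cand, c
    print(f"best tour length: {best:.3f}")
```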

Keywords: discrete group search optimizer (DGSO), travelling salesman problem (TSP), variable neighborhood search (VNS)

Procedia PDF Downloads 320
8916 The Use of Mnemonic and Mathematical Mnemonic Method in Improving Historical Understanding

Authors: Lee Bih Ni, Nurul Asyikin Binti Hassan

Abstract:

This paper discusses the use of mnemonic and mathematical mnemonic methods in enhancing the understanding of history. Mnemonics can help students at all levels, including high school, and in various disciplines, including language, mathematics and history. At the secondary level, students are exposed to various courses that require them to remember many facts, which can be mastered through the application of mnemonic techniques. The researchers use a narrative literature review to illustrate the current state of the art in the focal field of research and to build a scientific base of knowledge. The researchers gather all the key points of the discussion and present them here with reference to the specific field on which the paper is based. The findings suggest that the use of mnemonic techniques can improve an individual’s memory with little added effort. In implementing mnemonic techniques, it is important to integrate mathematics and history in the course, as the two are interconnected: mathematics has shaped our history and vice versa. This study shows that memory skills can actually be improved; the human mind can remember more than expected.

Keywords: cognitive strategy, mnemonic technique, secondary school level study, mathematical mnemonic

Procedia PDF Downloads 130
8915 Removal of Heavy Metal Using Continuous Mode

Authors: M. Abd elfattah, M. Ossman, Nahla A. Taha

Abstract:

The present work explored the use of Egyptian rice straw, an agricultural waste that contributes to the global warming problem through the brown cloud, as a potential feedstock for the preparation of activated carbon by physical and chemical activation. The results of this study showed that it is feasible to prepare activated carbons with relatively high surface areas and pore volumes from Egyptian rice straw by direct chemical and physical activation. The activated carbons produced by the two methods (AC1 and AC2) could be used as potential adsorbents for the removal of Fe(III) from aqueous solutions containing heavy metals and from polluted water. The adsorption of Fe(III) depended on the pH of the solution; the optimal Fe(III) removal efficiency occurs at pH 5. Based on the results, the optimum contact time is 60 minutes and the adsorbent dosage is 3 g/L. The adsorption breakthrough curves obtained at different bed depths indicated an increase in breakthrough time with increasing bed depth. A rise in inlet Fe(III) concentration reduces the throughput volume before the packed bed gets saturated. AC1 showed a higher affinity for Fe(III) compared to raw rice husk.

Keywords: rice straw, activated carbon, Fe(III), fixed bed column, pyrolysis

Procedia PDF Downloads 246
8914 Songkran Tradition: An Invented Tradition of Thai Buddhists and Thai Muslims for Peace and Happiness in Southern Thailand

Authors: Utit Sungkharat

Abstract:

Purpose: To investigate an invented tradition of Thai Buddhists and Thai Muslims for peace. Methods: The data for this qualitative research were collected from related documents and research reports, field data, and in-depth interviews with Buddhist and Muslim religious leaders and people in the community. Results: The results of the research revealed that Thai Buddhists and Thai Muslims in the Tamod community in southern Thailand, who have lived in the same community and share the same community history, jointly invented the Songkran tradition on the grounds that they live in the same community founded by the same person. The reason for inventing this tradition is that Songkran is a tradition for paying respect to ancestors who have passed away, and the people of Tamod share the same ancestors even though they believe in different religions. Therefore, paying respect to the ancestors can be performed together by people of the two religions. The invented tradition has not only united them and empowered them to drive their community toward development but has also brought peace and happiness to the community.

Keywords: invented tradition, Thai Buddhists, Thai Muslims, peace

Procedia PDF Downloads 344
8913 Urban Flood Resilience Comprehensive Assessment of "720" Rainstorm in Zhengzhou Based on Multiple Factors

Authors: Meiyan Gao, Zongmin Wang, Haibo Yang, Qiuhua Liang

Abstract:

Under the background of global climate change and the rapid development of modern urbanization, the frequency of climate disasters such as extreme precipitation in cities around the world is gradually increasing. In this paper, the Hi-PIMS model is used to simulate the "720" flood in Zhengzhou, the urban flood process is divided into stages, and the continuous stages of flood resilience are determined. Flood resilience curves under the influence of multiple factors were determined, and urban flood resilience was evaluated by combining the results of the resilience curves. The flood resilience of each urban unit grid was evaluated based on economy, population, road network, hospital distribution and land use type. Firstly, the rainfall data of meteorological stations near Zhengzhou and remote sensing rainfall data from July 17 to 22, 2021 were collected. The Kriging interpolation method was used to extend the rainfall data over Zhengzhou. According to the rainfall data, the flood process generated by four rainfall events in Zhengzhou was reproduced. Based on the results of the inundation extent and inundation depth in different areas, the flood process was divided into four stages, absorption, resistance, overload and recovery, according to the 1-in-50-year rainfall standard. At the same time, based on the levels of slope, GDP, population, hospital affected area, land use type, road network density and other aspects, resilience curves were applied to evaluate the urban flood resilience of different regional units, and the differences in the flood processes of different precipitation events in the "720" rainstorm in Zhengzhou were analyzed. Faced with a rainstorm exceeding the 1-in-1,000-year level, most areas quickly enter the overload stage. The influence of each factor differs between areas: some areas with slopes or higher terrain have better resilience and restore normal social order faster, that is, their recovery stage needs less time. Some low-lying areas or special terrain, such as tunnels, enter the overload stage faster in the case of heavy rainfall. As a result, high levels of flood protection, water level warning systems and faster emergency response are needed in areas with low resilience and high risk. The building density of built-up areas, the population of densely populated areas and the road network density all have a certain negative impact on urban flood resistance, while the positive impact of slope on flood resilience is also very obvious. While hospitals have positive effects on medical treatment, they also bring negative effects, such as high population density and asset density, when they encounter floods. A separate comparison of the unit grids containing hospitals shows that the resilience within the hospital distribution range is low when floods occur. Therefore, in addition to improving the flood resistance capacity of cities, reasonable planning can also increase the flood response capacity of cities. Changes in these influencing factors can further improve urban flood resilience, for example raising design standards, providing temporary water storage areas when floods occur, training emergency personnel to respond faster and adjusting emergency support equipment.
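
A minimal sketch of the Kriging interpolation step mentioned above is given below. It uses the pykrige package as one possible implementation; the station coordinates and rainfall values are synthetic placeholders, not the Zhengzhou data.

```python
# Illustrative sketch only: ordinary Kriging of station rainfall onto a grid,
# as a stand-in for the interpolation step described in the abstract.
import numpy as np
from pykrige.ok import OrdinaryKriging

# station locations (x, y in km) and observed storm rainfall (mm) - placeholders
x = np.array([0.0, 5.0, 12.0, 20.0, 27.0])
y = np.array([0.0, 14.0, 6.0, 22.0, 3.0])
rain = np.array([110.0, 210.0, 640.0, 180.0, 95.0])

ok = OrdinaryKriging(x, y, rain, variogram_model="spherical")

# interpolate onto a regular grid covering the study area
gridx = np.arange(0.0, 30.0, 1.0)
gridy = np.arange(0.0, 25.0, 1.0)
rain_grid, variance = ok.execute("grid", gridx, gridy)

print(rain_grid.shape)   # gridded rainfall field (mm), one value per cell
```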

Keywords: urban flood resilience, resilience assessment, hydrodynamic model, resilience curve

Procedia PDF Downloads 37
8912 Aflatoxin Contamination of Abattoir Wastes in Ogun State, Nigeria

Authors: A. F. Gbadebo, O. O. Atanda, M. C. Adetunji

Abstract:

The study investigated the level of aflatoxin contamination of abattoir wastes in Ogun State, Nigeria, due to continued complaints about the poor hygiene of abattoir centers in the state as a result of improper disposal of abattoir wastes. Wastes from the three senatorial districts of the state were evaluated for their levels of aflatoxin contamination. The moisture content, total plate count, fungal counts, percentage frequency of fungal occurrence, as well as the level of aflatoxin contamination of the abattoir wastes were determined by standard methods. The moisture content of the wastes ranged between 79.10 and 87.46%, the total plate count from 1.37-3.27×10³ cfu/ml, and the fungal counts from 2.73-3.30×10² cfu/ml. Four fungal species (Aspergillus niger, Aspergillus flavus, Aspergillus ochraceus, and Penicillium citrinum) were isolated from the wastes, with Aspergillus flavus having the highest percentage frequency of occurrence (29.76%). The aflatoxin content of the samples ranged between 3.20 and 4.80 µg/kg. These findings show that abattoir wastes from Ogun State are contaminated with aflatoxins and pose a health risk to humans and animals.

Keywords: abattoir wastes, aflatoxin, microbial load, Ogun state

Procedia PDF Downloads 131
8911 Protection of Cultural Heritage against the Effects of Climate Change Using Autonomous Aerial Systems Combined with Automated Decision Support

Authors: Artur Krukowski, Emmanouela Vogiatzaki

Abstract:

The article presents ongoing work in research projects such as SCAN4RECO and ARCH, both funded by the European Commission under the Horizon 2020 program. The former concerns multimodal and multispectral scanning of cultural heritage assets for their digitization and conservation via spatiotemporal reconstruction and 3D printing, while the latter aims to better preserve areas of cultural heritage from hazards and risks. It co-creates tools that would help pilot cities to save cultural heritage from the effects of climate change. It develops a disaster risk management framework for assessing and improving the resilience of historic areas to climate change and natural hazards. Tools and methodologies are designed for local authorities and practitioners, the urban population, as well as national and international expert communities, aiding authorities in knowledge-aware decision making. In this article, we focus on 3D modelling of object geometry, using primarily photogrammetric methods to achieve very high model accuracy with consumer-grade devices, attractive to professionals and hobbyists alike.

Keywords: 3D modelling, UAS, cultural heritage, preservation

Procedia PDF Downloads 119
8910 Review of Research on Waste Plastic Modified Asphalt

Authors: Song Xinze, Cai Kejian

Abstract:

To further explore the application of waste plastics in asphalt pavement, this paper begins with the classification and characteristics of waste plastics. It then provides a state-of-the-art review of the preparation methods and processes of waste plastic modifiers, waste plastic-modified asphalt, and waste plastic-modified asphalt mixtures. The paper also analyzes the factors influencing the compatibility between waste plastics and asphalt and summarizes the performance evaluation indicators for waste plastic-modified asphalt and its mixtures. It explores the research approaches and findings of domestic and international scholars and presents examples of waste plastic applications in pavement engineering. The author believes that there is a basic consensus that waste plastics can improve the high-temperature performance of asphalt. The use of cracking processes to solve the storage stability problem of waste plastic polymer-modified asphalt is the key to promoting its application. Additionally, the author anticipates that future research will concentrate on optimizing the recycling, processing, screening, and preparation of waste plastics, along with developing composite plastic modifiers to improve their compatibility and long-term performance in asphalt pavements.

Keywords: waste plastics, asphalt pavement, asphalt performance, asphalt modification

Procedia PDF Downloads 32
8909 Experimental Study of Solar Drying of Verbena in Three Types of Solar Dryers

Authors: Ilham Ihoume, Rachid Tadili, Nora Arbaoui

Abstract:

One of the most crucial ways to combat food insecurity is to minimize crop losses, and food drying is one of the most natural, effective, low-cost and energy-efficient food preservation methods. In this regard, we undertake in this study an experimental evaluation and analysis of the thermal performance of different natural convection drying systems: a solar greenhouse dryer, an indirect solar dryer with a single compartment and a solar dryer with two compartments. These systems were implemented at the Solar Energy and Environment Laboratory of Mohammed V University (Morocco). The objective of this work is to study the feasibility of converting a solar greenhouse into a solar dryer for use during the summer and, in addition, to study the thermal performance of this greenhouse dryer by comparing it with the other solar dryers. The experimental study showed that the drying of verbena leaves took 6 hours in indirect dryer 1, 3 hours in indirect dryer 2, and 4 hours in the greenhouse dryer, but the amortization period of the solar greenhouse dryer is shorter than that of the other two solar dryers. The results of this study provide key information on the implementation and performance of these systems for drying a food of great global interest.

Keywords: solar energy, drying, agriculture, biotechnology

Procedia PDF Downloads 75
8908 Effect of Silver Diamine Fluoride on Reducing Fungal Adhesion on Dentin

Authors: Rima Zakzouk, Noriko Hiraishi, Mohamed Mahdi Alshahni, Koichi Makimura, Junji Tagami

Abstract:

Background and Purpose: Silver diamine fluoride (SDF) is used to prevent and arrest dental caries. The aim of this study is to evaluate the effect of SDF on reducing Candida albicans adhesion to dentin. Materials and Methods: Bovine dentin disks (6×6 mm) were cut with an Isomet saw and polished with silicon carbide papers down to 2000 grit in order to obtain flat dentin surfaces. Samples were divided into two groups. The first group (SDF group) was treated with 38% SDF for 3 min, while the other group (control group) did not undergo SDF treatment. All samples were exposed to a C. albicans suspension and washed after 6 hours of incubation at 30 °C before being tested using XTT (2,3-bis-(2-methoxy-4-nitro-5-sulfophenyl)-2H-tetrazolium-5-carboxanilide) and real-time PCR approaches. Statistical analyses of the results were performed at the significance level α = 0.05. Results: SDF inhibited C. albicans adhesion to dentin. A significant difference was found between the SDF and control groups in both the XTT and real-time PCR tests. Conclusion: Using SDF to arrest caries could inhibit Candida growth on dentin.

Keywords: silver diamine fluoride, dentin, real time PCR, XTT

Procedia PDF Downloads 159
8907 Optimization of Ultrasound-Assisted Extraction and Microwave-Assisted Acid Digestion for the Determination of Heavy Metals in Tea Samples

Authors: Abu Harera Nadeem, Kingsley Donkor

Abstract:

Tea is a popular beverage due to its flavour, aroma and antioxidant properties, with the most consumed varieties being green and black tea. Antioxidants in tea can lower the risk of Alzheimer’s disease, heart disease and obesity. However, these teas contain heavy metals such as Hg, Cd, or Pb, which can cause autoimmune diseases such as Graves’ disease. In this study, 11 heavy metals in various commercial green, black, and oolong tea samples were determined using inductively coupled plasma-mass spectrometry (ICP-MS). Two methods of sample preparation, microwave-assisted acid digestion and ultrasound-assisted extraction, were compared for accuracy and precision. The developed method was further validated in terms of detection limit, precision, and accuracy. Results showed that the proposed method is highly sensitive, with detection limits at parts-per-billion levels. Reasonable method accuracy was obtained from spiking experiments. The findings of this study can be used to investigate the link between tea consumption and disease and to provide information for future studies on metal determination in tea.

Keywords: ICP-MS, green tea, black tea, microwave-assisted acid digestion, ultrasound-assisted extraction

Procedia PDF Downloads 118
8906 Cell-free Bioconversion of n-Octane to n-Octanol via a Heterogeneous and Bio-Catalytic Approach

Authors: Shanna Swart, Caryn Fenner, Athanasios Kotsiopoulos, Susan Harrison

Abstract:

Linear alkanes are produced as by-products from the increasing use of gas-to-liquid fuel technologies for synthetic fuel production and offer great potential for value addition. Their current use as low-value fuels and solvents does not maximize this potential. Therefore, attention has been drawn towards the direct activation of these aliphatic alkanes to more useful products such as alcohols, aldehydes, carboxylic acids and derivatives. Cytochrome P450 monooxygenases (P450s) can be used for the activation of these aliphatic alkanes using whole-cell or cell-free systems. Some limitations of whole-cell systems include reduced mass transfer, stability issues and possible side reactions. Since P450 systems are little studied as cell-free systems, they form the focus of this study. Challenges of a cell-free system include co-factor regeneration, substrate availability and enzyme stability. Enzyme immobilization offers a positive outlook on this dilemma, as it may enhance the stability of the enzyme. In the present study, two different P450s (CYP153A6 and CYP102A1) as well as the relevant accessory enzymes required for electron transfer (ferredoxin and ferredoxin reductase) and co-factor regeneration (glucose dehydrogenase) have been expressed in E. coli and purified by metal affinity chromatography. Glucose dehydrogenase (GDH) was used as a model enzyme to assess the potential of various enzyme immobilization strategies, including surface attachment on MagReSyn® microspheres with various functionalities and on electrospun nanofibers, self-assembly based methods forming cross-linked enzymes (CLEs), cross-linked enzyme aggregates (CLEAs) and spherezymes, as well as entrapment in a sol-gel. The nanofibers were synthesized by electrospinning, which required the building of an electrospinning machine. The nanofiber morphology has been analyzed by SEM, and binding will be further verified by FT-IR. Covalent attachment-based methods showed limitations: only ferredoxin reductase and GDH retained activity after immobilization, which was largely attributed to insufficient electron transfer and to inactivation caused by the crosslinkers (60% and 90% relative activity loss for the free enzyme when using 0.5% glutaraldehyde and glutaraldehyde/ethylenediamine (1:1 v/v), respectively). So far, initial experiments with GDH have shown the most potential when the enzyme is immobilized via its His-tag onto the surface of MagReSyn® microspheres functionalized with Ni-NTA. It was found that crude GDH could be simultaneously purified and immobilized with sufficient activity retention. Immobilized pure and crude GDH could be recycled 9 and 10 times, respectively, with approximately 10% activity remaining. The immobilized GDH was also more stable than the free enzyme after storage for 14 days at 4 ˚C. This immobilization strategy will also be applied to the P450s and optimized with regard to enzyme loading and immobilization time, as well as characterized and compared with the free enzymes. It is anticipated that the proposed immobilization set-up will offer enhanced enzyme stability (as well as reusability and easy recovery), minimal mass transfer limitation, continuous co-factor regeneration and minimal enzyme leaching, all of which provide a positive outlook on this robust multi-enzyme system for the efficient activation of linear alkanes, as well as the potential for immobilization of various multiple enzymes, including multimeric enzymes, for different bio-catalytic applications beyond alkane activation.

Keywords: alkane activation, cytochrome P450 monooxygenase, enzyme catalysis, enzyme immobilization

Procedia PDF Downloads 222
8905 Photocatalytic Hydrogen Production from Butanol over Ag/TiO2

Authors: Thabelo Nelushi, Michael Scurrell, Tumelo Seadira

Abstract:

Global warming is one of the most important environmental issues and arises from the occurrence of gases such as carbon dioxide (CO2) and methane (CH4) in the atmosphere. Exposure to these greenhouse gases results in health risks. Hydrogen is regarded as an alternative energy source and a clean energy carrier for the future. There are different methods to produce hydrogen, such as steam reforming and coal gasification; however, the challenge with these processes is that they emit CO and CO2 gases and are costly. Photocatalytic reforming is an alternative process that is attractive due to the combination of solar energy and renewable sources and the use of semiconductor materials as catalysts. TiO2 is regarded as the most promising catalyst. TiO2 nanoparticles prepared by a hydrothermal method and Ag/TiO2 are being investigated for the photocatalytic production of hydrogen from butanol. The samples were characterized by Raman spectroscopy, TEM/SEM, XRD, XPS, EDAX, DRS and BET surface area. The 2 wt% Ag-doped TiO2 nanoparticles showed enhanced hydrogen production compared to non-doped TiO2. The results of the characterization and photoactivity studies show that TiO2 nanoparticles play a very important role in producing high hydrogen yields by utilizing solar irradiation.

Keywords: butanol, hydrogen production, silver particles, TiO2 nanoparticles

Procedia PDF Downloads 204
8904 Optimized Processing of Neural Sensory Information with Unwanted Artifacts

Authors: John Lachapelle

Abstract:

Introduction: Neural stimulation is increasingly targeted toward the treatment of back pain, PTSD and Parkinson’s disease, and toward sensory perception. Sensory recording during stimulation is important in order to examine the neural response to stimulation. Most neural amplifiers (headstages) focus on the noise efficiency factor (NEF). At the same time, neural headstages need to handle artifacts from several sources, including power lines, movement (EMG), and neural stimulation itself. In this work, a layered approach to artifact rejection is used to reduce corruption of the neural ENG signal by 60 dBV, resulting in recovery of sensory signals in rats and primates that would previously not have been possible. Methods: The approach combines analog techniques to reduce and handle unwanted signal amplitudes. The methods include optimized (1) sensory electrode placement, (2) amplifier configuration, and (3) artifact blanking when necessary. Together, the techniques are like concentric moats protecting a castle; only the wanted neural signal can penetrate. The headstage operates in two conditions: for unwanted artifacts < 50 mV, linear operation; for artifacts > 50 mV, fast-settle gain-reduction signal limiting (covered in more detail in a separate paper). Unwanted signals at the headstage input: consider that (a) EMG signals are by nature < 10 mV; (b) 60 Hz power line signals may be > 50 mV with poor electrode cable conditions; with careful routing, much of the signal is common to both the reference and active electrodes and is rejected in the differential amplifier, with < 50 mV remaining; (c) a stimulation signal (unwanted by the neural recorder) is attenuated from the stimulation electrode to the sensory electrode. The voltage seen at the sensory electrode can be modeled as Φ_m = I_o/(4πσr). For a 1 mA stimulation signal with 1 cm spacing between electrodes, the signal is < 20 mV at the headstage. Headstage ASIC design: The front-end ASIC is designed to produce < 1% THD at 50 mV input, 50 times higher than typical headstage ASICs, with no increase in noise floor. This requires a careful balance of amplifier stages in the headstage ASIC, as well as consideration of the electrodes’ effect on noise. The ASIC is designed to allow extremely small signal extraction on low-impedance (< 10 kohm) electrodes, with the headstage ASIC noise floor configurable to < 700 nV/rt-Hz. Smaller high-impedance electrodes (> 100 kohm) are typically located closer to neural sources and transduce higher-amplitude signals (> 10 uV); the ASIC low-power mode conserves power with 2 uV/rt-Hz noise. Findings: The enhanced neural processing ASIC has been compared with a commercial neural recording amplifier IC. Chronically implanted primates at MGH demonstrated saturation of the commercial neural amplifier as a result of large environmental artifacts. The enhanced artifact-suppression headstage ASIC, in the same setup, was able to recover and process the wanted neural signal separately from the suppressed unwanted artifacts. Separately, the enhanced artifact-suppression headstage ASIC was able to separate sensory neural signals from unwanted artifacts in mouse-implanted peripheral intrafascicular electrodes. Conclusion: Optimizing headstage ASICs allows observation of neural signals in the presence of the large artifacts that will be present in real-life implanted applications, and this work is targeted toward human implantation in the DARPA HAPTIX program.
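
A quick worked check of the point-source estimate quoted above is given below; the tissue conductivity is an assumption on our part, not a value stated in the abstract.

```latex
% Worked check of the point-source estimate above, assuming a tissue
% conductivity of roughly \sigma \approx 0.5~\mathrm{S/m} (assumed value):
\Phi_m = \frac{I_o}{4\pi\sigma r}
       = \frac{1~\mathrm{mA}}{4\pi\,(0.5~\mathrm{S/m})(0.01~\mathrm{m})}
       \approx 16~\mathrm{mV} \;<\; 20~\mathrm{mV},
% consistent with the "< 20 mV at the headstage" figure quoted in the abstract.
```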

Keywords: ASIC, biosensors, biomedical signal processing, biomedical sensors

Procedia PDF Downloads 326
8903 Multiple Identity Construction among Multilingual Minorities: A Quantitative Sociolinguistic Case Study

Authors: Stefanie Siebenhütter

Abstract:

This paper aims to reveal the criteria involved in the process of identity formation among multilingual minority language speakers in Northeastern Thailand and in the capital, Bangkok. Using sociolinguistic interviews and questionnaires, it asks which factors are important for speakers and how they define their identity through their social as well as linguistic interactions. One key question is how sociolinguistic factors may strengthen or diminish the process of forming the social identity of multilingual minority speakers. However, the motivations for specific language use are rarely overt to the speakers themselves as well as to others. Therefore, the intentions involved in the process of identity construction are identified by scrutinizing speakers’ behavior and attitudes. Combining methods used in sociolinguistics and social psychology allows uncovering the tools for identity construction that the ethnic Kui use to position themselves within a multilingual setting. By giving an overview of minority speakers’ language use in the context of this specific border-area multilingual situation and asking how speakers construct identity within this spatial context, the results reveal some of the subtle and mostly unconscious criteria involved in the ongoing process of identity construction.

Keywords: social identity, identity construction, minority language, multilingualism, social networks, social boundaries

Procedia PDF Downloads 261
8902 A Proposed Framework for Software Redocumentation Using Distributed Data Processing Techniques and Ontology

Authors: Laila Khaled Almawaldi, Hiew Khai Hang, Sugumaran A. l. Nallusamy

Abstract:

Legacy systems are crucial for organizations, but their intricacy and lack of documentation pose challenges for maintenance and enhancement. Redocumentation of legacy systems is vital for automatically or semi-automatically creating documentation for software lacking sufficient records. It aims to enhance system understandability, maintainability, and knowledge transfer. However, existing redocumentation methods need improvement in data processing performance and document generation efficiency. This stems from the necessity to efficiently handle the extensive and complex code of legacy systems. This paper proposes a method for semi-automatic legacy system redocumentation using semantic parallel processing and ontology. Leveraging parallel processing and ontology addresses the current challenges by distributing the workload and creating documentation with logically interconnected data. The paper outlines the challenges in legacy system redocumentation and suggests a redocumentation method using parallel processing and ontology for improved efficiency and effectiveness.
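
A hypothetical sketch of the two ingredients combined above (distributed source analysis plus logically linked facts) is shown below. It parses Python files as a stand-in for legacy code and stores subject-predicate-object triples as a minimal substitute for a full ontology (which a library such as rdflib could provide); it is not the authors’ implementation.

```python
# Hypothetical sketch: fan source parsing out over worker processes, then merge
# the extracted facts as (subject, predicate, object) triples that could seed
# generated documentation. Not the authors' implementation.
import ast
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path
from typing import List, Tuple

Triple = Tuple[str, str, str]

def extract_facts(path: str) -> List[Triple]:
    """Parse one source file and emit simple structural facts."""
    tree = ast.parse(Path(path).read_text())
    facts: List[Triple] = []
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            facts.append((path, "definesFunction", node.name))
        elif isinstance(node, ast.ClassDef):
            facts.append((path, "definesClass", node.name))
        elif isinstance(node, ast.Import):
            facts += [(path, "imports", a.name) for a in node.names]
    return facts

def build_knowledge_base(files: List[str]) -> List[Triple]:
    """Distribute the parsing workload across processes and merge the results."""
    with ProcessPoolExecutor() as pool:
        results = pool.map(extract_facts, files)
    return [t for triples in results for t in triples]

if __name__ == "__main__":
    kb = build_knowledge_base([str(p) for p in Path(".").glob("*.py")])
    for subject, predicate, obj in kb[:20]:
        print(subject, predicate, obj)
```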

Keywords: legacy systems, redocumentation, big data analysis, parallel processing

Procedia PDF Downloads 39
8901 Machine Learning Methods for Flood Hazard Mapping

Authors: Stefano Zappacosta, Cristiano Bove, Maria Carmela Marinelli, Paola di Lauro, Katarina Spasenovic, Lorenzo Ostano, Giuseppe Aiello, Marco Pietrosanto

Abstract:

This paper proposes a novel neural network approach for flood hazard mapping. The core of the model is a machine learning component fed by frequency ratios, namely statistical correlations between flood event occurrences and a selected number of topographic properties. The proposed hybrid model can be used to classify four increasing levels of hazard. The classification capability was compared with the flood hazard maps of the River Basin Plans (PAI) designed by ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), the Italian Institute for Environmental Protection and Research. The study area of Piemonte, an Italian region, has been considered without loss of generality. The frequency ratios may be used as a standalone block to model the flood hazard map. Nevertheless, combining them with a neural network improves the classification power by several percentage points, and the hybrid may be proposed as a basic tool to model the flood hazard map in a wider scope.
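
To illustrate the frequency-ratio idea in its common form (share of flood cells in a class divided by the share of all cells in that class), a minimal sketch is given below; the data are synthetic and the classifier is only a stand-in, not the authors’ exact model.

```python
# Illustrative sketch only: frequency ratios per class of a topographic property,
# then fed as a feature to a small neural network. Synthetic placeholder data.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_cells = 10_000
slope_class = rng.integers(0, 4, n_cells)           # e.g. binned slope, 4 classes
flooded = rng.random(n_cells) < np.where(slope_class == 0, 0.25, 0.05)

def frequency_ratio(prop_class: np.ndarray, flooded: np.ndarray) -> np.ndarray:
    """FR per class; FR > 1 means the class is over-represented among floods."""
    classes = np.unique(prop_class)
    fr = np.empty(classes.size)
    for k, c in enumerate(classes):
        in_class = prop_class == c
        flood_share = flooded[in_class].sum() / max(flooded.sum(), 1)
        cell_share = in_class.sum() / prop_class.size
        fr[k] = flood_share / cell_share
    return fr

fr_slope = frequency_ratio(slope_class, flooded)
print(dict(enumerate(np.round(fr_slope, 2))))    # low-slope class should exceed 1

# The per-cell FR values can then feed a small neural network that outputs a
# hazard level (here a binary stand-in label, purely for demonstration):
X = fr_slope[slope_class].reshape(-1, 1)
y = flooded.astype(int)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(X, y)
```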

Keywords: flood modeling, hazard map, neural networks, hydrogeological risk, flood risk assessment

Procedia PDF Downloads 170
8900 Catastrophic Health Expenditures: Evaluating the Effectiveness of Nepal's National Health Insurance Program Using Propensity Score Matching and Doubly Robust Methodology

Authors: Simrin Kafle, Ulrika Enemark

Abstract:

Catastrophic health expenditure (CHE) is a critical issue in low- and middle-income countries like Nepal, exacerbating financial hardship among vulnerable households. This study assesses the effectiveness of Nepal’s National Health Insurance Program (NHIP), launched in 2015, in reducing out-of-pocket (OOP) healthcare costs and mitigating CHE. Conducted in Pokhara Metropolitan City, the study used an analytical cross-sectional design, sampling 1,276 households through a two-stage random sampling method. Data were collected via face-to-face interviews between May and October 2023. The analysis was conducted using SPSS version 29, incorporating propensity score matching (PSM) to minimize biases and create comparable groups of households enrolled and not enrolled in the NHIP. PSM helped reduce confounding effects by matching households with similar baseline characteristics. Additionally, a doubly robust methodology was employed, combining propensity score adjustment with regression modeling to enhance the reliability of the results. This comprehensive approach ensured a more accurate estimation of the impact of NHIP enrollment on CHE. Among the 1,276 sampled households, 534 (41.8%) were enrolled in the NHIP. Of these, 84.3% had renewed their insurance card, though some cited long waiting times, lack of medications, and complex procedures as barriers to renewal. Approximately 57.3% of households reported known diseases before enrollment, and 49.8% had attended routine health check-ups in the past year. The primary motivation for enrollment was encouragement from insurance employees (50.2%). The data indicate that 12.5% of enrolled households experienced CHE versus 7.5% among non-enrolled households. Enrollment in the NHIP does not contribute to lower CHE (AOR: 1.98, 95% CI: 1.21-3.24). Key factors associated with increased CHE risk were the presence of non-communicable diseases (NCDs) (AOR: 3.94, 95% CI: 2.10-7.39), acute illnesses/injuries (AOR: 6.70, 95% CI: 3.97-11.30), larger household size (AOR: 3.09, 95% CI: 1.81-5.28), and households below the poverty line (AOR: 5.82, 95% CI: 3.05-11.09). Other factors such as gender, education level, caste/ethnicity, presence of elderly members, and under-five children also showed varying associations with CHE, though not all were statistically significant. The study concludes that enrollment in the NHIP does not significantly reduce the risk of CHE. One reason could be inadequate coverage, where high-cost medicines, treatments, and transportation costs are not fully included in the insurance package, leading to significant out-of-pocket expenses. Long waiting times, lack of medicines, and complex procedures for using NHIP benefits might also result in the underuse of covered services. Finally, gaps in enrollment and retention might leave certain households vulnerable to CHE despite the existence of the NHIP. Key factors contributing to increased CHE include NCDs, acute illnesses, larger household sizes, and poverty. To improve the program’s effectiveness, it is recommended that NHIP benefits and coverage be expanded to better protect against high healthcare costs. Additionally, simplifying the renewal process, addressing long waiting times, and enhancing the availability of services could improve member satisfaction and retention. Targeted financial protection measures should be implemented for high-risk groups, and efforts should be made to increase awareness and encourage routine health check-ups to prevent severe health issues that contribute to CHE.
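
The sketch below illustrates the two named techniques in their standard textbook form: (1) logistic-regression propensity scores with 1:1 nearest-neighbour matching, and (2) a doubly robust (AIPW) estimate combining the propensity model with outcome regressions. The data are synthetic placeholders; this is not the authors’ SPSS analysis.

```python
# Illustrative sketch only: propensity score matching and a doubly robust (AIPW)
# estimate of the enrolment effect on CHE, on synthetic placeholder data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 1276
X = rng.normal(size=(n, 3))                                   # household covariates
enrolled = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))        # NHIP enrolment (0/1)
che = rng.binomial(1, 0.08 + 0.05 * (X[:, 1] > 1) + 0.02 * enrolled)  # CHE (0/1)

# (1) propensity scores and 1:1 nearest-neighbour matching on the score
ps = LogisticRegression(max_iter=1000).fit(X, enrolled).predict_proba(X)[:, 1]
treated, control = np.where(enrolled == 1)[0], np.where(enrolled == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[control].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_controls = control[idx.ravel()]
att_matched = che[treated].mean() - che[matched_controls].mean()

# (2) doubly robust (AIPW) average treatment effect
m1 = LogisticRegression(max_iter=1000).fit(X[enrolled == 1], che[enrolled == 1])
m0 = LogisticRegression(max_iter=1000).fit(X[enrolled == 0], che[enrolled == 0])
mu1, mu0 = m1.predict_proba(X)[:, 1], m0.predict_proba(X)[:, 1]
aipw = np.mean(
    enrolled * (che - mu1) / ps
    - (1 - enrolled) * (che - mu0) / (1 - ps)
    + (mu1 - mu0)
)
print(f"matched difference in CHE: {att_matched:.3f}, AIPW estimate: {aipw:.3f}")
```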

Keywords: catastrophic health expenditure, effectiveness, national health insurance program, Nepal

Procedia PDF Downloads 16