Search results for: Rule Based Architecture
26322 An Implementation of Fuzzy Logic Technique for Prediction of the Power Transformer Faults
Authors: Omar M. Elmabrouk, Roaa Y. Taha, Najat M. Ebrahim, Sabbreen A. Mohammed
Abstract:
Power transformers are among the most crucial parts of the electrical power system, including the distribution and transmission grids. They are maintained using a predictive, condition-based maintenance approach, in which the condition of a transformer is diagnosed by Dissolved Gas Analysis (DGA). Five main methods are used for analyzing the dissolved gases: the International Electrotechnical Commission (IEC) gas ratio, Key Gas, Roger gas ratio, Doernenburg, and Duval Triangle methods. Given the importance of transformers, an accurate technique is needed to diagnose, and hence predict, transformer condition; the main objective of such a technique is to avoid transformer faults and thereby protect the power system, distribution and transmission grids. In this paper, DGA was applied to data collected from the transformer records available at the General Electricity Company of Libya (GECOL), located in Benghazi, Libya. The Fuzzy Logic (FL) technique was implemented as a diagnostic approach based on the IEC gas ratio method. The FL technique gave better results and proved usable as an accurate prediction technique for power transformer faults. The technique should also interest readers and researchers working at the intersection of FL mathematics and power transformers.
Keywords: dissolved gas-in-oil analysis, fuzzy logic, power transformer, prediction
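The IEC gas ratio inputs, and the fuzzy memberships applied to them, can be sketched in a few lines. This is an illustrative sketch only: the ratio definitions follow the standard IEC ratio method, but the triangular membership parameters below are placeholders, not the fuzzy sets used in the paper.

```python
def tri_membership(x, a, b, c):
    """Triangular fuzzy membership with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def iec_ratios(h2, ch4, c2h2, c2h4, c2h6):
    """The three gas ratios of the IEC ratio method (gas amounts in ppm)."""
    return {
        "C2H2/C2H4": c2h2 / c2h4,
        "CH4/H2": ch4 / h2,
        "C2H4/C2H6": c2h4 / c2h6,
    }

# Example: fuzzify one ratio against a placeholder "low" set peaking at 0.5.
r = iec_ratios(h2=100, ch4=50, c2h2=10, c2h4=20, c2h6=40)
low = tri_membership(r["CH4/H2"], 0.0, 0.5, 1.0)  # degree of membership
```

A full diagnosis would fuzzify all three ratios, combine them through the rule base, and defuzzify to a fault class.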
Procedia PDF Downloads 147
26321 Sustainable Design in the Use of Deployable Structures
Authors: Umweni Osahon Joshua, Anton Ianakiev
Abstract:
Deployable structures have been used in various scenarios, from moving roofs in stadia to space antennae and booms. There is a substantial literature on deployable structures, but with a main focus on space applications; the complexity of designing deployable structures may be the reason so few have been constructed for earth-based applications. This paper explores the possibilities of integrating sustainable design concepts into deployable structures. Key aspects of sustainable structural design as applicable to deployable structures have not yet been explored: sustainable design of structures has mainly been concerned with static structures in the built environment, and very little literature, conceptual work or framework has been drafted relating to deployable structures or their integration with static structures as a model for sustainable design. This article seeks to address this gap in sustainable design for structural engineering and to provide a framework for designing such structures in a sustainable manner. The framework applies to deployable structures for earth-based environments, both as a form of disaster relief and as part of static structures in the built environment.
Keywords: deployable structures, sustainable design, framework, earth-based environments
Procedia PDF Downloads 542
26320 Scientometrics Analysis of Food Supply Chain Risk Assessment Literature: Based on Web of Science Records 1996-2014
Authors: Mohsen Shirani, Shadi Asadzandi, Micaela Demichela
Abstract:
This paper presents the results of a study assessing crucial aspects and the strength of the scientific basis of a typically interdisciplinary, applied field: food supply chain risk assessment research. Our approach is based on an advanced scientometrics analysis with novel elements to assess the influence and dissemination of research results and to measure interdisciplinarity. The paper describes the quantity and quality of publication trends in food supply chain risk assessment. The population under study comprised 266 articles from the Web of Science database. The results were analyzed by date of publication, type of document, language, source of publication, subject area, authors and their affiliations, and the countries involved in developing the articles.
Keywords: food supply chain, risk assessment, scientometrics, web of science
Procedia PDF Downloads 496
26319 Graphene Based Materials as Novel Membranes for Water Desalination and Boron Separation
Authors: Francesca Risplendi, Li-Chiang Lin, Jeffrey C. Grossman, Giancarlo Cicero
Abstract:
Desalination is one of the most widely employed approaches to supplying water in the context of a rapidly growing global water shortage. However, reverse osmosis (RO), the most popular water filtration method available, still suffers from important drawbacks, such as large energy demands and high process costs, and some serious limitations have recently been identified; among them, the boron problem appears particularly critical. Boron has been found to have a dual effect on living systems on Earth, and the difference between boron deficiency and boron toxicity levels is quite small. The aim of this project is to develop a new generation of RO membranes based on porous graphene or reduced graphene oxide (rGO), able to remove salts from seawater and to reduce boron concentrations in the permeate to levels that meet drinking or process water requirements, by means of a theoretical approach based on density functional theory and classical molecular dynamics. Computer simulations were employed to investigate the relationship between the atomic structure of a nanoporous graphene or rGO monolayer and its membrane properties in RO applications (i.e., water permeability and resilience at RO pressures). In addition, emphasis was given to membranes based on multilayer nanoporous rGO and rGO flakes. By means of non-equilibrium MD simulations, we investigated the mechanism of water transport through such multilayer membranes, focusing on the effects of slit width and sheet geometry. These simulations allowed us to establish the promise of these graphene-based materials as membranes for desalination plants and for boron filtration.
Keywords: boron filtration, desalination, graphene membrane, reduced graphene oxide membrane
Procedia PDF Downloads 301
26318 Optical Multicast over OBS Networks: An Approach Based on Code-Words and Tunable Decoders
Authors: Maha Sliti, Walid Abdallah, Noureddine Boudriga
Abstract:
In this work, we present an optical multicasting approach based on optical code-words. Our approach associates, in the edge node, an optical code-word with a group multicast address. In the core node, a set of tunable decoders is used to send traffic to multiple destinations based on the received code-word. The use of code-words, which correspond to the combination of an input port and a set of output ports, allows the implementation of an optical switching matrix. On the arrival of a burst, it is delayed in an optical memory while the received optical code-word is split across the set of tunable optical decoders; when it matches a configured code-word, the delayed burst is switched to the corresponding set of output ports.
Keywords: optical multicast, optical burst switching networks, optical code-words, tunable decoder, virtual optical memory
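The core-node matching step can be pictured as a lookup from configured code-words to output-port sets. A deliberately tiny sketch with invented names; in the actual system the matching is done optically by the bank of tunable decoders, not in software:

```python
class CoreNodeSwitch:
    """Toy model of the core node: each configured code-word selects the
    set of output ports a delayed burst is switched to."""
    def __init__(self):
        self._table = {}

    def configure(self, code_word, output_ports):
        self._table[code_word] = frozenset(output_ports)

    def match(self, code_word):
        """Return the output ports for a received code-word, or an empty
        set when no configured decoder matches."""
        return self._table.get(code_word, frozenset())

switch = CoreNodeSwitch()
switch.configure("CW-multicast-1", [2, 5, 7])
ports = switch.match("CW-multicast-1")   # burst goes to ports 2, 5 and 7
```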
Procedia PDF Downloads 610
26317 Particle Swarm Optimization Algorithm vs. Genetic Algorithm for Image Watermarking Based Discrete Wavelet Transform
Authors: Omaima N. Ahmad AL-Allaf
Abstract:
Over communication networks, images can easily be copied and distributed illegally, so copyright protection for authors and owners is necessary. Digital watermarking techniques therefore play an important role as a valid solution to authorship problems: digital image watermarking hides watermarks inside images to achieve copyright protection and prevent illegal copying, and watermarks need to be robust to attacks while maintaining data quality. In this paper we discuss two approaches to image watermarking, the first based on Particle Swarm Optimization (PSO) and the second on a Genetic Algorithm (GA). The Discrete Wavelet Transform (DWT) is used with each approach in the embedding process for the cover image transformation. Both PSO and GA use the correlation coefficient to detect high-energy coefficients in the original image as watermark bit locations and then hide the watermark there. Many experiments were conducted for the two approaches with different values of the PSO and GA parameters. In the experiments, the PSO approach achieved better results, with a PSNR of 53 and an MSE of 0.0039, whereas the GA approach achieved a PSNR of 50.5 and an MSE of 0.0048 when using a population size of 100, 150 iterations, and 3×3 blocks. The results indicate that small block sizes can affect the quality of PSO/GA-based image watermarking, because a small block size increases the search area of the watermarking image. Better PSO results were obtained when using a swarm size of 100.
Keywords: image watermarking, genetic algorithm, particle swarm optimization, discrete wavelet transform
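The two quality metrics quoted above are standard and easy to reproduce. A self-contained sketch in pure Python, with pixel values on the usual 0-255 scale (the tiny example images are ours, not the paper's test set):

```python
import math

def mse(img_a, img_b):
    """Mean squared error between two equal-sized images (nested lists)."""
    total, n = 0.0, 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += (pa - pb) ** 2
            n += 1
    return total / n

def psnr(img_a, img_b, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means less distortion."""
    e = mse(img_a, img_b)
    return float("inf") if e == 0 else 10.0 * math.log10(peak * peak / e)

original = [[10, 20], [30, 40]]
watermarked = [[11, 21], [31, 41]]     # every pixel off by one
quality = psnr(original, watermarked)  # about 48.13 dB
```

Applied to the cover image and its watermarked version, these definitions reproduce the kind of PSNR/MSE comparison reported above.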
Procedia PDF Downloads 228
26316 Nine Foundational Interventions for Students with Autism Spectrum Disorders
Authors: Jennie Long, Marjorie Bock
Abstract:
Although the professional literature on Autism Spectrum Disorder (ASD) has focused on successful interventions and strategies, there is a lack of documentation regarding which of these methods and supports are most foundational and essential for classroom use. Specifically, the literature does not define the core foundational interventions and strategies that educators should use with students with an ASD diagnosis. The increasing prevalence of autism spectrum disorders, the challenge students with ASD pose in classrooms, and the requirement to implement evidence-based practice together create an enormous challenge for the field of education. Foundational interventions should be in place from the first day a student enters the classroom. The nine interventions presented here are foundational in nature, and because of the dramatic increase in prevalence, classroom programs now need to provide a foundation of basic educational skills as well as the specialty skills specific to ASD, drawing on the most current research. This article presents nine evidence-based intervention categories for implementation with students on the spectrum.
Keywords: autism spectrum disorder, classroom, evidence-based, foundational
Procedia PDF Downloads 265
26315 An Improved OCR Algorithm on Appearance Recognition of Electronic Components Based on Self-Adaptation of Multifont Template
Authors: Zhu-Qing Jia, Tao Lin, Tong Zhou
Abstract:
Optical Character Recognition has been extensively utilized, but it is rarely employed specifically for the recognition of electronic components. This paper proposes a highly effective algorithm for appearance identification of integrated circuit components based on existing character recognition methods, and analyzes its pros and cons.
Keywords: optical character recognition, fuzzy page identification, mutual correlation matrix, confidence self-adaptation
Procedia PDF Downloads 541
26314 Nonuniformity Correction Technique in Infrared Video Using Feedback Recursive Least Square Algorithm
Authors: Flavio O. Torres, Maria J. Castilla, Rodrigo A. Augsburger, Pedro I. Cachana, Katherine S. Reyes
Abstract:
In this paper, we present a scene-based nonuniformity correction method using a modified recursive least square algorithm with a feedback system on the updates. The feedback is designed to remove the impulsive noise contamination introduced into images by a recursive least square algorithm, by measuring the output of the proposed algorithm. The key advantage of the method is its capacity to estimate detector parameters and then compensate for impulsive noise contamination on a frame-by-frame basis. We define the algorithm and present several experimental results to demonstrate the efficacy of the proposed method in comparison with several previously published recursive least square-based methods, showing that the proposed method removes impulsive noise contamination from the image.
Keywords: infrared focal plane arrays, infrared imaging, least mean square, nonuniformity correction
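For a single detector modeled as target ≈ gain·reading + offset, the recursive least square update with a simple innovation gate can be sketched as follows. This is an illustrative pure-Python sketch; the gate threshold stands in for the paper's output-feedback mechanism and is not the authors' exact design.

```python
def rls_gain_offset(readings, targets, lam=0.99, gate=None):
    """Exponentially weighted RLS estimate of (gain, offset) for the
    per-detector model target = gain*reading + offset. If `gate` is set,
    updates whose innovation exceeds it are skipped, a crude stand-in for
    feedback that rejects impulsive samples."""
    theta = [0.0, 0.0]                      # [gain, offset]
    P = [[1e3, 0.0], [0.0, 1e3]]            # inverse correlation matrix
    for r, t in zip(readings, targets):
        x = (r, 1.0)
        err = t - (x[0] * theta[0] + x[1] * theta[1])   # innovation
        if gate is not None and abs(err) > gate:
            continue                        # feedback: drop impulsive update
        Px = [P[0][0] * x[0] + P[0][1] * x[1],
              P[1][0] * x[0] + P[1][1] * x[1]]
        denom = lam + x[0] * Px[0] + x[1] * Px[1]
        k = [Px[0] / denom, Px[1] / denom]  # gain vector
        theta = [theta[0] + k[0] * err, theta[1] + k[1] * err]
        P = [[(P[i][j] - k[i] * Px[j]) / lam for j in range(2)]
             for i in range(2)]             # P symmetric, so x^T P == Px^T
    return theta

# Noiseless synthetic detector: true gain 1.2, true offset 5.0.
readings = [i % 50 for i in range(300)]
targets = [1.2 * r + 5.0 for r in readings]
gain, offset = rls_gain_offset(readings, targets)
```

Fed with noiseless synthetic readings, the estimate converges to the true gain and offset; updates whose innovation exceeds `gate` are simply skipped, which is how impulsive samples would be kept out of the estimate.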
Procedia PDF Downloads 146
26313 Predicting Destination Station Based on Public Transit Passenger Profiling
Authors: Xuyang Song, Jun Yin
Abstract:
The smart card has become an extremely universal tool in public transit. It collects a large amount of data on buses, urban rail transit, and ferries, and makes passenger profiling possible. This paper combines offline passenger profiling with real-time prediction to propose a method that can accurately predict the destination station as soon as a passenger tags on. First, after identifying passenger travel patterns with Density-Based Spatial Clustering of Applications with Noise (DBSCAN), we construct a static database of user travel characteristics. Two kinds of travel habits are identified: origin-destination (OD) habits and destination-station habits. We then propose a rapid real-time prediction algorithm based on transit passenger profiling that predicts the destinations of on-board passengers. By combining offline learning with online prediction, the approach provides a technical foundation for real-time passenger flow prediction, monitoring and simulation, and for short-term prediction of passenger behavior and demand, enabling efficient, real-time acquisition of passengers' travel destinations. Finally, an actual case was simulated, demonstrating the feasibility and efficiency of the method.
Keywords: travel behavior, destination prediction, public transit, passenger profiling
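The pattern-identification step can be illustrated with a miniature DBSCAN over historical tag-on records. The sketch below implements DBSCAN from scratch (no external libraries) and clusters trips represented as hypothetical (hour-of-day, station-id) points; the real feature set and parameters in the paper may differ.

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one label per point, -1 meaning noise."""
    def neighbors(i):
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps ** 2]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1                    # tentatively noise
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster           # border point, don't expand
                continue
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs = neighbors(j)
            if len(nbrs) >= min_pts:          # core point: keep expanding
                queue.extend(nbrs)
    return labels

# Hypothetical tag-on records: a morning commute habit, an evening habit,
# and one irregular trip that should come out as noise.
trips = [(8.0, 1), (8.1, 1), (7.9, 1),       # morning, station 1
         (18.0, 5), (18.2, 5), (17.9, 5),    # evening, station 5
         (13.0, 9)]                          # one-off trip
labels = dbscan(trips, eps=0.5, min_pts=3)
```

Recurring (time, station) clusters like these become the passenger's habitual OD and destination-station entries in the static profile database.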
Procedia PDF Downloads 21
26312 Analysis of the Fair Distribution of Urban Facilities in Kabul City by Population Modeling
Authors: Ansari Mohammad Reza, Hiroko Ono
Abstract:
In this study, we investigated how fairly urban facilities are distributed in the city of Kabul with respect to population. To answer this question, we simulated a fair model for the distribution of the investigated facilities in the city, based on two factors: the number of users of each facility and the average distance of reach of each facility. The model was then evaluated to confirm its efficiency. Finally, the existing pattern and the simulated model were compared to find the degree of bias in the existing distribution of facilities in the city. The results clearly show that facilities are not fairly distributed in Kabul city with respect to population. Our analysis also revealed that education services are the most fairly distributed facilities, and parks the least.
Keywords: Afghanistan, ArcGIS software, Kabul City, fair distribution, urban facilities
Procedia PDF Downloads 182
26311 Meteosat Second Generation Image Compression Based on the Radon Transform and Linear Predictive Coding: Comparison and Performance
Authors: Cherifi Mehdi, Lahdir Mourad, Ameur Soltane
Abstract:
Image compression is used to reduce the number of bits required to represent an image. The Meteosat Second Generation (MSG) satellite acquires 12 image files every 15 minutes, which results in large database sizes. The transform selected for image compression should help reduce the data representing the images. The Radon transform retrieves the Radon points, which represent the sum of the pixels along a given angle for each direction. Linear predictive coding (LPC) with filtering provides good decorrelation of the Radon points, using a predictor built from the Symmetric Nearest Neighbor (SNN) filter coefficients, which introduces losses during decompression. Finally, Run Length Coding (RLC) gives a high, stable compression ratio regardless of the input image. In this paper, a novel image compression method for MSG images based on the Radon transform and LPC is proposed; it provides a good compromise between compression and reconstruction quality. Our method is compared with three others, two based on the DCT and one on DWT bi-orthogonal filtering, to show the power of the Radon transform in its resistance to quantization noise and to evaluate the performance of our method. Evaluation criteria such as PSNR and the compression ratio demonstrate the efficiency of our compression method.
Keywords: image compression, radon transform, linear predictive coding (LPC), run length coding (RLC), meteosat second generation (MSG)
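The final RLC stage is the simplest to picture. A minimal sketch of run-length encoding and decoding over a row of values (how the Radon/LPC stages produce those rows is beyond this snippet):

```python
def rlc_encode(seq):
    """Run-length encode a sequence into (value, run-length) pairs."""
    pairs = []
    for v in seq:
        if pairs and pairs[-1][0] == v:
            pairs[-1][1] += 1
        else:
            pairs.append([v, 1])
    return [tuple(p) for p in pairs]

def rlc_decode(pairs):
    """Inverse of rlc_encode."""
    return [v for v, n in pairs for _ in range(n)]

row = [0, 0, 0, 0, 255, 255, 0, 0]
encoded = rlc_encode(row)            # [(0, 4), (255, 2), (0, 2)]
```

Long constant runs, which decorrelated residuals tend to produce, are exactly where RLC earns its high compression ratio.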
Procedia PDF Downloads 423
26310 Determining Earthquake Performances of Existing Reinforced Concrete Buildings by Using ANN
Authors: Musa H. Arslan, Murat Ceylan, Tayfun Koyuncu
Abstract:
In this study, an artificial intelligence-based (ANN-based) analytical method has been developed for analyzing the earthquake performance of reinforced concrete (RC) buildings. 66 RC buildings of four to ten storeys were subjected to performance analysis according to parameters describing the existing material, loading, and geometrical characteristics of the buildings, parameters thought to affect the performance of RC buildings. In the performance analysis stage of the study, the level of performance these buildings could be expected to show in an earthquake was determined on the basis of the 4-grade performance levels specified in the Turkish Earthquake Code 2007 (TEC-2007). After obtaining the 4-grade performance level, the 23 selected parameters of each building were matched with its performance level. In this stage, the ANN-based fast evaluation algorithm provided an economical and rapid evaluation of four- to ten-storey RC buildings. In the study, the prediction accuracy of the ANN was found to be about 74%.
Keywords: artificial intelligence, earthquake, performance, reinforced concrete
Procedia PDF Downloads 467
26309 Fog Computing: Network Based Computing
Authors: Navaneeth Krishnan, Chandan N. Bhagwat, Aparajit P. Utpat
Abstract:
Cloud computing provides a means to upload data and use applications over the internet. As the number of devices connecting to the cloud grows, there is undue pressure on the cloud infrastructure. Fog computing, also called network-based computing or edge computing, moves part of the processing from the cloud onto the network devices along the path from the node to the cloud, so nodes connected to the cloud get a better response time. This paper proposes a method of moving computation from the cloud to the network by introducing an Android-like app store on the networking devices.
Keywords: cloud computing, fog computing, network devices, appstore
Procedia PDF Downloads 390
26308 Development of a Direct Immunoassay for Human Ferritin Using Diffraction-Based Sensing Method
Authors: Joel Ballesteros, Harriet Jane Caleja, Florian Del Mundo, Cherrie Pascual
Abstract:
Diffraction-based sensing was utilized to quantify human ferritin in blood serum, providing an alternative to the label-based immunoassays currently used in clinical diagnostics and research. The diffraction intensity was measured with the diffractive optics technology (dotLab™) system. Two methods were evaluated in this study: a direct immunoassay and a direct sandwich immunoassay. In the direct immunoassay, human ferritin was captured by human ferritin antibodies immobilized on an avidin-coated sensor, while the direct sandwich immunoassay had an additional step in which a detector human ferritin antibody binds the analyte complex. Both methods were repeatable, with coefficients of variation below 15%. The direct sandwich immunoassay had a linear response from 10 to 500 ng/mL, wider than the 100-500 ng/mL of the direct immunoassay, and a higher calibration sensitivity, 0.002 Diffractive Intensity (ng mL⁻¹)⁻¹ compared to 0.004 Diffractive Intensity (ng mL⁻¹)⁻¹ for the direct immunoassay. The limits of detection and quantification of the direct immunoassay were found to be 29 ng/mL and 98 ng/mL, respectively, while the direct sandwich immunoassay had a limit of detection (LOD) of 2.5 ng/mL and a limit of quantification (LOQ) of 8.2 ng/mL. In terms of accuracy, the direct immunoassay had a percent recovery of 88.8-93.0% in PBS, against 94.1-97.2% for the direct sandwich immunoassay. Based on these results, the direct sandwich immunoassay is the better diffraction-based immunoassay in terms of accuracy, LOD, LOQ, linear range, and sensitivity. The direct sandwich immunoassay was used to determine human ferritin in blood serum, and the results were validated by Chemiluminescent Magnetic Immunoassay (CMIA). The calculated Pearson correlation coefficient was 0.995 and the p-values of the paired-sample t-test were less than 0.5, showing that the results of the direct sandwich immunoassay are comparable to those of CMIA, so it could be utilized as an alternative analytical method.
Keywords: biosensor, diffraction, ferritin, immunoassay
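For readers reproducing figures like these, LOD and LOQ are conventionally derived from the blank noise and the calibration sensitivity. A generic sketch with made-up numbers (not the assay data above), using the common 3.3σ/S and 10σ/S rules:

```python
def lod_loq(blank_sd, slope):
    """Limit of detection and limit of quantification from the standard
    deviation of blank responses and the calibration slope (3.3s/S, 10s/S)."""
    return 3.3 * blank_sd / slope, 10.0 * blank_sd / slope

# Hypothetical values: blank SD in Diffractive Intensity units, slope in
# Diffractive Intensity per (ng/mL).
lod, loq = lod_loq(blank_sd=0.001, slope=0.004)   # about 0.825 and 2.5 ng/mL
```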
Procedia PDF Downloads 355
26307 Domain-Specific Ontology-Based Knowledge Extraction Using R-GNN and Large Language Models
Authors: Andrey Khalov
Abstract:
The rapid proliferation of unstructured data in IT infrastructure management demands innovative approaches to extracting actionable knowledge. This paper presents a framework for ontology-based knowledge extraction that combines relational graph neural networks (R-GNNs) with large language models (LLMs). The proposed method uses the DOLCE framework as the foundational ontology, extending it with concepts from ITSMO for domain-specific applications in IT service management and outsourcing. A key component of this research is the use of transformer-based models, such as DeBERTa-v3-large, for automatic entity and relationship extraction from unstructured text. Furthermore, the paper explores how transfer learning can be applied to fine-tune a large language model (LLaMA) to generate synthetic datasets that improve precision in BERT-based entity recognition and ontology alignment. The resulting IT Ontology (ITO) serves as a comprehensive knowledge base that integrates domain-specific insights from ITIL processes, enabling more efficient decision-making. Experimental results demonstrate significant improvements in knowledge extraction and relationship mapping, offering a cutting-edge solution for enhancing cognitive computing in IT service environments.
Keywords: ontology mapping, R-GNN, knowledge extraction, large language models, NER, knowledge graph
Procedia PDF Downloads 22
26306 COVID-19, the Black Lives Matter Movement, and Race-Based Traumatic Stress
Authors: Claire Stafford, John Lewis, Ashley Stripling
Abstract:
The aim of this study is to examine both the independent effects of, and the intersection between, COVID-19 and the Black Lives Matter (BLM) movement, to investigate how the two events have jointly impacted race-based traumatic stress in Black Americans. Four groups will be surveyed: Black Americans who participated in BLM-related activism, Black Americans who did not, White Americans who participated in BLM-related activism, and White Americans who did not. Participants are between the ages of 30 and 50. All participants will be administered a Brief Trauma Questionnaire with an additional question asking whether they have ever tested positive for COVID-19. Based on prior findings, it is expected that Black Americans will have significantly higher rates of COVID-19 contraction, with Black Americans who participated in BLM-related activism having the highest rates. Additionally, Black Americans who participated in BLM-related activism will likely have the highest self-reported rates of traumatic experiences due to the compounding effect of both the pandemic and the BLM movement. With the development of the COVID-19 pandemic, stark racial disparities between Black and White Americans have become more defined: compared to White Americans, Black Americans have more COVID-19-related cases and hospitalizations. Researchers must investigate and attempt to mitigate these disparities while critically questioning the structure of our national health care system and how it serves our marginalized communities. Further, a critical gaze must be directed at the geopolitical climate of the United States in order to look holistically at how both the COVID-19 pandemic and the Black Lives Matter movement have interacted and impacted race-based stress and trauma in African Americans.
Keywords: COVID-19, black lives matter movement, race-based traumatic stress, activism
Procedia PDF Downloads 102
26305 An Evaluation of the Efficacy of School-Based Suicide Prevention Programs
Authors: S. Wietrzychowski
Abstract:
The following review identifies specific programs, as well as the elements of those programs, that have been shown to be most effective in preventing suicide in schools. Suicide is an issue that affects many students each year; although it is a prominent issue, few prevention programs are used within schools. The primary objective of most prevention programs is to reduce risk factors such as depression and hopelessness and to increase protective factors such as support systems and help-seeking behaviors. Most programs include a gatekeeper training model, an education component, a peer support group, and/or counseling/treatment. Research shows that some of these programs, like Signs of Suicide and the Youth Aware of Mental Health Programme, are effective in reducing suicidal behaviors and increasing protective factors. These programs have been implemented in many countries across the world and have shown promising results. Since schools provide easy access to adolescents, can implement education programs, and can train staff members and students to identify and report suicidal behaviors, school-based programs seem to be the best way to prevent suicide among adolescents. Early intervention may be an effective way to prevent suicide; where early intervention is not an option, school-based programs in high schools have also been shown to decrease suicide attempts by up to 50%. As a result of this presentation, participants will be able to 1) list at least 2 evidence-based suicide prevention programs, 2) identify at least 3 factors that protect against suicide, and 3) describe at least 3 risk factors for suicide.
Keywords: school, suicide, prevention, programs
Procedia PDF Downloads 348
26304 The Evaluation of the Performance of CaCO3/Polymer Nano-Composites for the Preservation of Historic Limestone Monuments
Authors: Mohammed Badereldien, Rezk Diab, Mohamoud Ali, Ayman Aboelkassem
Abstract:
The stone surfaces of the historical architectural heritage in Egypt are under threat from various environmental factors such as temperature fluctuation, humidity, pollution, and microbes. Due to these factors, building facades are deteriorating, with deformation and disfiguration of the external decoration and the frequent formation of black accretions on the stonework. The aim of this study is to evaluate the effectiveness of CaCO₃ nanoparticles as a consolidation and protection material for calcareous stone monuments. Selected tests were carried out to estimate the superficial consolidating and protective effect of the treatment, in which the nanoparticles were applied dispersed in an acrylic copolymer of poly(ethyl methacrylate) (EMA)/methyl acrylate (MA) (70/30, respectively). The CaCO₃ nanoparticle/polymer nanocomposite was synthesized by in situ emulsion polymerization. The consolidation and protection were characterized by TEM, while the penetration depth, the re-aggregating effects of the deposited phase, and the surface morphology before and after treatment were examined by SEM (Scanning Electron Microscopy). Improvement of the stones' mechanical properties was evaluated by compressive strength tests, changes in water-interaction properties were evaluated by capillary water absorption measurements, and the optical appearance was evaluated by colorimetric measurements. Together the results demonstrate that the CaCO₃/polymer nanocomposite is an efficient material for the consolidation of limestone architecture and monuments. Compared with samples treated with the pure acrylic copolymer without calcium carbonate nanoparticles, the CaCO₃ nanoparticles are completely compatible, strengthening the limestone against thermal aging and improving its mechanical properties.
Keywords: calcium carbonate nanoparticles, consolidation, nanocomposites, calcareous stone, colorimetric measurements, compressive strength
Procedia PDF Downloads 137
26303 A New Aggregation Operator for Trapezoidal Fuzzy Numbers Based on the Geometric Means of the Left and Right Line Slopes
Authors: Manju Pandey, Nilay Khare, S. C. Shrivastava
Abstract:
This paper is the final in a series defining two new classes of aggregation operators for triangular and trapezoidal fuzzy numbers, based on the geometrical characteristics of their fuzzy membership functions. In the present paper, a new aggregation operator for trapezoidal fuzzy numbers is defined, based on the geometric mean of the slopes of the membership lines to the left and right of the maximum-possibility interval. The operator is defined and the analytical relationships are derived, and computation of the aggregate is demonstrated with a numerical example. The corresponding arithmetic and geometric aggregates, as well as results from the authors' recent work on TrFN aggregates, are also computed.
Keywords: LR fuzzy number, interval fuzzy number, triangular fuzzy number, trapezoidal fuzzy number, apex angle, left apex angle, right apex angle, aggregation operator, arithmetic and geometric mean
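Since the operator's exact analytical form is given in the paper itself, the sketch below is only one plausible reading of the abstract: average the maximum-possibility interval [b, c] of each TrFN (a, b, c, d) and combine the left/right membership-line slopes by their geometric means. Treat it as illustrative, not the authors' definition.

```python
import math

def aggregate_trfn(trfns):
    """Aggregate TrFNs (a, b, c, d) with a < b <= c < d: arithmetic mean of
    the core [b, c], geometric mean of the left and right line slopes."""
    n = len(trfns)
    b_bar = sum(t[1] for t in trfns) / n
    c_bar = sum(t[2] for t in trfns) / n
    # left slope 1/(b-a), right slope 1/(d-c); combine via geometric mean
    gm_left = math.exp(sum(math.log(1.0 / (t[1] - t[0])) for t in trfns) / n)
    gm_right = math.exp(sum(math.log(1.0 / (t[3] - t[2])) for t in trfns) / n)
    return (b_bar - 1.0 / gm_left, b_bar, c_bar, c_bar + 1.0 / gm_right)

agg = aggregate_trfn([(1.0, 2.0, 3.0, 5.0), (1.0, 2.0, 3.0, 5.0)])
```

Aggregating identical TrFNs returns the same TrFN, a sanity property any such operator should satisfy.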
Procedia PDF Downloads 477
26302 Integrating Sustainable Development Goals in Teaching Mathematics Using Project Based Learning
Authors: S. Goel
Abstract:
In the current scenario, education should be realistic and nature-friendly. The earlier definition of education was restricted to the holistic development of the child, helping children increase their capacities and contributing to social upliftment. But such a definition reflects a more individualistic aim of education, and because of that individualistic aim, we have become disconnected from nature. A school should instead be a place that provides students with an area to explore: they should get practical learning, or learning from nature, as propounded by Rousseau in the mid-eighteenth century. Integrating the Sustainable Development Goals into the school curriculum makes it possible to connect nature with the lives of children in the classroom, so that students become more aware of and sensitive to their social and natural surroundings. This research examines the efficiency of project-based learning in mathematics for creating awareness of the Sustainable Development Goals. The major finding was that students are initially not very aware of the Sustainable Development Goals, but when given time and an appropriate learning environment, they can be made aware of them. In this research, project-based learning was used to build that awareness, and students were given a pre-test and a post-test to analyze their performance. After the intervention, the post-test results showed that mathematics projects can create awareness of the Sustainable Development Goals.
Keywords: holistic development, natural learning, project based learning, sustainable development goals
Procedia PDF Downloads 181
26301 Highly Sensitive Fiber-Optic Curvature Sensor Based on Four Mode Fiber
Authors: Qihang Zeng, Wei Xu, Ying Shen, Changyuan Yu
Abstract:
In this paper, a highly sensitive fiber-optic curvature sensor based on four-mode fiber (FMF) is presented and investigated. The proposed sensing structure is constructed by fusing a section of FMF between two standard single-mode fibers (SMFs) through two no-core fibers (NCFs), i.e., an SMF-NCF-FMF-NCF-SMF structure is fabricated. Each NCF is very short, about 1 millimeter, and acts to excite/recouple the light from/into the core of the SMF, while the FMF is 3 centimeters long and supports four eigenmodes: LP₀₁, LP₁₁, LP₂₁ and LP₀₂. Higher-order core modes in the FMF can be effectively stimulated owing to the mismatched mode field distribution, and the main sensing principle is based on analysis of the modal interferometer spectrum. Different curvatures induce different strains on the FMF, affecting the modal excitation and resulting in spectral shifts. One can obtain the curvature value by tracking the wavelength shift. Experiments have been carried out to assess the sensing performance: the sensitivity is about 7.8 nm/m⁻¹ within a range of 1.90 m⁻¹~3.18 m⁻¹.
Keywords: curvature, four mode fiber, highly sensitive, modal interferometer
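Given the reported linear sensitivity, the curvature change implied by a tracked wavelength shift can be estimated as in this small sketch; the helper name and the 10 nm example are illustrative assumptions.

```python
# Estimate curvature change from the tracked spectral shift, assuming the
# reported linear sensitivity of ~7.8 nm per m^-1 holds over the stated range.
SENSITIVITY_NM_PER_INV_M = 7.8  # nm / m^-1, from the experiment above

def curvature_change(wavelength_shift_nm: float) -> float:
    """Curvature change (m^-1) implied by a wavelength shift (nm)."""
    return wavelength_shift_nm / SENSITIVITY_NM_PER_INV_M

print(round(curvature_change(10.0), 2))  # a 10 nm shift corresponds to ~1.28 m^-1
```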
Procedia PDF Downloads 194
26300 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing
Authors: Arjun Kumar Rath, Titus Dhanasingh
Abstract:
Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing in larger quantities. As components get integrated, devices are tested for their full functionality using advanced software tools. Benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, these tests are custom-built for every product and remain unusable for other variants. A majority of the tests go undocumented, are not updated, and become unusable once the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks and tools (Fuego, LAVA, AutoTest, KernelCI, etc.) designed for testing embedded system devices, each with several unique good features, but no single tool or framework satisfies all of the testing needs of embedded systems; hence the need for an extensible framework integrating a multitude of tools. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional test methods involve developing test libraries and support components for every new hardware platform that belongs to the same domain with identical hardware architecture. This approach has drawbacks such as non-reusability, where platform-specific libraries cannot be reused; the need to maintain source infrastructure for individual hardware platforms; and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges in test environment setup, scalability, and maintenance.
A desirable strategy is certainly one that is focused on maximizing reusability, continuous integration, and leveraging artifacts across the complete development cycle, during phases of testing and across a family of products. To overcome the stated challenges of the conventional method and deliver the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, is designed, which can be deployed in embedded-system-related products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing different hardware, including microprocessors and microcontrollers. It offers benefits such as (1) time-to-market: accelerates board bring-up time with prepackaged test suites supporting all necessary peripherals, which can speed up the design and development stages (board bring-up, manufacturing, and device drivers); (2) reusability: framework components isolated from the platform-specific hardware initialization and configuration make the adaptation of test cases across various platforms quick and simple; (3) an effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework; (4) continuous integration: pre-integrated with Jenkins, which enables continuous testing and an automated software update feature. Applying the embedded test framework accelerator throughout the design and development phase enables the development of well-tested systems before functional verification and improves time to market to a large extent.
Keywords: board diagnostics software, embedded system, hardware testing, test frameworks
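The reusability idea, framework components isolated from platform-specific initialization, can be sketched roughly as below. This is an illustrative pattern only, not the ETF's actual API; all class, function, and test names are assumptions.

```python
from abc import ABC, abstractmethod

# A reusable test registry: test bodies depend only on the Platform
# interface, so only the platform class changes per board.
class Platform(ABC):
    @abstractmethod
    def init_uart(self) -> bool: ...

class DemoBoard(Platform):
    def init_uart(self) -> bool:
        return True  # a real port would program board-specific registers

TESTS = {}

def register(name):
    def wrap(fn):
        TESTS[name] = fn
        return fn
    return wrap

@register("uart_smoke")
def uart_smoke(platform: Platform) -> bool:
    # Board-agnostic test body: no platform details leak in here.
    return platform.init_uart()

results = {name: fn(DemoBoard()) for name, fn in TESTS.items()}
print(results)  # {'uart_smoke': True}
```

Porting the suite to a new board then means writing one new Platform subclass, not re-developing the test cases.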
Procedia PDF Downloads 149
26299 Outcome-Based Education as Mediator of the Effect of Blended Learning on the Student Performance in Statistics
Authors: Restituto I. Rodelas
Abstract:
Higher education has adopted outcomes-based education from K-12. In this approach, the teacher uses any teaching and learning strategies that enable the students to achieve the learning outcomes. The students may be required to exert more effort and figure things out on their own. Hence, outcomes-based students are assumed to be more responsible and more capable of applying the knowledge learned. Another approach that higher education in the Philippines is starting to adopt from other countries is blended learning. This combination of classroom and fully online instruction and learning is expected to be more effective. Participating in the online sessions, however, is entirely up to the students. Thus, the effect of blended learning on the performance of students in Statistics may be mediated by outcomes-based education. If there is a significant positive mediating effect, then blended learning can be optimized by integrating outcomes-based education. In this study, the sample will consist of four blended-learning Statistics classes at Jose Rizal University in the second semester of AY 2015–2016. Two of these classes will be assigned randomly to the experimental group, which will be handled using outcomes-based education. The two classes in the control group will be handled using the traditional lecture approach. Prior to the discussion of the first topic, a pretest will be administered. The same test will be given as a posttest after the last topic is covered. In order to establish the equality of the groups' initial knowledge, a single-factor ANOVA of the pretest scores will be performed. A single-factor ANOVA of the posttest-pretest score differences will also be conducted to compare the performance of the experimental and control groups. When a significant difference is obtained in any of these ANOVAs, post hoc analysis will be done using Tukey's honestly significant difference (HSD) test.
The mediating effect will be evaluated using correlation and regression analyses. The groups' initial knowledge is equal when the result of the pretest-scores ANOVA is not significant. If the result of the score-differences ANOVA is significant and the post hoc test indicates that the classes in the experimental group have significantly different scores from those in the control group, then outcomes-based education has a positive effect. Let blended learning be the independent variable (IV), outcomes-based education be the mediating variable (MV), and the score difference be the dependent variable (DV). There is a mediating effect when the following requirements are satisfied: a significant correlation of IV with DV, a significant correlation of IV with MV, a significant relationship of MV to DV when both IV and MV are predictors in a regression model, and an absolute value of the coefficient of IV as sole predictor that is larger than when both IV and MV are predictors. With a positive mediating effect of outcomes-based education on the effect of blended learning on student performance, it will be recommended to integrate outcomes-based education into blended learning. This will yield the best learning results.
Keywords: outcome-based teaching, blended learning, face-to-face, student-centered
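On a tiny hand-made dataset, the regression requirements listed above can be checked with ordinary least squares. This is a sketch of the coefficient-comparison step only; the data and helper names are illustrative assumptions, not the study's.

```python
import numpy as np

# IV = blended learning (0/1), MV = outcomes-based score, DV = score difference.
iv = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)
e  = np.array([-1, 0, 0, 1, -1, 0, 0, 1], dtype=float)  # balanced noise
mv = 2.0 * iv + e
dv = 1.0 * iv + 1.0 * mv

def coefs(cols, y):
    """OLS coefficients after prepending an intercept column."""
    X = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

c_total = coefs([iv], dv)[1]      # IV -> DV alone (total effect): 3.0
a_path  = coefs([iv], mv)[1]      # IV -> MV: 2.0
c_prime = coefs([iv, mv], dv)[1]  # IV -> DV controlling for MV: 1.0

# Mediation criterion: |coefficient of IV alone| exceeds |IV with MV included|.
print(abs(c_total) > abs(c_prime))  # True
```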
Procedia PDF Downloads 292
26298 An Investigation of Raw Material Effects on Nano SiC Based Foam Glass Production
Authors: Aylin Sahin, Yasemin Kilic, Abdulkadir Sari, Burcu Duymaz, Mustafa Kara
Abstract:
Foam glass is an innovative material composed of glass and carbon/carbonate-based minerals, with a rare combination of properties: light weight, high thermal insulation, and a cellular structure with sufficient rigidity. In the present study, the effects of the glass type and mineral addition on foam glass properties were investigated. Nano-sized SiC was used as the foaming agent in all samples; mixed waste glass and sheet glass were selectively used as glass sources, and Al₂O₃ was optionally used as a mineral additive. These raw material powders were mixed homogeneously, pressed at the same pressure, and sintered with the same schedule. Finally, the obtained samples were characterized with respect to the required properties of a foam glass material, and the optimum results were determined. At the end of the study, values of 0.049 W/mK thermal conductivity, 72% porosity, and 0.21 kg/cm² apparent density with 2.41 MPa compressive strength were achieved using nano-sized SiC, sheet glass, and the Al₂O₃ mineral additive. It can be said that foam glass materials can be preferred as an alternative insulation material over conventional polymer-based insulation materials, since they provide high thermal insulation without containing unhealthy chemicals or posing burn risks.
Keywords: foam glass, foaming, silicon carbide, waste glass
Procedia PDF Downloads 373
26297 A Case for Strategic Landscape Infrastructure: South Essex Estuary Park
Authors: Alexandra Steed
Abstract:
Alexandra Steed URBAN was commissioned to undertake the South Essex Green and Blue Infrastructure Study (SEGBI) on behalf of the Association of South Essex Local Authorities (ASELA): a partnership of seven neighboring councils within the Thames Estuary. Located on London’s doorstep, the 70,000-hectare region is under extraordinary pressure for regeneration, further development, and economic expansion, yet faces extreme challenges: sea-level rise and inadequate flood defenses, stormwater flooding and threatened infrastructure, loss of internationally important habitats, significant existing community deprivation, and lack of connectivity and access to green space. The brief was to embrace these challenges in the creation of a document that would form a key part of ASELA’s Joint Strategic Framework and feed into local plans and master plans, thus helping to tackle climate change, ecological collapse, and social inequity at a regional scale whilst creating a relationship and awareness between urban communities and the surrounding landscapes and nature. The SEGBI project applied a ‘land-based’ methodology, combined with a co-design approach involving numerous stakeholders, to explore how living infrastructure can address these significant issues, reshape future planning and development, and create thriving places for the whole community of life. It comprised three key stages: the Baseline Review; the Green and Blue Infrastructure Assessment; and the final Green and Blue Infrastructure Report. The resulting proposals frame an ambitious vision for the delivery of a new regional South Essex Estuary (SEE) Park – 24,000 hectares of protected and connected landscapes. This unified parkland system will drive effective place-shaping and “leveling up” for the most deprived communities while providing large-scale nature recovery and biodiversity net gain.
Comprehensive analysis and policy recommendations ensure best practices will be embedded within the planning documents and decisions guiding future development. Furthermore, a Natural Capital Account was undertaken as part of the strategy, showing the tremendous economic value of the natural assets. This strategy sets a pioneering precedent that demonstrates how the prioritisation of living infrastructure has the capacity to address climate change and ecological collapse, while also supporting sustainable housing, healthier communities, and resilient infrastructure. It was only achievable through a collaborative and cross-boundary approach to strategic planning and growth, with a shared vision of place and a strong commitment to delivery. With joined-up thinking and a joined-up region, a more impactful plan for South Essex was developed that will lead to numerous environmental, social, and economic benefits across the region, enhancing the landscape and natural environs on the periphery of one of the largest cities in the world.
Keywords: climate change, green and blue infrastructure, landscape architecture, master planning, regional planning, social equity
Procedia PDF Downloads 100
26296 Growth Performance and Nutrient Digestibility of Cirrhinus mrigala Fingerlings Fed on Sunflower Meal Based Diet Supplemented with Phytase
Authors: Syed Makhdoom Hussain, Muhammad Afzal, Farhat Jabeen, Arshad Javid, Tasneem Hameed
Abstract:
A feeding trial was conducted with Cirrhinus mrigala fingerlings to study the effects of microbial phytase at graded levels (0, 500, 1000, 1500, and 2000 FTU kg⁻¹) in a sunflower-meal-based diet on growth performance and nutrient digestibility. Chromic oxide was added to the diets as an indigestible marker. Three replicate groups of 15 fish (average weight 5.98 g fish⁻¹) were fed once a day, and feces were collected twice daily. The results of the present study showed improved growth and feed performance of Cirrhinus mrigala fingerlings in response to phytase supplementation. Maximum growth performance was obtained by the fish fed on test diet III, having a 1000 FTU kg⁻¹ phytase level. Similarly, nutrient digestibility was also significantly increased (p<0.05) by phytase supplementation. Digestibility coefficients for the sunflower-meal-based diet at the 1000 FTU kg⁻¹ level increased by 15.76%, 17.70%, and 12.70% for crude protein, crude fat, and apparent gross energy, respectively, as compared to the reference diet. Again, the maximum response in nutrient digestibility was recorded at the phytase level of 1000 FTU kg⁻¹ of diet. It was concluded that phytase supplementation of a sunflower-meal-based diet at the 1000 FTU kg⁻¹ level is optimum to release adequate chelated nutrients for maximum growth performance of C. mrigala fingerlings. Our results also suggest that phytase supplementation of sunflower-meal-based diets can help in the development of sustainable aquaculture by reducing feed cost and the nutrient discharge through feces into the aquatic ecosystem.
Keywords: sunflower meal, Cirrhinus mrigala, growth, nutrient digestibility, phytase
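With an inert marker such as chromic oxide, apparent digestibility coefficients (ADCs) are conventionally computed from the marker and nutrient concentrations in diet and feces. A minimal sketch of that standard calculation follows; the numeric values are illustrative assumptions, not the trial's data.

```python
# ADC (%) = 100 * [1 - (marker% in diet / marker% in feces)
#                      * (nutrient% in feces / nutrient% in diet)]
def adc_percent(marker_diet, marker_feces, nutrient_diet, nutrient_feces):
    return 100.0 * (1.0 - (marker_diet / marker_feces)
                          * (nutrient_feces / nutrient_diet))

# e.g. 1.0% Cr2O3 in diet, 2.5% in feces; 30% crude protein in diet, 9% in feces
print(round(adc_percent(1.0, 2.5, 30.0, 9.0), 1))  # 88.0
```

Because the marker is indigestible, its enrichment in the feces scales the apparent nutrient loss.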
Procedia PDF Downloads 305
26295 Livestock Activity Monitoring Using Movement Rate Based on Subtract Image
Authors: Keunho Park, Sunghwan Jeong
Abstract:
The 4th Industrial Revolution, the next-generation industrial revolution built on the convergence of information and communication technology (ICT), is no exception for the livestock industry, and various studies are being conducted on applying smart-farm technology to livestock. In order to monitor livestock using sensors, it is necessary to drill holes in organs such as the nose, ears, and even the stomach of the livestock so that the animal can wear, or have inserted, the sensor. This increases the stress of the livestock, which in turn lowers the quality of livestock products, and raises animal-ethics concerns, which have become a major issue in recent years. In this paper, we conducted a study to monitor livestock activity based on vision technology, effectively monitoring livestock activity without increasing animal stress or violating animal ethics. The movement rate was calculated based on the difference images between frames, and the livestock activity was evaluated. As a result, the average F1-score was 96.67.
Keywords: barn monitoring, livestock, machine vision, smart farm
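The movement-rate idea, measuring activity as the fraction of pixels that change between consecutive frames, can be sketched as below; the threshold value and the toy frames are assumptions for illustration.

```python
import numpy as np

def movement_rate(prev_frame: np.ndarray, curr_frame: np.ndarray,
                  threshold: int = 25) -> float:
    """Fraction of pixels whose absolute grayscale change exceeds the threshold."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float(np.mean(diff > threshold))

prev = np.zeros((4, 4), dtype=np.uint8)   # toy 4x4 grayscale frames
curr = prev.copy()
curr[:2, :] = 200                         # the top half of the image changed
print(movement_rate(prev, curr))  # 0.5
```

Casting to a signed type before subtracting avoids unsigned-integer wraparound in the difference image.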
Procedia PDF Downloads 126
26294 Analysis of Cooperative Learning Behavior Based on the Data of Students' Movement
Authors: Wang Lin, Li Zhiqiang
Abstract:
The purpose of this paper is to analyze cooperative learning behavior patterns based on data on students' movement. The study first reviewed cooperative learning theory and its research status, and briefly introduced the k-means clustering algorithm. Then, it used the clustering algorithm and mathematical statistics to analyze the activity rhythm of individual students and groups in different functional areas, according to the movement data provided by 10 first-year graduate students. It also focused on the analysis of students' behavior in the learning area and explored the laws of cooperative learning behavior. The research results showed that the cooperative learning behavior analysis method based on movement data proposed in this paper is feasible. From the results of the data analysis, the behavioral characteristics of students and their cooperative learning behavior patterns could be found.
Keywords: behavior pattern, cooperative learning, data analysis, k-means clustering algorithm
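The k-means step used to group movement data by functional area can be sketched on toy 2-D positions. This is a minimal Lloyd's-algorithm sketch; the coordinates, k, and iteration count are assumptions, not the study's configuration.

```python
import numpy as np

def kmeans(points: np.ndarray, k: int, iters: int = 20, seed: int = 0):
    """Plain Lloyd's algorithm: assign each point to its nearest center,
    then recompute each center as the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

pts = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.1], [5.2, 4.9]])
labels, _ = kmeans(pts, k=2)
# The two nearby points share a cluster, separate from the two distant ones.
print(labels[0] == labels[1] and labels[2] == labels[3] and labels[0] != labels[2])  # True
```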
Procedia PDF Downloads 189
26293 Towards Competence-Based Regulatory Sciences Education in Sub-Saharan Africa: Identification of Competencies
Authors: Abigail Ekeigwe, Bethany McGowan, Loran C. Parker, Stephen Byrn, Kari L. Clase
Abstract:
There are growing calls in the literature to develop and implement competency-based regulatory sciences education (CBRSE) in sub-Saharan Africa to expand and create a pipeline of a competent workforce of regulatory scientists. A defined competence framework is an essential component in developing competency-based education; however, such a competence framework is not available for regulatory scientists in sub-Saharan Africa. The purpose of this research is to identify entry-level competencies for inclusion in a competency framework for regulatory scientists in sub-Saharan Africa as a first step in developing CBRSE. The team systematically reviewed the literature following the PRISMA guidelines for systematic reviews, based on a pre-registered protocol on the Open Science Framework (OSF). The protocol specifies the search strategy and the inclusion and exclusion criteria for publications. All included publications were coded to identify entry-level competencies for regulatory scientists. The team deductively coded the publications included in the study using the 'framework synthesis' model for systematic literature review. The World Health Organization's conceptualization of competence guided the review and thematic synthesis. Topic and thematic coding were done using NVivo 12™ software. Based on the search strategy in the protocol, 2,345 publications were retrieved. Twenty-two (n=22) of the retrieved publications met all the inclusion criteria for the research. Topic and thematic coding of the publications yielded three main domains of competence: knowledge, skills, and enabling behaviors. The knowledge domain has three sub-domains: administrative, regulatory governance/framework, and scientific knowledge. The skills domain has two sub-domains: functional and technical skills. Identification of competencies is the first step and the bedrock of curriculum development and competency-based education.
The competencies identified in this research will help policymakers, educators, institutions, and international development partners design and implement competence-based regulatory science education in sub-Saharan Africa, ultimately leading to access to safe, quality, and effective medical products.
Keywords: competence-based regulatory science education, competencies, systematic review, sub-Saharan Africa
Procedia PDF Downloads 198