Search results for: language learning model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 23777

8597 Comparing Two Interventions for Teaching Math to Pre-School Students with Autism

Authors: Hui Fang Huang Su, Jia Borror

Abstract:

This study compared two interventions for teaching math to preschool-aged students with autism spectrum disorder (ASD). The first, considered the business-as-usual (BAU) intervention, uses the Strategies for Teaching Based on Autism Research (STAR) curriculum and discrete trial teaching as the instructional methodology. The second is the Math is Not Difficult (Project MIND) activity-embedded, naturalistic intervention. The interventions were randomly assigned to four preschool classrooms of students with ASD; Project MIND was implemented over three months, and measurements gathered during the same three months were used for the STAR intervention. A quasi-experimental, pre-test/post-test design was used to compare the effectiveness of the two interventions in building mathematical knowledge and skills. The pre-post measures include three standardized instruments: the Test of Early Math Ability-3, the Problem Solving and Calculation subtests of the Woodcock-Johnson Test of Achievement IV, and the Bracken Test of Basic Concepts-3 Receptive. The STAR curriculum-based assessment is administered to all Baudhuin students three times per year, and its results were used in this study. We anticipated that implementing either approach would improve the mathematical knowledge and skills of children with ASD; the crucial question is whether a behavioral or a naturalistic teaching approach leads to more significant gains.

Keywords: early learning, autism, math for pre-schoolers, special education, teaching strategies

Procedia PDF Downloads 160
8596 Estimation of Break Points of Housing Price Growth Rate for Top MSAs in Texas Area

Authors: Hui Wu, Ye Li

Abstract:

Applying the structural break estimation method proposed by Bai and Perron (1998) to the housing price growth rates of the top five MSAs in the Texas area, this paper estimates the structural break dates for the growth rate of the housing price index. As the estimation results show, the break dates differ considerably across regions, which indicates the heterogeneity of the housing market in its response to macroeconomic conditions.
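A single-break simplification of the Bai-Perron idea can be sketched as follows: choose the break date that minimizes the total sum of squared residuals of the two segment means. The growth-rate series below is synthetic, not the paper's MSA data:

```python
import numpy as np

def estimate_break(y):
    """Estimate a single structural break date by minimizing the total
    sum of squared residuals (SSR) over all candidate break points --
    a simplified one-break version of the Bai-Perron procedure."""
    n = len(y)
    best_ssr, best_k = np.inf, None
    # require a minimum segment length so each regime mean is identifiable
    for k in range(5, n - 5):
        ssr = (np.sum((y[:k] - y[:k].mean()) ** 2)
               + np.sum((y[k:] - y[k:].mean()) ** 2))
        if ssr < best_ssr:
            best_ssr, best_k = ssr, k
    return best_k

# synthetic growth-rate series with a mean shift at t = 60
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.02, 0.01, 60), rng.normal(0.08, 0.01, 40)])
print(estimate_break(y))  # recovers a break date at (or very near) 60
```

The full Bai-Perron method additionally tests multiple breaks and supplies critical values; the SSR grid search above is only the core estimation step.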

Keywords: structural break, housing prices index, ADF test, linear model

Procedia PDF Downloads 139
8595 Comparative Analysis of Feature Extraction and Classification Techniques

Authors: R. L. Ujjwal, Abhishek Jain

Abstract:

In the field of computer vision, most facial variations, such as identity, expression, emotion, and gender, have been extensively studied, while automatic age estimation has rarely been explored. As a human ages, the features of the face change. This paper provides a new comparative study of feature extraction techniques (hybrid features using the Haar cascade and HOG features) and classification techniques (KNN and SVM) on a training dataset. Using these algorithms, we aim to identify the best-performing classification algorithm; likewise, on the feature extraction side, we compare features extracted with the Haar cascade and HOG. This work is carried out in the context of an age-group classification model.
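A minimal numpy sketch of the KNN classification stage compared above; the Haar-cascade/HOG step is assumed to have already produced fixed-length feature vectors, and the toy vectors and age-group labels below are hypothetical:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test vector by majority vote among its k nearest
    training vectors (Euclidean distance) -- the KNN stage of the
    age-group pipeline, with features assumed already extracted."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)      # distances to all training vectors
        nearest = y_train[np.argsort(d)[:k]]          # labels of the k closest
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])         # majority vote
    return np.array(preds)

# toy "feature vectors" for two age groups (hypothetical data)
X_train = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([[0.15, 0.15], [0.85, 0.85]])))
```

In practice the SVM comparison and the Haar/HOG extraction would be done with a library such as scikit-learn and OpenCV; the sketch only shows the voting logic being compared.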

Keywords: computer vision, age group, face detection

Procedia PDF Downloads 361
8594 Multimodal Integration of EEG, fMRI and Positron Emission Tomography Data Using Principal Component Analysis for Prognosis in Coma Patients

Authors: Denis Jordan, Daniel Golkowski, Mathias Lukas, Katharina Merz, Caroline Mlynarcik, Max Maurer, Valentin Riedl, Stefan Foerster, Eberhard F. Kochs, Andreas Bender, Ruediger Ilg

Abstract:

Introduction: So far, clinical assessments that rely on behavioral responses to differentiate coma states or even predict outcome in coma patients are unreliable, e.g., because of some patients’ motor disabilities. The present study aimed to provide prognosis in coma patients using markers from the electroencephalogram (EEG), blood oxygen level dependent (BOLD) functional magnetic resonance imaging (fMRI), and [18F]-fluorodeoxyglucose (FDG) positron emission tomography (PET). Unsupervised principal component analysis (PCA) was used for multimodal integration of the markers. Methods: With the approval of the local ethics committee of the Technical University of Munich (Germany), 20 patients (aged 18-89) with severe brain damage were recruited through intensive care units at the Klinikum rechts der Isar in Munich and at the Therapiezentrum Burgau (Germany). On the day of the EEG/fMRI/PET measurement (date I), patients (<3.5 months in coma) were grouped into the minimally conscious state (MCS) or vegetative state (VS) on the basis of their clinical presentation (coma recovery scale-revised, CRS-R). Follow-up assessment (date II) was also based on the CRS-R, in a period of 8 to 24 months after date I. At date I, 63-channel EEG (Brain Products, Gilching, Germany) was recorded outside the scanner, and subsequently simultaneous FDG-PET/fMRI was acquired on an integrated Siemens Biograph mMR 3T scanner (Siemens Healthineers, Erlangen, Germany). Power spectral densities, permutation entropy (PE), and symbolic transfer entropy (STE) were calculated in/between frontal, temporal, parietal, and occipital EEG channels. PE and STE are based on symbolic time series analysis and have already been introduced as robust markers separating wakefulness from unconsciousness in EEG during general anesthesia.
While PE quantifies the regularity structure of the neighboring order of signal values (a surrogate of cortical information processing), STE reflects information transfer between two signals (a surrogate of directed connectivity in cortical networks). fMRI was processed using SPM12 (Wellcome Trust Centre for Neuroimaging, University College London, UK). Functional images were realigned, segmented, normalized, and smoothed. PET was acquired for 45 minutes in list mode. For absolute quantification of the brain’s glucose consumption rate in FDG-PET, kinetic modelling was performed with Patlak’s plot method. BOLD signal intensity in fMRI and glucose uptake in PET were calculated in 8 distinct cortical areas. PCA was performed over all markers from EEG/fMRI/PET. Prognosis (persistent VS and deceased patients vs. recovery to MCS/awake from date I to date II) was evaluated using the area under the curve (AUC), including bootstrap confidence intervals (CI, *: p<0.05). Results: Prognosis was reliably indicated by the first component of the PCA (AUC=0.99*, CI=0.92-1.00), which showed a higher AUC than the best single markers (EEG: AUC<0.96*, fMRI: AUC<0.86*, PET: AUC<0.60). The CRS-R did not show prediction (AUC=0.51, CI=0.29-0.78). Conclusion: In a multimodal analysis of EEG/fMRI/PET in coma patients, PCA led to a reliable prognosis. The impact of this result is evident, as clinical estimates of prognosis are at times inapt and could be supported by quantitative biomarkers from EEG, fMRI, and PET. Due to the small sample size, further investigations are required, in particular allowing supervised learning instead of the basic approach of unsupervised PCA.
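The permutation entropy marker described above can be sketched in a few lines: count the ordinal patterns of short windows and take the normalized Shannon entropy of their distribution. The signals below are synthetic, not the study's EEG data:

```python
import math
import numpy as np

def permutation_entropy(x, order=3, normalize=True):
    """Permutation entropy: Shannon entropy of the ordinal patterns of
    length `order` occurring in the signal -- a surrogate of signal
    regularity, as used for the EEG markers above."""
    counts = {}
    for i in range(len(x) - order + 1):
        pattern = tuple(np.argsort(x[i:i + order]))  # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    h = -np.sum(p * np.log2(p))
    # normalize by log2(order!) so a white-noise signal approaches 1
    return h / np.log2(math.factorial(order)) if normalize else h

ramp = np.arange(100, dtype=float)                    # perfectly regular signal
noise = np.random.default_rng(1).normal(size=5000)    # white noise
print(permutation_entropy(ramp))   # minimal: only one ordinal pattern occurs
print(permutation_entropy(noise))  # close to 1 for white noise
```

Symbolic transfer entropy extends the same ordinal-pattern idea to pairs of signals; it is omitted here for brevity.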

Keywords: coma states and prognosis, electroencephalogram, entropy, functional magnetic resonance imaging, machine learning, positron emission tomography, principal component analysis

Procedia PDF Downloads 333
8593 A Study on Finite Element Modelling of Earth Retaining Wall Anchored by Deadman Anchor

Authors: K. S. Chai, S. H. Chan

Abstract:

In this paper, an earth retaining wall anchored by discrete deadman anchors to support excavations in sand is modelled and analysed by finite element analysis. A study is conducted to examine how the deadman anchorage system helps reduce the deflection of the earth retaining wall. A simplified numerical model is suggested in order to reduce the simulation duration. A comparison between 3-D and 2-D finite element analyses is illustrated.

Keywords: finite element, earth retaining wall, deadman anchor, sand

Procedia PDF Downloads 474
8592 Multiscale Modeling of Damage in Textile Composites

Authors: Jaan-Willem Simon, Bertram Stier, Brett Bednarcyk, Evan Pineda, Stefanie Reese

Abstract:

Textile composites, in which the reinforcing fibers are woven or braided, have become very popular in numerous applications in aerospace, automotive, and maritime industry. These textile composites are advantageous due to their ease of manufacture, damage tolerance, and relatively low cost. However, physics-based modeling of the mechanical behavior of textile composites is challenging. Compared to their unidirectional counterparts, textile composites introduce additional geometric complexities, which cause significant local stress and strain concentrations. Since these internal concentrations are primary drivers of nonlinearity, damage, and failure within textile composites, they must be taken into account in order for the models to be predictive. The macro-scale approach to modeling textile-reinforced composites treats the whole composite as an effective, homogenized material. This approach is very computationally efficient, but it cannot be considered predictive beyond the elastic regime because the complex microstructural geometry is not considered. Further, this approach can, at best, offer a phenomenological treatment of nonlinear deformation and failure. In contrast, the mesoscale approach to modeling textile composites explicitly considers the internal geometry of the reinforcing tows, and thus, their interaction, and the effects of their curved paths can be modeled. The tows are treated as effective (homogenized) materials, requiring the use of anisotropic material models to capture their behavior. Finally, the micro-scale approach goes one level lower, modeling the individual filaments that constitute the tows. This paper will compare meso- and micro-scale approaches to modeling the deformation, damage, and failure of textile-reinforced polymer matrix composites. 
For the mesoscale approach, the woven composite architecture will be modeled using the finite element method, and an anisotropic damage model for the tows will be employed to capture the local nonlinear behavior. For the micro-scale, two different models will be used: one based on the finite element method and one that makes use of an embedded semi-analytical approach. The goal is the comparison and evaluation of these approaches to modeling textile-reinforced composites in terms of accuracy, efficiency, and utility.

Keywords: multiscale modeling, continuum damage model, damage interaction, textile composites

Procedia PDF Downloads 346
8591 Estimating CO₂ Storage Capacity under Geological Uncertainty Using 3D Geological Modeling of Unconventional Reservoir Rocks in Block Nv32, Shenvsi Oilfield, China

Authors: Ayman Mutahar Alrassas, Shaoran Ren, Renyuan Ren, Hung Vo Thanh, Mohammed Hail Hakimi, Zhenliang Guan

Abstract:

The significant effect of CO₂ on the global climate and the environment has gained increasing concern worldwide. Enhanced oil recovery (EOR) associated with sequestration of CO₂, particularly into depleted oil reservoirs, is considered a viable approach under financial limitations, since it improves oil recovery from existing reservoirs and strengthens the link between global-scale CO₂ capture and geological sequestration. Consequently, practical measurements are required to attain large-scale CO₂ emission reduction. This paper presents an integrated modeling workflow to construct an accurate 3D reservoir geological model and estimate the CO₂ storage capacity under geological uncertainty in an unconventional oil reservoir of the Paleogene Shahejie Formation (Es1) in block Nv32, Shenvsi oilfield, China. In this regard, geophysical data, including well logs from twenty-two well locations and seismic data, were combined with geological and engineering data and used to construct the 3D reservoir geological model. The geological modeling focused on four tight reservoir units of the Shahejie Formation (Es1-x1, Es1-x2, Es1-x3, and Es1-x4). The validated 3D reservoir models were subsequently used to calculate the theoretical CO₂ storage capacity in block Nv32, Shenvsi oilfield. Well logs were utilized to predict petrophysical properties such as porosity, permeability, and lithofacies, and indicate that the Es1 reservoir units are mainly sandstone, shale, and limestone with proportions of 38.09%, 32.42%, and 29.49%, respectively. Well log-based petrophysical results also show that the Es1 reservoir units generally exhibit 2–36% porosity, 0.017 mD to 974.8 mD permeability, and moderate to good net-to-gross ratios.
These estimated values of porosity, permeability, lithofacies, and net-to-gross ratio were upscaled and distributed laterally using the Sequential Gaussian Simulation (SGS) and Sequential Indicator Simulation (SIS) methods to generate the 3D reservoir geological models. The reservoir geological models show lateral heterogeneities of the reservoir properties and lithofacies, and the best reservoir rocks exist in the Es1-x4, Es1-x3, and Es1-x2 units, respectively. In addition, the reservoir volumetrics of the Es1 units in block Nv32 were also estimated based on the petrophysical property models and found to be between 0.554368
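A theoretical storage estimate like the one above commonly reduces to the volumetric relation M = A·h·φ·(1−Sw)·ρ_CO₂·E. The sketch below assumes that standard relation; all numeric inputs are illustrative assumptions, not figures from the study:

```python
def co2_storage_capacity(area_m2, thickness_m, porosity, sw, rho_co2, eff):
    """Theoretical CO2 storage mass (kg) from the standard volumetric
    relation M = A * h * phi * (1 - Sw) * rho_CO2 * E, where E is a
    storage efficiency factor. All inputs here are assumptions."""
    return area_m2 * thickness_m * porosity * (1.0 - sw) * rho_co2 * eff

# hypothetical reservoir-unit values (not from the paper)
mass_kg = co2_storage_capacity(
    area_m2=5.0e6,   # 5 km^2 areal extent
    thickness_m=20.0,
    porosity=0.15,
    sw=0.4,          # water saturation
    rho_co2=650.0,   # kg/m^3 at assumed reservoir conditions
    eff=0.04,        # storage efficiency factor
)
print(f"{mass_kg / 1e9:.2f} Mt CO2")
```

In the workflow above, porosity and net-to-gross would come per cell from the SGS/SIS realizations, so the capacity is summed over the grid rather than computed from single averages.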

Keywords: CO₂ storage capacity, 3D geological model, geological uncertainty, unconventional oil reservoir, block Nv32

Procedia PDF Downloads 168
8590 Breast Cancer Risk is Predicted Using Fuzzy Logic in MATLAB Environment

Authors: S. Valarmathi, P. B. Harathi, R. Sridhar, S. Balasubramanian

Abstract:

The use of machine learning tools in medical diagnosis is increasing due to the improved effectiveness of classification and recognition systems that help medical experts diagnose breast cancer. In this study, ID3 chooses the splitting attribute with the highest information gain, where gain is defined as the reduction in entropy from before the split to after it. It is applied to age, location, taluk, stage, year, period, marital status, treatment, heredity, sex, and habitat against Very Serious (VS), Very Serious Moderate (VSM), Serious (S), and Not Serious (NS) to calculate the information gain. The ranked histogram gives the gain of each field for the breast cancer data. Doctors use TNM staging, which decides the risk level of the breast cancer and is an important decision-making field in fuzzy logic for perception-based measurement. The spatial risk area (taluk) of breast cancer is calculated. The results clearly show that Coimbatore (North and South) was found to be a higher-risk region for breast cancer than other areas at the 20% criterion. The weighted value of each taluk was compared with the criterion value and integrated with Map Objects to visualize the results. The ID3 algorithm shows the high breast cancer risk regions in the study area. The study has outlined, discussed, and resolved the algorithms and techniques/methods adopted through a soft computing methodology, namely the ID3 algorithm, for prognostic decision-making on the seriousness of breast cancer.
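The ID3 splitting criterion described above (information gain as the entropy reduction from a split) can be sketched as follows; the records, attributes, and risk labels below are hypothetical:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """ID3 gain: entropy before the split minus the weighted entropy
    of the partitions induced by the chosen attribute."""
    n = len(labels)
    partitions = {}
    for row, lab in zip(rows, labels):
        partitions.setdefault(row[attr_index], []).append(lab)
    after = sum(len(p) / n * entropy(p) for p in partitions.values())
    return entropy(labels) - after

# toy records: (age_band, stage) with hypothetical risk labels
rows = [("young", "I"), ("young", "II"), ("old", "III"), ("old", "III")]
labels = ["NS", "NS", "VS", "VS"]
print(information_gain(rows, labels, 0))  # age splits the labels perfectly -> gain 1.0
```

ID3 would evaluate this gain for every attribute (age, taluk, stage, ...) and split on the maximum, recursing on each partition.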

Keywords: ID3 algorithm, breast cancer, fuzzy logic, MATLAB

Procedia PDF Downloads 512
8589 Detailed Investigation of Thermal Degradation Mechanism and Product Characterization of Co-Pyrolysis of Indian Oil Shale with Rubber Seed Shell

Authors: Bhargav Baruah, Ali Shemsedin Reshad, Pankaj Tiwari

Abstract:

This work presents a detailed study on the thermal degradation kinetics of co-pyrolysis of oil shale of Upper Assam, India with rubber seed shell, and lab-scale pyrolysis to investigate the influence of pyrolysis parameters on product yield and product composition. The physicochemical characteristics of the oil shale and rubber seed shell were studied by proximate analysis, elemental analysis, Fourier transform infrared spectroscopy, and X-ray diffraction. The physicochemical study showed the mixture to be of low moisture, high ash, siliceous, and sour, with the presence of aliphatic, aromatic, and phenolic compounds. The thermal decomposition of the oil shale with rubber seed shell was studied using thermogravimetric analysis at heating rates of 5, 10, 20, 30, and 50 °C/min. The kinetic study of the oil shale pyrolysis process was performed on the thermogravimetric (TGA) data using three model-free isoconversional methods, viz. Friedman, Flynn-Wall-Ozawa (FWO), and Kissinger-Akahira-Sunose (KAS). The reaction mechanisms were determined using the Criado master plot. Understanding the composition of Indian oil shale and rubber seed shell and the pyrolysis process kinetics can help establish the experimental parameters for the extraction of valuable products from the mixture. Response surface methodology (RSM) with a central composite design (CCD) model was employed to set up the lab-scale experiments using the TGA data and to optimize the process parameters, viz. heating rate, temperature, and particle size. The samples were pre-dried at 115 °C for 24 hours prior to pyrolysis. The pyrolysis temperatures were set from 450 to 650 °C, at heating rates of 2 to 20 °C/min. The retention time was set between 2 to 8 hours. The optimum oil yield was observed at 5 °C/min and 550 °C with a retention time of 5 hours.
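The FWO isoconversional step above can be sketched under Doyle's approximation, ln β = const − 1.052·Ea/(R·T) at a fixed conversion, so the slope of ln β versus 1/T across heating rates yields Ea. The data below are synthetic, generated from an assumed Ea, not measurements from the paper:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def fwo_activation_energy(betas, temps_K):
    """Flynn-Wall-Ozawa estimate at a fixed conversion: regress ln(beta)
    on 1/T across heating rates; Ea = -slope * R / 1.052 (Doyle's
    approximation)."""
    slope, _ = np.polyfit(1.0 / np.asarray(temps_K), np.log(np.asarray(betas)), 1)
    return -slope * R / 1.052

# synthetic self-check: temperatures generated from an assumed Ea of 150 kJ/mol
betas = np.array([5.0, 10.0, 20.0, 30.0, 50.0])  # K/min, matching the TGA heating rates above
Ea_assumed = 150e3                               # J/mol (assumption, not a paper result)
temps = 1.052 * Ea_assumed / (R * (30.0 - np.log(betas)))
print(fwo_activation_energy(betas, temps) / 1e3)  # recovers ~150 kJ/mol
```

Repeating the regression at each conversion level gives the Ea-vs-conversion profile that isoconversional analyses report; KAS differs only in the linearization used.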
The pyrolytic oil and gas obtained at optimum conditions were subjected to characterization using Fourier transform infrared spectroscopy (FT-IR) gas chromatography and mass spectrometry (GC-MS) and nuclear magnetic resonance spectroscopy (NMR).

Keywords: Indian oil shale, rubber seed shell, co-pyrolysis, isoconversional methods, gas chromatography, nuclear magnetic resonance, Fourier transform infrared spectroscopy

Procedia PDF Downloads 137
8588 Flood Early Warning and Management System

Authors: Yogesh Kumar Singh, T. S. Murugesh Prabhu, Upasana Dutta, Girishchandra Yendargaye, Rahul Yadav, Rohini Gopinath Kale, Binay Kumar, Manoj Khare

Abstract:

The Indian subcontinent is severely affected by floods that cause intense, irreversible devastation to crops and livelihoods. With increased incidences of floods and their related catastrophes, an Early Warning System for Flood Prediction (EWS-FP) and an efficient Flood Management System for the river basins of India are a must. Accurately modeled hydrological conditions and a web-based early warning system may significantly reduce the economic losses incurred due to floods and enable end users to issue advisories with better lead time. This study describes the design and development of an EWS-FP using advanced computational tools/methods, viz. High-Performance Computing (HPC), remote sensing, GIS technologies, and open-source tools, for the Mahanadi River Basin of India. The flood prediction is based on a robust 2D hydrodynamic model, which solves the shallow water equations using the finite volume method. Considering the complexity of hydrological modeling and the size of the basins in India, it is always a tug of war between better forecast lead time and the optimal resolution at which the simulations are to be run. High-performance computing provides a good computational means to overcome this issue for the construction of national-level or basin-level flash flood warning systems that combine high-resolution, local-level warning analysis with better lead time. High-performance computers with capacities on the order of teraflops and petaflops prove useful when running simulations over such large areas at optimum resolutions. In this study, a free and open-source, HPC-based 2-D hydrodynamic model, with the capability to simulate rainfall run-off, river routing, and tidal forcing, is used. The model was tested for a part of the Mahanadi River Basin (the Mahanadi Delta) with actual and predicted discharge, rainfall, and tide data. The simulation time was reduced from 8 hrs to 3 hrs by increasing the CPU nodes from 45 to 135, which shows good scalability and performance enhancement.
The simulated flood inundation spread and stage were compared with SAR data and CWC observed gauge data, respectively. The system shows good accuracy and better lead time, suitable for flood forecasting in near-real-time. To disseminate warnings to end users, a network-enabled solution is developed using open-source software. The system has query-based flood damage assessment modules with outputs in the form of spatial maps and statistical databases. The system effectively facilitates the management of post-disaster activities caused by floods, such as displaying spatial maps of the affected area and inundated roads, and maintains a steady flow of information at all levels, with different access rights depending upon the criticality of the information. It is designed to help users manage information related to flooding during critical flood seasons and analyze the extent of the damage.
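The reported scaling result (8 hrs to 3 hrs when going from 45 to 135 CPU nodes) corresponds to the following speedup and parallel-efficiency figures:

```python
def scaling(t_old_h, t_new_h, nodes_old, nodes_new):
    """Speedup and parallel efficiency from two run times: speedup is the
    ratio of run times; efficiency divides the speedup by the increase
    in node count (1.0 would be ideal linear scaling)."""
    speedup = t_old_h / t_new_h
    efficiency = speedup / (nodes_new / nodes_old)
    return speedup, efficiency

s, e = scaling(8.0, 3.0, 45, 135)
print(f"speedup {s:.2f}x, efficiency {e:.0%}")  # 2.67x on 3x nodes, ~89% efficient
```

An efficiency near 89% on a 3x node increase is consistent with the "good scalability" claim, since communication overhead in domain-decomposed shallow-water solvers typically prevents perfectly linear scaling.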

Keywords: flood, modeling, HPC, FOSS

Procedia PDF Downloads 87
8587 A Collaborative Application of Six Sigma and Value Engineering in Supply Chain and Logistics

Authors: Arun Raja, Kevin Thomas, Sreyas Tribhu, S. P. Anbuudayasankar

Abstract:

This paper deals with the application of the Six Sigma methodology in supply chain (SC) and logistics. It examines in detail how the SC can be improved and its impact on the organization, and identifies the vital role quality plays in improving SC and logistics. A simulation has been performed using the ARENA software to determine the process efficiency of a bottle manufacturing unit. Further, a Value Stream Mapping (VSM) analysis has been executed on the manufacturing process flow model, and the significant importance of Value Engineering (VE) for quality assurance of the products is also studied.

Keywords: supply chain, six sigma, value engineering, logistics, quality

Procedia PDF Downloads 671
8586 A New Development Pathway and Innovative Solutions through a Food Security System

Authors: Osatuyi Kehinde Micheal

Abstract:

Much research has contributed to an improved understanding of the future of food security, especially during the COVID-19 pandemic. A pathway was developed using a local community kitchen in Muizenberg, Cape Town, Western Cape Province, South Africa, as a case study to map out the future of food security in times of crisis. The kitchen aims to provide nutritious, affordable, plant-based meals to the community. It is also a place of diverse learning, sharing, volunteer empowerment, and growth that supports the local economy and future resilience by sustaining the community kitchen for the community. This document contains an overview of the story of the community kitchen and how it creates self-sustainability as a new development pathway to sustain the community and reduce hunger in the regional food system. The paper describes the key elements of the response to the COVID-19 pandemic: sharing food parcels and creating 13 soup kitchens across the community as an immediate response; supporting agricultural systems by growing home food gardens at different homes; running a conscious dry-goods store to reduce waste; and introducing a local currency as an innovation to ease the food crisis. Insights gained from this work and outreach show how adaptation, transformation, and sustainability can serve as a new development pathway to address future crises in the food security system in our society.

Keywords: sustainability, food security, community development, adaptation, transformation

Procedia PDF Downloads 72
8585 Remote Assessment and Change Detection of GreenLAI of Cotton Crop Using Different Vegetation Indices

Authors: Ganesh B. Shinde, Vijaya B. Musande

Abstract:

Timely identification of the cotton crop has significant advantages for food, economic, and environmental applications. Because of these advantages, the accurate detection of cotton crop regions using a supervised learning procedure is a challenging problem in remote sensing. Classifiers applied directly to the image play a major role here, but the results are not very satisfactory. To further improve effectiveness, a variety of vegetation indices have been proposed in the literature, and the major challenge is to find the better vegetation indices for cotton crop identification through the proposed methodology. Accordingly, fuzzy c-means clustering is combined with a neural network algorithm, trained by Levenberg-Marquardt, for cotton crop classification. To test the proposed method, five LISS-III satellite images were taken, and the experimentation was done with six vegetation indices: Simple Ratio, Normalized Difference Vegetation Index, Enhanced Vegetation Index, Green Atmospherically Resistant Vegetation Index, Wide-Dynamic Range Vegetation Index, and Green Chlorophyll Index. Along with these indices, the Green Leaf Area Index was also considered for investigation. The research outcome shows that the Green Atmospherically Resistant Vegetation Index outperformed all other indices, reaching an average accuracy of 95.21%.
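As a sketch of how such indices are computed from band reflectances: NDVI is shown in its standard form, while the GARI expression below is one common variant and an assumption here (including the γ value), since the formulation varies across the literature. The reflectance values are toy numbers:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def gari(nir, green, blue, red, gamma=1.7):
    """Green Atmospherically Resistant Vegetation Index, in one common
    form (assumed): (NIR - [G - gamma(B - R)]) / (NIR + [G - gamma(B - R)]).
    The blue-red term corrects the green band for atmospheric effects."""
    corr = np.asarray(green, float) - gamma * (np.asarray(blue, float) - np.asarray(red, float))
    return (np.asarray(nir, float) - corr) / (np.asarray(nir, float) + corr)

# toy reflectances for a vegetated pixel and a bare-soil pixel
print(ndvi([0.5, 0.3], [0.08, 0.25]))  # high for vegetation, near zero for soil
```

In the pipeline above, each per-pixel index image would then feed the fuzzy c-means clustering and the Levenberg-Marquardt-trained network.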

Keywords: Fuzzy C-Means clustering (FCM), neural network, Levenberg-Marquardt (LM) algorithm, vegetation indices

Procedia PDF Downloads 309
8584 The Challenges to Information Communication Technology Integration in Mathematics Teaching and Learning

Authors: George Onomah

Abstract:

Background: The integration of information communication technology (ICT) in Mathematics education faces notable challenges, which this study aimed to dissect and understand. Objectives: The primary goal was to assess the internal and external factors affecting the adoption of ICT by in-service Mathematics teachers. Internal factors examined included teachers' pedagogical beliefs, prior teaching experience, attitudes towards computers, and proficiency with technology. External factors included the availability of technological resources, the level of ICT training received, the sufficiency of allocated time for technology use, and the institutional culture within educational environments. Methods: A descriptive survey design was employed to methodically investigate these factors. Data collection was carried out using a five-point Likert scale questionnaire, administered to a carefully selected sample of 100 in-service Mathematics teachers through a combination of purposive and convenience sampling techniques. Findings: Results from multiple regression analysis revealed a significant underutilization of ICT in Mathematics teaching, highlighting a pronounced deficiency in current classroom practices. Recommendations: The findings suggest an urgent need for educational department heads to implement regular and comprehensive ICT training programs aimed at enhancing teachers' technological capabilities and promoting the integration of ICT in Mathematics teaching methodologies.

Keywords: ICT, Mathematics, integration, barriers

Procedia PDF Downloads 35
8583 Investigating the Efficacy of Developing Critical Thinking through Literature Reading

Authors: Julie Chuah Suan Choo

Abstract:

Due to continuous change in the workforce and the demands of the global workplace, many employers have lamented that the majority of university graduates are not prepared in the key areas of employment, such as critical thinking, writing, self-direction, and global knowledge, which are most needed for promotion. Further, critical thinking skills are deemed an integral part of transformational pedagogy, which aims at a more informed society. In addition, literature teaching has recently been advocated for enhancing students’ critical thinking and reasoning. Thus, this study explored the effects of incorporating a few strategies for teaching literature, namely a Shakespeare play, into a course design to enhance these skills. An experiment involving a pretest and posttest using the California Critical Thinking Skills Test (CCTST) was administered to 80 first-year students enrolled in the Bachelor of Arts programme, who were randomly assigned to a control group and an experimental group. For the next 12 weeks, the experimental group was given an intervention that included guided in-class discussion with Socratic questioning, a learning log to detect weaknesses in logical reasoning, presentations, and quizzes. The CCTST results, analysed with a paired t-test in SPSS version 22, indicated significant differences between the two groups. The findings have significant implications for course design as well as for pedagogical practice in using literature to enhance students’ critical thinking skills.
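The paired t-test used on the CCTST scores reduces to the statistic t = mean(d) / (sd(d)/√n) over the per-student pre/post differences d. The scores below are hypothetical, not the study's data:

```python
import math

def paired_t(pre, post):
    """Paired t statistic for pre/post scores: t = mean(d) / (sd(d)/sqrt(n)),
    where d are the per-subject score differences (post - pre).
    Returns the t statistic and its degrees of freedom (n - 1)."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n), n - 1

# hypothetical CCTST scores for five students before/after an intervention
pre = [14, 16, 11, 13, 15]
post = [17, 18, 14, 13, 19]
t, df = paired_t(pre, post)
print(round(t, 2), df)  # t ≈ 3.54 with 4 degrees of freedom
```

SPSS (as used in the study) additionally reports the p-value from the t distribution with n−1 degrees of freedom; the statistic itself is the computation above.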

Keywords: literature teaching, critical thinking, California critical thinking skills test (CCTST), course design

Procedia PDF Downloads 457
8582 A Kernel-Based Method for MicroRNA Precursor Identification

Authors: Bin Liu

Abstract:

MicroRNAs (miRNAs) are small non-coding RNA molecules functioning in the transcriptional and post-transcriptional regulation of gene expression. Discriminating real pre-miRNAs from false ones (such as hairpin sequences with similar stem-loops) is necessary for understanding miRNAs’ role in the control of cell life and death. Because of their small size and sequence specificity, discrimination cannot be based on sequence information alone; it requires structure information about the miRNA precursor to reach satisfactory performance. K-mers are convenient and widely used features for modeling the properties of miRNAs and other biological sequences. However, k-mers suffer from an inherent limitation: if the parameter k is increased to incorporate long-range effects, certain k-mers will appear rarely or not at all; as a consequence, most k-mers are absent and a few are present only once. Thus, statistical learning approaches using k-mers as features become susceptible to noisy data once k becomes large. In this study, we proposed a gapped k-mer approach to overcome these disadvantages and applied the method to the field of miRNA prediction. Combined with the structure status composition, a classifier called imiRNA-GSSC was proposed, and we compare it with the original imiRNA-kmer and alternative approaches. Trained on human miRNA precursors, this predictor achieves an accuracy of 82.34% in predicting 4022 pre-miRNA precursors from eleven species.
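A minimal sketch of gapped k-mer feature extraction as described above: each length-l window contributes all C(l, k) patterns that keep k informative positions and gap the rest, which reduces the sparsity of plain l-mers while retaining longer range. The sequence is a toy example:

```python
from itertools import combinations
from collections import Counter

def gapped_kmer_counts(seq, l=4, k=3):
    """Count gapped k-mers: every length-l window contributes all C(l, k)
    patterns with k informative positions, the rest replaced by '_' gaps."""
    feats = Counter()
    for i in range(len(seq) - l + 1):
        window = seq[i:i + l]
        for keep in combinations(range(l), k):
            pattern = "".join(window[j] if j in keep else "_" for j in range(l))
            feats[pattern] += 1
    return feats

counts = gapped_kmer_counts("GCUAGCUA", l=4, k=3)
print(counts["GC_A"])  # the pattern GC_A occurs in both GCUA windows
```

The resulting counts would be concatenated with structure-derived features (here, the structure status composition) and fed to a support vector machine, as in the classifier described above.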

Keywords: gapped k-mer, imiRNA-GSSC, microRNA precursor, support vector machine

Procedia PDF Downloads 154
8581 Material Supply Mechanisms for Contemporary Assembly Systems

Authors: Rajiv Kumar Srivastava

Abstract:

Manufacturing of complex products such as automobiles and computers requires a very large number of parts and sub-assemblies. The design of mechanisms for the delivery of these materials to the point of assembly is an important manufacturing system and supply chain challenge. Different approaches to this problem have evolved for assembly lines designed to make large volumes of standardized products. However, contemporary assembly systems are required to concurrently produce a variety of products using approaches such as mixed-model production, and at times even mass customization. In this paper we examine the material supply approaches for variety production in moderate to large volumes. The conventional approach for material delivery to high-volume assembly lines is to supply and stock materials line-side. However, for certain materials, especially when the same or similar items are used along the line, it is more convenient to supply materials in kits. Kitting becomes more preferable when lines concurrently produce multiple products in mixed-model mode, since space requirements could increase as product/part variety increases. At times such kits may travel along with the product, while in some situations it may be better to have delivery- and station-specific kits rather than product-based kits. Further, in some mass customization situations it may even be better to have a single delivery and assembly station, to which an entire kit is delivered for fitment, rather than a normal assembly line. Finally, in low-to-moderate volume assembly, such as in engineered machinery, it may be logistically more economical to gather materials in an order-specific kit prior to launching final assembly. We have studied material supply mechanisms to support assembly systems as observed in case studies of firms with different combinations of volume and variety/customization.
It is found that the appropriate approach tends to be a hybrid between direct line supply and different kitting modes, with the best mix being a function of the manufacturing and supply chain environment, as well as space and handling considerations. In our continuing work we are studying these scenarios further, through the use of descriptive models and progressing towards prescriptive models to help achieve the optimal approach, capturing the trade-offs between inventory, material handling, space, and efficient line supply.

Keywords: assembly systems, kitting, material supply, variety production

Procedia PDF Downloads 219
8580 Comparison of Air Quality in 2019 and 2020 in the Campuses of the University of the Basque Country

Authors: Elisabete Alberdi, Irantzu Álvarez, Nerea Astigarraga, Heber Hernández

Abstract:

The purpose of this research is to study the emissions of certain substances that contribute to air pollution and, as far as possible, to find ways to eliminate or reduce them in order to avoid damage to both health and the environment. The work focuses on analyzing and comparing air quality in 2019 and 2020 in the Autonomous Community of the Basque Country, especially near the UPV/EHU campuses. We use geostatistics to develop a spatial model and to analyze pollutant levels in areas where the coverage of the monitoring stations is limited. Finally, more sustainable transport alternatives for users are proposed.
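The kind of spatial estimation described above, inferring pollutant levels at locations the monitoring stations do not cover, can be illustrated with a minimal inverse-distance-weighting sketch. This is a simpler stand-in for the geostatistical model the study builds, and the station coordinates and pollutant values below are hypothetical:

```python
import numpy as np

def idw_interpolate(stations, values, query, power=2.0):
    """Inverse-distance-weighted estimate at a query point.

    stations : (n, 2) array of station coordinates
    values   : (n,) array of measured pollutant levels
    query    : (2,) coordinates of the unmonitored location
    """
    d = np.linalg.norm(stations - query, axis=1)
    if np.any(d == 0):                     # query coincides with a station
        return float(values[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * values) / np.sum(w))

# Hypothetical NO2 readings (ug/m3) at three monitoring stations
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
values = np.array([40.0, 20.0, 30.0])
estimate = idw_interpolate(stations, values, np.array([5.0, 5.0]))
print(estimate)  # all three stations are equidistant here, so this is the mean, 30.0
```

A full geostatistical treatment (e.g. kriging) would additionally model the spatial covariance of the measurements rather than weight by distance alone.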

Keywords: air quality, pollutants, monitoring stations, environment, geostatistics

Procedia PDF Downloads 169
8579 Toward Green Infrastructure Development: Dispute Prevention Mechanisms along the Belt and Road and Beyond

Authors: Shahla Ali

Abstract:

In the context of promoting green infrastructure development, new opportunities are emerging to re-examine sustainable development practices. This paper presents an initial exploration of the development of community-investor dispute prevention and facilitation mechanisms in the context of the Belt and Road Initiative (BRI) spanning Asia, Africa, and Europe. Given the wide-scale impact of China’s multi-jurisdictional development initiative, learning how to coordinate with local communities is vital to realizing inclusive and sustainable growth. In the 20 years since the first multilateral community-investor dispute resolution mechanism was developed by the International Finance Corporation/World Bank, much has been learned about public facilitation, community engagement, and dispute prevention during the early stages of major infrastructure development programs. This paper explores initial findings as they relate to initiatives underway along the BRI within the Asian Infrastructure Investment Bank and the Asian Development Bank. Given the borderless nature of sustainability concerns, insights from diverse regions are critical to deepening understanding of best practices. Drawing on a case-based methodology, this paper explores the achievements, challenges, and lessons learned in community-investor dispute prevention and resolution for major infrastructure projects in the greater China region.

Keywords: law and development, dispute prevention, sustainable development, mitigation

Procedia PDF Downloads 98
8578 High Resolution Image Generation Algorithm for Archaeology Drawings

Authors: Xiaolin Zeng, Lei Cheng, Zhirong Li, Xueping Liu

Abstract:

Current image generation algorithms produce high-resolution archaeology drawings with low accuracy and are susceptible to artifacts from cultural-relic deterioration. To address this, an archaeology drawing generation algorithm based on a conditional generative adversarial network is proposed. An attention mechanism is added to the backbone high-resolution image generation network, which enhances line feature extraction and improves the accuracy of the generated line drawings. A dual-branch parallel architecture consisting of two backbone networks is implemented: the semantic translation branch extracts semantic features from orthophotographs of cultural relics, while the gradient screening branch extracts effective gradient features. Finally, a fusion fine-tuning module combines these two types of features to generate high-quality, high-resolution archaeology drawings. Experimental results on a self-constructed archaeology drawings dataset of grotto temple statues show that the proposed algorithm outperforms current mainstream image generation algorithms in pixel accuracy (PA), structural similarity (SSIM), and peak signal-to-noise ratio (PSNR), and can be used to assist in producing archaeology drawings.
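Of the three metrics reported in this abstract, peak signal-to-noise ratio is the most direct to compute; a minimal sketch (the toy images below are placeholders, not the paper's data):

```python
import numpy as np

def psnr(reference, generated, max_val=255.0):
    """Peak signal-to-noise ratio between two images, in dB (higher is better)."""
    mse = np.mean((reference.astype(np.float64) - generated.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")                # identical images
    return 10.0 * np.log10(max_val**2 / mse)

# Toy 8-bit "drawings": the generated image differs by a constant offset of 5
ref = np.full((64, 64), 128, dtype=np.uint8)
gen = ref + 5
print(round(psnr(ref, gen), 2))  # MSE = 25, so 10*log10(255^2/25) ≈ 34.15
```

SSIM additionally compares local luminance, contrast, and structure over sliding windows, which is why it tracks perceived line quality better than a purely pixel-wise measure like PSNR.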

Keywords: archaeology drawings, digital heritage, image generation, deep learning

Procedia PDF Downloads 45
8577 A Comprehensive Planning Model for Amalgamation of Intensification and Green Infrastructure

Authors: Sara Saboonian, Pierre Filion

Abstract:

The dispersed-suburban model has been the dominant one across North America for the past seventy years, characterized by automobile reliance, low density, and land-use specialization. Two planning models have emerged as possible alternatives to address the ills inflicted by this development pattern. First is intensification, which promotes efficient infrastructure by connecting high-density, multi-functional, and walkable nodes with public transit services within the suburban landscape. Second is green infrastructure, which supports environmental health and human well-being by preserving and restoring ecosystem services. This research studies the incompatibilities between the two alternatives and the possibility of amalgamating them, in an attempt to develop a comprehensive alternative to the suburban model that advocates density, multi-functionality, and transit- and pedestrian-conduciveness, with measures capable of mitigating the adverse environmental impacts of compactness. The research investigates three Canadian urban growth centers where intensification is the current planning practice and awareness of green infrastructure benefits is on the rise. These three centers are contrasted by their development stage, the presence or absence of protected natural land, their environmental approach, and their adverse environmental consequences according to the planning canons of different periods. The methods include reviewing the literature on green infrastructure planning, critiquing the Ontario provincial plans for intensification, surveying residents’ preferences for alternative models, and interviewing officials who deal with local planning for the centers. Moreover, the research draws on the debates between New Urbanism and Landscape/Ecological Urbanism. The case studies expose the difficulties in creating urban growth centers that accommodate green infrastructure while adhering to intensification principles. First, the dominant status of intensification and the obstacles confronting it have monopolized planners’ concerns. Second, the tension between green infrastructure and intensification explains the absence of green infrastructure typologies that correspond to intensification-compatible forms and dynamics. Finally, the lack of highlighted social-economic benefits of green infrastructure reduces residents’ participation. The results also provide insight into the predominating urbanization theories, New Urbanism and Landscape/Ecological Urbanism. Dexterous, context-specific planning is required to understand the political, planning, and ecological dynamics of such a blend. Findings suggest the influence of the following factors on amalgamating intensification and green infrastructure. First, producing ecosystem-services-based justifications for green infrastructure development in the intensification context provides an expert-driven backbone for implementation programs; this knowledge base should then be communicated effectively to the different urban stakeholders. Moreover, given the limited greenfields in intensified areas, the spatial distribution and development of multi-level corridors, such as pedestrian-hospitable settings and transportation networks alongside green infrastructure measures, are required. Finally, to ensure the long-term integrity of implemented green infrastructure measures, significant investment in public engagement and education, as well as clarification of management responsibilities, is essential.

Keywords: ecosystem services, green infrastructure, intensification, planning

Procedia PDF Downloads 348
8576 Literature Review and Evaluation of the Internal Marketing Theory

Authors: Hsiao Hsun Yuan

Abstract:

Internal marketing was proposed in the 1970s, and the concept has changed continually over the past forty years. This study discusses the following themes: the definition and implications of internal marketing, the progress of its development, and the evolution of its theoretical model. Moreover, the study systematically organizes the internal marketing strategies adopted by enterprises and how they have been put into practice. It also compares empirical studies of how the existing theories influence the important variables of internal marketing. The results of this study are expected to serve as references for future exploration of the field’s boundaries and for studies of how internal marketing is applied to different types of enterprises.

Keywords: corporate responsibility, employee organizational performance, internal marketing, internal customer

Procedia PDF Downloads 346
8575 Management and Marketing Implications of Tourism Gravity Models

Authors: Clive L. Morley

Abstract:

Gravity models and panel data modelling of tourism flows are receiving renewed attention, after decades of general neglect. Such models have quite different underpinnings from conventional demand models derived from micro-economic theory. They operate at a different level of data and with different theoretical bases. These differences have important consequences for the interpretation of the results and their policy and managerial implications. This review compares and contrasts the two model forms, clarifying the distinguishing features and the estimation requirements of each. In general, gravity models are not recommended for use to address specific management and marketing purposes.
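The basic gravity model referred to above relates flows to origin and destination "masses" and distance, T_ij = k·M_i^a·M_j^b / d_ij^c, and is typically estimated log-linearly, which is one reason its data and estimation requirements differ from micro-economic demand models. A minimal sketch with synthetic, noiseless data, to show the estimation mechanics rather than any real tourism flows:

```python
import numpy as np

# Log-linear gravity model: log T = log k + a*log O + b*log D - c*log dist
rng = np.random.default_rng(0)
n = 200
origin_mass = rng.uniform(1, 100, n)    # e.g. origin population (hypothetical)
dest_mass = rng.uniform(1, 100, n)      # e.g. destination attractiveness
dist = rng.uniform(10, 1000, n)

# Flows generated from known "true" parameters, with no noise
true = dict(k=2.0, a=0.8, b=1.1, c=1.5)
flows = true["k"] * origin_mass**true["a"] * dest_mass**true["b"] / dist**true["c"]

# Ordinary least squares on the log-transformed model recovers the parameters
X = np.column_stack([np.ones(n), np.log(origin_mass), np.log(dest_mass), np.log(dist)])
coef, *_ = np.linalg.lstsq(X, np.log(flows), rcond=None)
print(coef)  # ≈ [log 2.0, 0.8, 1.1, -1.5]
```

With real panel data the residual structure, zero flows, and fixed effects complicate this considerably, which is part of the interpretive gap the review discusses.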

Keywords: gravity models, micro-economics, demand models, marketing

Procedia PDF Downloads 433
8574 Determining Optimal Number of Trees in Random Forests

Authors: Songul Cinaroglu

Abstract:

Background: Random Forest is an efficient, multi-class machine learning method used for classification, regression, and other tasks. The method operates by constructing each tree from a different bootstrap sample of the data. Determining the number of trees in a random forest is an open question in the literature on improving the classification performance of random forests. Aim: The aim of this study is to analyze whether there is an optimal number of trees in a Random Forest and how the performance of Random Forests differs as the number of trees increases, using sample health data sets in R. Method: We analyzed the performance of Random Forests as the number of trees grows, doubling the number of trees at every iteration, using the “randomForest” package in R. To determine the minimum and optimal numbers of trees we performed McNemar’s test and computed the area under the ROC curve (AUC), respectively. Results: The analysis found that growing more trees does not always mean that the forest performs better than forests with fewer trees. In other words, a larger number of trees increases computational cost without necessarily improving performance. Conclusion: Although the general practice when using random forests is to generate a large number of trees to obtain high performance, this study shows that increasing the number of trees does not always improve performance. Future studies can compare different kinds of data sets and different performance measures to test whether Random Forest performance changes as the number of trees increases.
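The doubling experiment described in this abstract can be sketched in Python with scikit-learn (a stand-in for the R randomForest package used in the study; the synthetic data set below replaces the health data):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data as a stand-in for the health data sets
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Double the number of trees at every iteration and track test-set AUC
for n_trees in [8, 16, 32, 64, 128, 256]:
    rf = RandomForestClassifier(n_estimators=n_trees, random_state=0).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
    print(f"{n_trees:4d} trees: AUC = {auc:.3f}")
```

On most data sets the AUC curve flattens well before the largest forest, which is the plateau effect the study reports: beyond some point, extra trees add compute but little accuracy.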

Keywords: classification methods, decision trees, number of trees, random forest

Procedia PDF Downloads 390
8573 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference

Authors: Hussein Alahmer, Amr Ahmed

Abstract:

Liver cancer is one of the common diseases that cause death, and early detection is important for diagnosis and for reducing mortality. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using the novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue. The differences between the features of the two areas are then used as the new lesion descriptors. Machine learning classifiers are trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problem of intensity and texture ranges that vary across patients, demographics, and imaging devices and settings.
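A minimal sketch of the contrasting feature-difference idea: compute the same intensity features inside the lesion and in the surrounding tissue, and use their difference as the descriptor. The feature set and toy image below are illustrative, not the paper's exact choices:

```python
import numpy as np

def region_features(pixels):
    """Simple intensity features of a pixel region: mean, std, histogram entropy."""
    hist, _ = np.histogram(pixels, bins=32, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))
    return np.array([pixels.mean(), pixels.std(), entropy])

def contrast_descriptor(image, lesion_mask, ring_mask):
    """Feature-difference descriptor: lesion features minus surrounding-tissue
    features. The subtraction cancels patient- and scanner-level intensity
    shifts, which is the motivation behind the contrasting-features approach."""
    return region_features(image[lesion_mask]) - region_features(image[ring_mask])

# Toy image: a darker lesion embedded in brighter "liver tissue"
rng = np.random.default_rng(1)
img = rng.normal(150, 10, (64, 64))
lesion = np.zeros((64, 64), dtype=bool)
lesion[24:40, 24:40] = True
img[lesion] -= 60                          # lesion is ~60 units darker
ring = np.zeros_like(lesion)
ring[16:48, 16:48] = True
ring &= ~lesion                            # surrounding band, lesion excluded

print(contrast_descriptor(img, lesion, ring))  # first entry ≈ -60 (mean difference)
```

A classifier trained on such difference vectors sees "darker/rougher than its own surroundings" rather than absolute intensities, which is what makes the descriptor robust across patients and scanners.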

Keywords: CAD system, difference of feature, fuzzy c means, lesion detection, liver segmentation

Procedia PDF Downloads 314
8572 Navigating through Organizational Change: TAM-Based Manual for Digital Skills and Safety Transitions

Authors: Margarida Porfírio Tomás, Paula Pereira, José Palma Oliveira

Abstract:

Robotic grasping is advancing rapidly, but transferring techniques from rigid to deformable objects remains a challenge. Deformable and flexible items, such as food containers, demand nuanced handling due to their changing shapes. Bridging this gap is crucial for applications in food processing, surgical robotics, and household assistance. AGILEHAND, a Horizon project, focuses on developing advanced technologies for sorting, handling, and packaging soft and deformable products autonomously. These technologies serve as strategic tools to enhance flexibility, agility, and reconfigurability within the production and logistics systems of European manufacturing companies. Key components include intelligent detection, self-adaptive handling, efficient sorting, and agile, rapid reconfiguration. The overarching goal is to optimize work environments and equipment, ensuring both efficiency and safety. As new technologies emerge in the food industry, there will be implications for the labour force, for safety, and for acceptance of the new technologies. To address these implications, AGILEHAND emphasizes the integration of the social sciences and humanities, for example through the application of the Technology Acceptance Model (TAM). The project aims to create a change management manual that will outline strategies for developing digital skills and managing health and safety transitions, and will also provide best practices and models for organizational change. Additionally, AGILEHAND will design effective training programs to enhance employee skills and knowledge. This information will be obtained through a combination of case studies, structured interviews, questionnaires, and a comprehensive literature review. The project will explore how organizations adapt during periods of change and identify factors influencing employee motivation and job satisfaction.

This project received funding from the European Union’s Horizon 2020/Horizon Europe research and innovation programme under grant agreement No 101092043 (AGILEHAND).

Keywords: change management, technology acceptance model, organizational change, health and safety

Procedia PDF Downloads 37
8571 Intellectual Capital and Transparency in Universities: An Empirical Study

Authors: Yolanda Ramirez, Angel Tejada, Agustin Baidez

Abstract:

This paper presents the general perceptions of Spanish university stakeholders regarding universities’ annual reports and the adequacy and potential of intellectual capital reporting. To this end, a questionnaire was designed and sent to every member of the Social Councils of Spanish public universities. These participants were expected to provide a good indication of the attitudes of university stakeholders, since they represent the different social groups connected with universities. The results of this study confirm the need for universities to offer information on intellectual capital in their accounting information model.

Keywords: intellectual capital, disclosure, stakeholders, universities, annual report

Procedia PDF Downloads 490
8570 Design of a Fuzzy Luenberger Observer for Fault Nonlinear System

Authors: Mounir Bekaik, Messaoud Ramdani

Abstract:

We present in this work a new stabilization technique for nonlinear systems with faults. The approach we adopt focuses on a fuzzy Luenberger observer. The Takagi-Sugeno (T-S) approximation of the nonlinear observer is based on the fuzzy C-means clustering algorithm, which identifies local linear subsystems. The MOESP identification approach was applied to design an empirical model describing the subsystems’ state variables. The observer gain is obtained by minimizing the estimation error through a Lyapunov-Krasovskii functional and an LMI approach. We consider a three-tank hydraulic system as an illustrative example.
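The core of any Luenberger observer is the correction term L(y − Cx̂). A minimal discrete-time sketch with a single linear sub-model follows; a T-S fuzzy observer blends several such (A_i, L_i) pairs with fuzzy membership weights, and the matrices and gain below are illustrative rather than LMI-derived:

```python
import numpy as np

# Discrete-time Luenberger observer: x̂⁺ = A x̂ + B u + L (y − C x̂)
# Illustrative system matrices; A − L C has eigenvalues 0.7 and 0.5, so the
# estimation error e⁺ = (A − L C) e contracts to zero.
A = np.array([[0.9, 0.1], [0.0, 0.8]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.5], [0.3]])               # observer gain (hand-picked here)

x = np.array([[1.0], [-1.0]])              # true state, unknown to the observer
x_hat = np.zeros((2, 1))                   # observer starts from zero
for _ in range(50):
    u = np.array([[0.1]])                  # known input
    y = C @ x                              # measured output
    x_hat = A @ x_hat + B @ u + L @ (y - C @ x_hat)
    x = A @ x + B @ u

err = float(np.linalg.norm(x - x_hat))
print(err)                                 # estimation error, essentially zero
```

The LMI step in the paper replaces the hand-picked L with a gain certified by a Lyapunov-Krasovskii functional, and the fuzzy layer switches between locally valid linear models of the nonlinear plant.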

Keywords: nonlinear system, fuzzy, faults, TS, Lyapunov-Krasovskii, observer

Procedia PDF Downloads 321
8569 The Political Pedagogy of Everyday Life in the French Revolution

Authors: Michael Ruiz

Abstract:

Many scholars view the French Revolution as the origins of ‘modern nationalism,’ citing the unprecedented rhetorical power of ‘the nation’ and the emergence of a centralized, modern nation-state during this time. They have also stressed the role of public education in promoting a national language and creating a sense of shared national identity among the masses. Yet as many cultural historians have shown, revolutionary leaders undertook an unprecedented campaign to overhaul French culture in the 1790s in order to cultivate these national ideals and inspire Republican virtues, in what has been called ‘political pedagogy.’ In contrast to scholars of nationalism, who emphasize formal education, revolutionaries attempted to translate abstract ideas of equality and liberty into palpable representations that would inundate everyday life, thereby serving as pedagogical tools. Material culture and everyday life became state apparatuses not just for winning over citizens’ hearts and minds, but for influencing the very formation of the citizen and their innermost ‘self.’ This paper argues that nationalism began in 1789, when ‘the self’ became a political concern and its formation a state project for cultivating political legitimacy. By broadening the meaning of ‘political pedagogy,’ this study brings together scholarship on nationalism with cultural history, thereby highlighting nations and nationalism as banal, palpable, quotidian phenomena and historicizing the complex emergence of ‘modern nationalism.’ Moreover, because the contemporary view of material culture and pedagogy was highly gendered, this study shows the role of culture in the development of a homosocial, male-dominated public sphere in the 19th century. 
The legacy of the French Revolution’s concern with culture thus persists as much in our vocabulary for political expression as it does in the material world, remaining deeply embedded in everyday life as a crucial, nearly invisible component of nationalism.

Keywords: French Revolution, nationalism, political culture, material culture

Procedia PDF Downloads 131
8568 Towards a Vulnerability Model Assessment of The Alexandra Jukskei Catchment in South Africa

Authors: Vhuhwavho Gadisi, Rebecca Alowo, German Nkhonjera

Abstract:

This article details an investigation of groundwater management in the Jukskei Catchment of South Africa through spatial mapping of key hydrological relationships, interactions, and parameters in catchments. The Department of Water Affairs (DWA) noted gaps in the implementation of article 16 of the South African National Water Act of 1998, including the lack of appropriate models for dealing with water quantity parameters. For this reason, this research conducted a DRASTIC GIS-based groundwater assessment to improve the groundwater monitoring system in the Jukskei River catchment. The methodology was a mixed-methods design involving DRASTIC analysis, a questionnaire, a literature review, and observations to gather information on how to help people who use the Jukskei River. GIS (geographical information system) mapping was carried out using the seven-parameter DRASTIC (Depth to water, Recharge, Aquifer media, Soil media, Topography, Impact of the vadose zone, Hydraulic conductivity) vulnerability methodology. In addition, the resulting vulnerability map was subjected to sensitivity analysis as a validation method, including single-parameter sensitivity, map-removal sensitivity, and correlation analysis of the DRASTIC parameters. The findings were that approximately 5.7% (45 km²) of the area in the northern part of the Jukskei watershed is highly vulnerable. Approximately 53.6% (428.8 km²) of the basin is also at high risk of groundwater contamination; this area is mainly located in the central, north-eastern, and western parts of the sub-basin. The medium and low vulnerability classes cover approximately 18.1% (144.8 km²) and 21.7% (168 km²) of the Jukskei catchment, respectively, and the shallow groundwater of the Jukskei River basin is highly vulnerable. Sensitivity analysis indicated that water depth, recharge, aquifer media, soil, and topography were the main factors contributing to the vulnerability assessment. The conclusion is that the final vulnerability map indicates that the Jukskei catchment is highly susceptible to pollution, and protective measures are therefore needed for the sustainable management of groundwater resources in the study area.
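The DRASTIC index itself is simply a weighted sum of the seven parameter ratings, computed per grid cell and then classed into the vulnerability categories mapped in the study. A minimal sketch using the standard DRASTIC weights (the per-cell ratings below are hypothetical, not values from the Jukskei catchment):

```python
# DRASTIC vulnerability index for one grid cell: sum of weight × rating over
# the seven parameters. Weights are the standard DRASTIC weights; the ratings
# (1-10 scale) are hypothetical values for a single cell.
weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}
ratings = {"D": 9, "R": 6, "A": 6, "S": 7, "T": 10, "I": 8, "C": 4}

index = sum(weights[p] * ratings[p] for p in weights)
print(index)  # 45 + 24 + 18 + 14 + 10 + 40 + 12 = 163
```

In a GIS workflow each parameter is a raster layer, the weighted sum is computed cell by cell, and the resulting index surface is thresholded into the low/medium/high vulnerability classes reported above.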

Keywords: contamination, DRASTIC, groundwater, vulnerability, model

Procedia PDF Downloads 80