Search results for: problem-based learning approach
14570 An Approach to Improve Pre University Students' Responsible Environmental Behaviour through Science Writing Heuristic in Malaysia
Authors: Sheila Shamuganathan, Mageswary Karpudewan
Abstract:
This study investigated the effectiveness of green chemistry integrated with the Science Writing Heuristic (SWH) in enhancing matriculation students’ responsible environmental behaviour. For this purpose, 207 matriculation students were randomly assigned to an experimental (N=118) or a control (N=89) group. For the experimental group, the chemistry concepts were taught using green chemistry integrated with the SWH, while for the control group the same content was taught using green chemistry alone. The data were analysed using ANCOVA, and the quantitative findings reveal a significant change in responsible environmental behaviour, F(1,204) = 32.13, ηp² = 0.14, in favour of the experimental group. The qualitative data obtained from interviews with the experimental group further corroborate this result and indicate a marked improvement in responsible environmental behaviour. The outcome of the study suggests that green chemistry integrated with the SWH could be an alternative approach to improving students’ responsible environmental behaviour.
Keywords: science writing heuristic, green chemistry, pro-environmental behaviour, laboratory
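To make the reported analysis concrete, the following Python sketch runs the same kind of ANCOVA (post-test scores with the pre-test as covariate) on synthetic data with the reported group sizes; the variable names, score scales and effect size are placeholders, not the study's instrument or data.

```python
# Hedged sketch: ANCOVA on post-test scores with the pre-test as covariate (statsmodels).
# The data below are synthetic; only the group sizes (118 / 89) come from the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_exp, n_ctrl = 118, 89
df = pd.DataFrame({
    "group": ["exp"] * n_exp + ["ctrl"] * n_ctrl,
    "pretest": rng.normal(50, 10, n_exp + n_ctrl),
})
# Synthetic post-test: depends on the pre-test plus an assumed treatment effect and noise.
df["posttest"] = (df["pretest"] * 0.6
                  + np.where(df["group"] == "exp", 8, 0)
                  + rng.normal(0, 8, len(df)))

model = smf.ols("posttest ~ C(group) + pretest", data=df).fit()
table = sm.stats.anova_lm(model, typ=2)       # Type II ANCOVA table: F and p per term
ss_group = table.loc["C(group)", "sum_sq"]
ss_resid = table.loc["Residual", "sum_sq"]
print(table)
print("partial eta^2 for group:", ss_group / (ss_group + ss_resid))
```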
Procedia PDF Downloads 324
14569 Comparing Two Interventions for Teaching Math to Pre-School Students with Autism
Authors: Hui Fang Huang Su, Jia Borror
Abstract:
This study compared two interventions for teaching math to preschool-aged students with autism spectrum disorder (ASD). The first is the business-as-usual (BAU) intervention, which uses the Strategies for Teaching Based on Autism Research (STAR) curriculum and discrete trial teaching as the instructional methodology. The second is the Math is Not Difficult (Project MIND) activity-embedded, naturalistic intervention. These interventions were randomly assigned to four preschool classrooms of students with ASD; Project MIND was implemented over three months, and measurements gained during the same three months were used for the STAR intervention. A quasi-experimental, pre-test/post-test design was used to compare the effectiveness of the two interventions in building mathematical knowledge and skills. The pre-post measures include three standardized instruments: the Test of Early Math Ability-3, the Problem Solving and Calculation subtests of the Woodcock-Johnson Test of Achievement IV, and the Bracken Test of Basic Concepts-3 Receptive. The STAR curriculum-based assessment is administered to all Baudhuin students three times per year, and its results were used in this study. We anticipated that implementing both approaches would improve the mathematical knowledge and skills of children with ASD, but it is crucial to see whether a behavioral or a naturalistic teaching approach leads to more significant results.
Keywords: early learning, autism, math for pre-schoolers, special education, teaching strategies
Procedia PDF Downloads 169
14568 Low-Cost, Portable Optical Sensor with Regression Algorithm Models for Accurate Monitoring of Nitrites in Environments
Authors: David X. Dong, Qingming Zhang, Meng Lu
Abstract:
Nitrites enter waterways as runoff from croplands and are discharged from many industrial sites. Excessive nitrite inputs to water bodies lead to eutrophication. On-site rapid detection of nitrite is of increasing interest for managing fertilizer application and monitoring water source quality. Existing methods for detecting nitrites use spectrophotometry, ion chromatography, electrochemical sensors, ion-selective electrodes, chemiluminescence, and colorimetric methods. However, these methods either suffer from high cost or provide low measurement accuracy due to their poor selectivity to nitrites. Therefore, it is desirable to develop an accurate and economical method to monitor nitrites in the environment. We report a low-cost optical sensor, in conjunction with a machine learning (ML) approach, to enable high-accuracy detection of nitrites in water sources. The sensor works on the principle of measuring molecular absorption of nitrites at three narrowband wavelengths (295 nm, 310 nm, and 357 nm) in the ultraviolet (UV) region. These wavelengths are chosen because they have relatively high sensitivity to nitrites, and low-cost light-emitting diodes (LEDs) and photodetectors are available at these wavelengths. A regression model is built, trained, and used to minimize the cross-sensitivities at these wavelengths, thus achieving precise and reliable measurements in the presence of various interfering ions. The measured absorbance data are input to the trained model, which provides a nitrite concentration prediction for the sample. The sensor is built with i) a miniature quartz cuvette as the test cell that contains a liquid sample under test, ii) three low-cost UV LEDs placed on one side of the cell as light sources, with each LED providing narrowband light, and iii) a photodetector with a built-in amplifier and an analog-to-digital converter placed on the other side of the test cell to measure the power of the transmitted light. This simple optical design allows measuring the absorbance of the sample at the three wavelengths. To train the regression model, absorbances of nitrite ions and their combinations with various interfering ions are first obtained at the three UV wavelengths using a conventional spectrophotometer. Then, the spectrophotometric data are input to different regression algorithms for training and evaluating high-accuracy nitrite concentration prediction. Our experimental results show that the proposed approach enables instantaneous nitrite detection within several seconds. The sensor hardware costs about one hundred dollars, which is much cheaper than a commercial spectrophotometer. The ML algorithm helps to reduce the average relative errors to below 3.5% over a concentration range from 0.1 ppm to 100 ppm of nitrites. The sensor has been validated by measuring nitrites at three sites in Ames, Iowa, USA. This work demonstrates an economical and effective approach to rapid, reagent-free determination of nitrites with high accuracy. The integration of the low-cost optical sensor and ML data processing can find a wide range of applications in environmental monitoring and management.
Keywords: optical sensor, regression model, nitrites, water quality
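The regression step described above can be illustrated with a short Python sketch; the calibration data, coefficients and the choice of a Ridge regressor below are synthetic placeholders standing in for the authors' spectrophotometric training set and trained model.

```python
# Hedged sketch: map absorbances at 295/310/357 nm to nitrite concentration with a
# regression model. All numbers here are synthetic, not the paper's calibration data.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
conc = rng.uniform(0.1, 100, size=200)                     # ppm, spans the reported range
eps = np.array([0.021, 0.015, 0.008])                      # made-up sensitivity factors
A = np.outer(conc, eps) + rng.normal(0, 0.002, (200, 3))   # absorbance at 3 wavelengths + noise

A_tr, A_te, c_tr, c_te = train_test_split(A, conc, test_size=0.3, random_state=0)
model = Ridge(alpha=1e-3).fit(A_tr, c_tr)
rel_err = np.mean(np.abs(model.predict(A_te) - c_te) / c_te)
print(f"mean relative error: {rel_err:.2%}")

sample = np.array([[0.42, 0.30, 0.16]])                    # absorbance of an unknown sample
print("predicted nitrite (ppm):", model.predict(sample)[0])
```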
Procedia PDF Downloads 76
14567 Improved Super-Resolution Using Deep Denoising Convolutional Neural Network
Authors: Pawan Kumar Mishra, Ganesh Singh Bisht
Abstract:
Super-resolution is a technique used in computer vision to construct a high-resolution image from a single low-resolution image. It is used to increase the frequency content, recover lost details, and remove the downsampling artifacts and noise introduced by the camera during image acquisition. High-resolution images and videos are desired in all image processing tasks and in the analysis of most digital imaging applications. The goal of super-resolution is to combine the non-redundant information contained in one or multiple low-resolution frames to generate a high-resolution image. Many methods have been proposed in which multiple images of the same scene, differing by some transformation, are used as low-resolution inputs; this is called multi-image super-resolution. Another family of methods, single-image super-resolution, tries to learn the redundancy present in an image and reconstruct the lost information from a single low-resolution image. Deep learning is currently one of the state-of-the-art methods for reconstructing high-resolution images. In this research, we propose Deep Denoising Super Resolution (DDSR), a deep neural network for effectively reconstructing a high-resolution image from a low-resolution image.
Keywords: resolution, deep-learning, neural network, de-blurring
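The abstract does not specify the DDSR architecture, so the following Keras sketch is only a generic stand-in (an SRCNN/DnCNN-style residual network operating on a pre-upscaled input) for how such a denoising super-resolution CNN can be assembled; the layer counts and widths are assumptions.

```python
# Hedged sketch of a denoising super-resolution CNN in Keras. The low-resolution image is
# assumed to be bicubically upscaled outside the network; the CNN learns the missing detail.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_ddsr_like(channels=1, depth=6, width=64):
    inp = layers.Input(shape=(None, None, channels))       # upscaled (blurry, noisy) input
    x = layers.Conv2D(width, 3, padding="same", activation="relu")(inp)
    for _ in range(depth - 2):
        x = layers.Conv2D(width, 3, padding="same", activation="relu")(x)
    residual = layers.Conv2D(channels, 3, padding="same")(x)
    out = layers.Add()([inp, residual])                     # predict only the missing detail
    return models.Model(inp, out)

model = build_ddsr_like()
model.compile(optimizer="adam", loss="mse")                 # trained on (upscaled LR, HR) pairs
model.summary()
```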
Procedia PDF Downloads 519
14566 Understanding the Programming Techniques Using a Complex Case Study to Teach Advanced Object-Oriented Programming
Authors: M. Al-Jepoori, D. Bennett
Abstract:
Teaching Object-Oriented Programming (OOP) as part of a Computing-related university degree is a very difficult task; the road to ensuring that students are actually learning object-oriented concepts is unclear, as students often find it difficult to understand the concept of objects and their behavior. This problem is especially obvious in advanced programming modules where Design Patterns and advanced programming features such as multi-threading and animated GUIs are introduced. Looking at the students’ performance in their final year of a university course, it was obvious that the level of students’ understanding of OOP varies to a high degree from one student to another. Students who aim at the production of games do very well in the advanced programming module. However, the students’ assessment results of the last few years were relatively low; for example, in 2016-2017, the first quartile of marks was as low as 24.5 and the third quartile was 63.5. It is obvious that many students were not confident or competent enough in their programming skills. In this paper, the reasons behind poor performance in advanced OOP modules are investigated, and a suggested practice for teaching OOP based on a complex case study is described and evaluated.
Keywords: complex programming case study, design pattern, learning advanced programming, object oriented programming
Procedia PDF Downloads 224
14565 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)
Authors: Gule Teri
Abstract:
The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variation within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach explicitly measures variability with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. It provides a systematic, statistically grounded validation technique that improves the reliability of results and offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this method, pharmaceutical manufacturers can substantially advance their validation processes and thereby improve the overall quality and safety of their products. This paper delves deeper into the development, application, and advantages of the tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. It also discusses the potential implications and future applications of this method in enhancing pharmaceutical manufacturing practices and outcomes.
Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing
Procedia PDF Downloads 85
14564 A Fuzzy Decision Making Approach for Supplier Selection in Healthcare Industry
Authors: Zeynep Sener, Mehtap Dursun
Abstract:
Supplier evaluation and selection is one of the most important components of an effective supply chain management system. Due to the expanding competition in healthcare, selecting the right medical device suppliers offers great potential for increasing quality while decreasing costs. This paper proposes a fuzzy decision making approach for medical supplier selection. A real-world medical device supplier selection problem is presented to illustrate the application of the proposed decision methodology.
Keywords: fuzzy decision making, fuzzy multiple objective programming, medical supply chain, supplier selection
Procedia PDF Downloads 456
14563 Environmental Radioactivity Analysis by a Sequential Approach
Authors: G. Medkour Ishak-Boushaki, A. Taibi, M. Allab
Abstract:
Quantitative environmental radioactivity measurements are needed to determine the level of exposure of a population to ionizing radiation and for the assessment of the associated risks. Gamma spectrometry remains a very powerful tool for the analysis of radionuclides present in an environmental sample, but the basic problem in such measurements is the low rate of detected events. Using large environmental samples could help to get around this difficulty, but, unfortunately, new issues are raised by gamma-ray attenuation and self-absorption. Recently, a new method has been suggested to detect and identify, without quantification and in a short time, a gamma ray from a low-count source. This method does not require a pulse-height spectrum acquisition, as usually adopted in gamma spectrometry measurements. It is based on a chronological record of each detected photon through simultaneous measurement of its energy ε and its arrival time τ on the detector, the pair of parameters [ε, τ] defining an event mode sequence (EMS). The EMS series are analyzed sequentially by a Bayesian approach to detect the presence of a given radioactive source. The main object of the present work is to test the applicability of this sequential approach to the detection of radioactive environmental materials. Moreover, for appropriate health oversight of the public and of the concerned workers, the analysis has been extended to obtain a reliable quantification of the radionuclides present in environmental samples. For illustration, we consider as an example the problem of detection and quantification of 238U. A Monte Carlo simulated experiment is carried out, consisting of the detection, by a Ge(HP) semiconductor junction, of the 63 keV gamma rays emitted by 234Th (progeny of 238U). The generated EMS series are analyzed by Bayesian inference. The application of the sequential Bayesian approach to environmental radioactivity analysis offers the possibility of reducing the measurement time without requiring large environmental samples, and consequently avoids the associated inconveniences. The work is still in progress.
Keywords: Bayesian approach, event mode sequence, gamma spectrometry, Monte Carlo method
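A minimal sketch of the sequential Bayesian idea, assuming a deliberately simplified likelihood (flat background versus background plus a Gaussian line at 63 keV); the energy window, resolution and signal fraction are invented placeholders, not the authors' detector response model.

```python
# Hedged sketch: sequential Bayesian update over an event-mode sequence of photon energies.
# The likelihood below is a toy model, not the paper's detector response or inference scheme.
import numpy as np
from scipy.stats import norm, uniform

E_LINE, SIGMA = 63.0, 1.0            # keV; assumed detector resolution
E_MIN, E_MAX = 40.0, 90.0            # assumed analysed energy window
P_SIGNAL = 0.2                       # assumed fraction of events from the 63 keV line

def log_likelihood_ratio(energy):
    bkg = uniform.pdf(energy, E_MIN, E_MAX - E_MIN)
    sig = P_SIGNAL * norm.pdf(energy, E_LINE, SIGMA) + (1 - P_SIGNAL) * bkg
    return np.log(sig) - np.log(bkg)

def sequential_posterior(ems_energies, prior_source=0.5):
    log_odds = np.log(prior_source / (1 - prior_source))
    for e in ems_energies:           # one Bayesian update per detected photon
        log_odds += log_likelihood_ratio(e)
    return 1.0 / (1.0 + np.exp(-log_odds))

events = np.concatenate([np.random.uniform(40, 90, 300),   # simulated background
                         np.random.normal(63, 1.0, 60)])    # simulated 63 keV photons
print("P(238U present | EMS) =", sequential_posterior(events))
```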
Procedia PDF Downloads 501
14562 Sustainable Manufacturing Industries and Energy-Water Nexus Approach
Authors: Shahbaz Abbas, Lin Han Chiang Hsieh
Abstract:
Significant population growth and climate change issues have contributed to the depletion of natural resources and threaten their sustainability in the future. Manufacturing industries have a substantial impact on every country’s economy, but the sustainability of industrial resources is challenging, and policymakers have been developing possible solutions to manage the sustainability of industrial resources such as raw materials, energy, water, and the industrial supply chain. In order to address these challenges, the nexus approach is one of the optimization and modelling techniques used in recent sustainable environmental research. The interactions between the nexus components acknowledge that all components are dependent upon each other and interrelated; therefore, their sustainability is also associated with each other. In addition, the nexus concept does not only provide resource sustainability; environmental sustainability can also be achieved through the nexus approach by utilizing industrial waste as a resource for industrial processes. Based on the energy-water nexus, this study has developed a resource-energy-water nexus for the sugar industry to understand the interactions between sugarcane, energy, and water towards a sustainable sugar industry. In particular, the focus of the research is the Taiwanese sugar industry; however, the same approach can be adopted worldwide to optimize the sustainability of sugar industries. It is concluded that there are significant interactions between sugarcane, energy consumption, and water consumption in the sugar industry to manage the scarcity of resources in the future. The interactions between sugarcane and energy also deliver a mechanism to reuse sugar industry waste as a source of energy, consequently supporting industrial and environmental sustainability. The desired outcomes from the nexus can be achieved with modifications in the policy and regulations of the Taiwanese industrial sector.
Keywords: energy-water nexus, environmental sustainability, industrial sustainability, natural resource management
Procedia PDF Downloads 129
14561 Classification of Coughing and Breathing Activities Using Wearable and a Light-Weight DL Model
Authors: Subham Ghosh, Arnab Nandi
Abstract:
Background: The proliferation of Wireless Body Area Network (WBAN) and Internet of Things (IoT) applications demonstrates the potential for continuous monitoring of physical changes in the body. These technologies are vital for health monitoring tasks, such as identifying coughing and breathing activities, which are necessary for disease diagnosis and management. Monitoring activities such as coughing and deep breathing can provide valuable insights into a variety of medical issues. Wearable radio-based antenna sensors, which are lightweight and easy to incorporate into clothing or portable goods, enable continuous monitoring. This mobility gives them a substantial advantage over stationary environmental sensors such as cameras and radar, which are constrained to certain places. Furthermore, using compressive techniques provides benefits such as reduced data transmission rates and memory requirements. These wearable sensors offer more advanced and diverse health monitoring capabilities. Methodology: This study analyzes the feasibility of using a semi-flexible antenna operating at 2.4 GHz (ISM band) and positioned around the neck and near the mouth to identify three activities: coughing, deep breathing, and idleness. A vector network analyzer (VNA) is used to collect time-varying complex reflection coefficient data from the perturbed antenna near field. The reflection coefficient (S11) conveys nuanced information caused by simultaneous variations in the near-field radiation of the three activities across time. The signatures are sparsely represented with Gaussian-windowed Gabor spectrograms. The Gabor spectrogram is used as a sparse representation approach that reassigns the ridges of the spectrogram images to improve their resolution and focus on essential components. The antenna is biocompatible in terms of specific absorption rate (SAR). The sparsely represented Gabor spectrogram images are fed into a lightweight deep learning (DL) model for feature extraction and classification. Two antenna locations are investigated in order to determine the most effective localization for the three activities. Findings: Cross-validation techniques were used on data from both locations. Due to the complex form of the recorded S11, separate analyses and assessments were performed on the magnitude, the phase, and their combination. The combination of magnitude and phase fared better than the separate analyses. Various sliding window sizes, ranging from 1 to 5 seconds, were tested to find the best window for activity classification. It was found that a neck-mounted design was effective at detecting the three distinct activities.
Keywords: activity recognition, antenna, deep-learning, time-frequency
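The Gabor (Gaussian-windowed) spectrogram step can be sketched as follows; the sampling rate, window length and the toy S11 trace are assumed values, not those of the study, and the ridge-reassignment refinement mentioned above is omitted.

```python
# Hedged sketch: a Gaussian-windowed (Gabor-style) spectrogram of a time-varying S11
# magnitude trace, as a time-frequency image for a lightweight classifier.
import numpy as np
from scipy.signal import spectrogram, windows

fs = 50.0                                    # assumed VNA sweep repetition rate, Hz
t = np.arange(0, 10, 1 / fs)
s11_mag = 0.05 * np.sin(2 * np.pi * 0.4 * t) + 0.01 * np.random.randn(t.size)  # toy trace

win = windows.gaussian(64, std=8)            # Gaussian window -> Gabor spectrogram
f, tau, Sxx = spectrogram(s11_mag, fs=fs, window=win, noverlap=48, mode="magnitude")
print(Sxx.shape)                             # (frequency bins, time frames) image for the CNN
```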
Procedia PDF Downloads 18
14560 Integrated Modeling Approach for Energy Planning and Climate Change Mitigation Assessment in the State of Florida
Authors: K. Thakkar, C. Ghenai
Abstract:
An integrated modeling approach was used in this study to (1) track energy consumption, production, and resource extraction, (2) track greenhouse gas emissions, and (3) analyze emissions causing local and regional air pollution. The model was used for short- and long-term energy and GHG emissions reduction analysis for the state of Florida. The integrated modeling methodology helps to evaluate alternative energy scenarios and examine emissions-reduction strategies. The mitigation scenarios have been designed to describe future energy strategies and consist of various demand- and supply-side scenarios. One of the GHG mitigation scenarios is crafted by taking into account the renewable resource potential available for power generation in the state of Florida, to compare and analyze the GHG reduction measures against the ‘Business As Usual’ and ‘Florida State Policy’ scenarios. Two more ‘integrated’ scenarios (‘Electrification’ and ‘Efficiency and Lifestyle’) are crafted through combinations of various mitigation scenarios to assess the cumulative impact of reduction measures such as technological changes and energy efficiency and conservation.
Keywords: energy planning, climate change mitigation assessment, integrated modeling approach, energy alternatives, GHG emission reductions
Procedia PDF Downloads 447
14559 The Mentoring in Professional Development of University Teachers
Authors: Nagore Guerra Bilbao, Clemente Lobato Fraile
Abstract:
Mentoring is provided by professionals with a higher level of experience and competence as part of the professional development of a university faculty. This paper explores the characteristics of the mentoring provided by teachers participating in the development of an active methodology program run at the University of the Basque Country: it examines and analyzes mentors’ performance with the aim of providing empirical evidence regarding its value as a lifelong learning strategy for teaching staff. A total of 183 teachers were trained during the first three programs. The analysis method uses a coding technique and is based on flexible, systematic guidelines for gathering and analyzing qualitative data. The results have confirmed the conception of mentoring as a methodological innovation in higher education. In short, university teachers in general assessed the mentoring they received positively, considering it a valid, useful strategy in their professional development. They highlighted the methodological expertise of their mentor and underscored how mentors monitored the learning process of the active method and provided guidance and advice when necessary. They also drew attention to traits such as availability, personal commitment, and flexibility. However, a minority pointed critically to some aspects of the performance of some mentors.
Keywords: higher education, mentoring, professional development, university teachers
Procedia PDF Downloads 245
14558 Thresholding Approach for Automatic Detection of Pseudomonas aeruginosa Biofilms from Fluorescence in situ Hybridization Images
Authors: Zonglin Yang, Tatsuya Akiyama, Kerry S. Williamson, Michael J. Franklin, Thiruvarangan Ramaraj
Abstract:
Pseudomonas aeruginosa is an opportunistic pathogen that forms surface-associated microbial communities (biofilms) on artificial implant devices and on human tissue. Biofilm infections are difficult to treat with antibiotics, in part because the bacteria in biofilms are physiologically heterogeneous. One measure of biological heterogeneity in a population of cells is the cellular concentration of ribosomes, which can be probed with fluorescently labeled nucleic acids. The fluorescent signal intensity following fluorescence in situ hybridization (FISH) analysis correlates with the cellular level of ribosomes. The goal here is to provide computationally and statistically robust approaches to automatically quantify cellular heterogeneity in biofilms from a large library of epifluorescence microscopy FISH images. In this work, the initial steps toward this goal were taken by developing an automated biofilm detection approach for use with FISH images. The approach allows rapid identification of biofilm regions from FISH images that are counterstained with fluorescent dyes. This methodology provides advances over other computational methods, allowing subtraction of spurious signals and non-biological fluorescent substrata. The method is a robust and user-friendly approach that enables users to semi-automatically detect biofilm boundaries and extract intensity values from fluorescent images for quantitative analysis of biofilm heterogeneity.
Keywords: image informatics, Pseudomonas aeruginosa, biofilm, FISH, computer vision, data visualization
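A minimal sketch of such an automated detection step, assuming Otsu thresholding and simple morphological cleanup in scikit-image on a synthetic counterstain-like image; the paper's own pipeline may differ in the thresholding and cleanup choices.

```python
# Hedged sketch: biofilm-region detection via Otsu thresholding plus morphological cleanup.
# The synthetic image stands in for an epifluorescence FISH counterstain micrograph.
import numpy as np
from skimage import filters, morphology, measure, draw

img = np.random.normal(0.1, 0.02, (256, 256))               # noisy background
rr, cc = draw.ellipse(128, 128, 60, 90, shape=img.shape)     # bright "biofilm" patch
img[rr, cc] += np.random.normal(0.5, 0.05, rr.size)

smooth = filters.gaussian(img, sigma=2)                      # suppress noise / spurious signal
mask = smooth > filters.threshold_otsu(smooth)               # global Otsu threshold
mask = morphology.remove_small_objects(mask, min_size=200)   # drop non-biofilm specks
mask = morphology.binary_closing(mask, morphology.disk(3))   # close gaps in the region

for region in measure.regionprops(measure.label(mask), intensity_image=img):
    print("area:", region.area, "mean intensity:", region.mean_intensity)
```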
Procedia PDF Downloads 140
14557 Empowering Girls and Youth in Bangladesh: Importance of Creating Safe Digital Space for Online Learning and Education
Authors: Md. Rasel Mia, Ashik Billah
Abstract:
The empowerment of girls and youth in Bangladesh is a pressing issue in today's digital age, where online learning and education have become integral to personal and societal development. This abstract explores the critical importance of creating a secure online environment for girls and youth in Bangladesh, emphasizing the transformative impact it can have on their access to education and knowledge. Bangladesh, like many developing nations, faces gender inequalities in education and access to digital resources. The creation of a safe digital space not only mitigates the gender digital divide but also fosters an environment where girls and youth can thrive academically and professionally. This manuscript reports a mixed-method study assessing the current digital landscape in Bangladesh, revealing disparities in phone and internet access, online practices, and awareness of cyber security among diverse demographic groups. Moreover, the study unveils the varying levels of familial support and barriers encountered by girls and youth in their quest for digital literacy. It emphasizes the need for tailored training programs that address specific learning needs while also advocating for enhanced internet accessibility, safe online practices, and inclusive online platforms. The manuscript culminates in a call for collaborative efforts among stakeholders, including NGOs, government agencies, and telecommunications companies, to implement targeted interventions that bridge the gender digital divide and pave the way for a brighter, more equitable future for girls and youth in Bangladesh. In conclusion, this research highlights the undeniable significance of creating a safe digital space as a catalyst for the empowerment of girls and youth in Bangladesh, ensuring that they not only access but excel in the online space, thereby contributing to their personal growth and the advancement of society as a whole.
Keywords: collaboration, cyber security, digital literacy, digital resources, inclusiveness
Procedia PDF Downloads 64
14556 Sensitivity Analysis during the Optimization Process Using Genetic Algorithms
Authors: M. A. Rubio, A. Urquia
Abstract:
Genetic algorithms (GA) are applied to the solution of high-dimensional optimization problems. Additionally, sensitivity analysis (SA) is usually carried out to determine the effect of changes in the parameter values of the objective function on the optimal solutions. These two analyses (i.e., optimization and sensitivity analysis) are computationally intensive when applied to high-dimensional functions. The approach presented in this paper consists of performing the SA during the GA execution, by statistically analyzing the data obtained from running the GA. The advantage is that in this case the SA does not involve making additional evaluations of the objective function and, consequently, the proposed approach requires less computational effort than conducting optimization and SA in two consecutive steps.
Keywords: optimization, sensitivity, genetic algorithms, model calibration
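A minimal sketch of the idea, assuming a bare-bones real-coded GA and a toy objective: every evaluation made during the run is stored, and per-parameter sensitivities are estimated afterwards from those stored points, with no extra objective evaluations.

```python
# Hedged sketch: sensitivity analysis reusing the evaluations already made by a GA.
# The GA and the 4-parameter objective below are toy stand-ins for the paper's calibration.
import numpy as np

def objective(x):
    return np.sum((x - np.array([0.2, 0.5, 0.7, 0.1])) ** 2)   # placeholder objective

rng = np.random.default_rng(1)
pop = rng.random((40, 4))
history_x, history_f = [], []

for generation in range(50):
    fitness = np.array([objective(ind) for ind in pop])
    history_x.append(pop.copy())                 # store every evaluated point ...
    history_f.append(fitness.copy())             # ... and its objective value
    parents = pop[np.argsort(fitness)[:20]]                      # truncation selection
    children = (parents[rng.integers(0, 20, 40)] +
                parents[rng.integers(0, 20, 40)]) / 2            # arithmetic crossover
    pop = np.clip(children + rng.normal(0, 0.05, children.shape), 0, 1)  # mutation

X, F = np.vstack(history_x), np.concatenate(history_f)
for j in range(X.shape[1]):                      # sensitivity from the stored data only
    r = np.corrcoef(X[:, j], F)[0, 1]
    print(f"parameter {j}: correlation with objective = {r:+.2f}")
```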
Procedia PDF Downloads 439
14555 A Metaheuristic for the Layout and Scheduling Problem in a Job Shop Environment
Authors: Hernández Eva Selene, Reyna Mary Carmen, Rivera Héctor, Barragán Irving
Abstract:
We propose an approach that jointly addresses the layout of a facility and the scheduling of a sequence of jobs. In real production these two problems are interrelated; however, they are treated separately in the literature. Our approach is an extension of the job shop problem with transportation delay, where the location of the machines is selected among possible sites. The model minimizes the makespan using the shortest processing time rule with two algorithms: the first considers all permutations of machine locations, and the second uses a heuristic to select specific permutations, which reduces computational time. Several instances are solved and compared with the literature.
Keywords: layout problem, job shop scheduling problem, concurrent scheduling and layout problem, metaheuristic
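A minimal sketch of the joint idea, assuming a toy two-machine, two-site instance: a shortest-processing-time dispatching rule schedules the jobs with transport delays between sites, and machine-to-site assignments are enumerated to pick the layout with the smallest makespan. The data and the exhaustive enumeration stand in for the paper's instances and algorithms.

```python
# Hedged sketch: SPT dispatching for a tiny job shop with transport delays between sites,
# wrapped in an enumeration of machine-to-site layouts. All data are illustrative.
import itertools

jobs = {                      # job -> ordered list of (machine, processing time)
    "J1": [("M1", 3), ("M2", 2)],
    "J2": [("M2", 4), ("M1", 1)],
}
sites = ["A", "B"]
transport = {("A", "A"): 0, ("A", "B"): 2, ("B", "A"): 2, ("B", "B"): 0}

def makespan(layout):                               # layout: machine -> site
    machine_free = {m: 0 for m in layout}
    job_ready = {j: 0 for j in jobs}
    job_site = {j: sites[0] for j in jobs}          # all jobs start at the first site
    next_op = {j: 0 for j in jobs}
    while any(next_op[j] < len(jobs[j]) for j in jobs):
        # SPT rule: among the next operation of every unfinished job, take the shortest.
        j = min((j for j in jobs if next_op[j] < len(jobs[j])),
                key=lambda j: jobs[j][next_op[j]][1])
        m, p = jobs[j][next_op[j]]
        start = max(machine_free[m], job_ready[j] + transport[(job_site[j], layout[m])])
        machine_free[m] = job_ready[j] = start + p
        job_site[j] = layout[m]
        next_op[j] += 1
    return max(machine_free.values())

best = min(({"M1": a, "M2": b} for a, b in itertools.product(sites, repeat=2)), key=makespan)
print("best layout:", best, "makespan:", makespan(best))
```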
Procedia PDF Downloads 613
14554 A Kernel-Based Method for MicroRNA Precursor Identification
Authors: Bin Liu
Abstract:
MicroRNAs (miRNAs) are small non-coding RNA molecules functioning in the transcriptional and post-transcriptional regulation of gene expression. The discrimination of real pre-miRNAs from false ones (such as hairpin sequences with similar stem-loops) is necessary for understanding the role of miRNAs in the control of cell life and death. Because of their small size and sequence specificity, discrimination cannot be based on sequence information alone; it requires structure information about the miRNA precursor to reach satisfactory performance. Kmers are convenient and widely used features for modeling the properties of miRNAs and other biological sequences. However, Kmers suffer from an inherent limitation: if the parameter K is increased to incorporate long-range effects, certain Kmers appear rarely or not at all; as a consequence, most Kmers are absent and a few are present only once. Thus, statistical learning approaches using Kmers as features become susceptible to noisy data once K becomes large. In this study, we propose a gapped k-mer approach to overcome the disadvantages of Kmers and apply this method to the field of miRNA prediction. Combined with the structure status composition, a classifier called imiRNA-GSSC is proposed, and we show that it compares favourably with the original imiRNA-kmer and alternative approaches. Trained on human miRNA precursors, this predictor achieves an accuracy of 82.34% in predicting 4022 pre-miRNA precursors from eleven species.
Keywords: gapped k-mer, imiRNA-GSSC, microRNA precursor, support vector machine
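A minimal sketch of a gapped k-mer style feature, assuming a simple "two nucleotides separated by a fixed gap" definition and toy sequences; the actual imiRNA-GSSC features (including the structure status composition) and its training data are not reproduced here.

```python
# Hedged sketch: gapped k-mer style features for RNA sequences, fed to an SVM.
from itertools import product
import numpy as np
from sklearn.svm import SVC

ALPHABET = "ACGU"

def gapped_kmer_counts(seq, gap=3):
    """Normalised counts of nucleotide pairs separated by `gap` wildcard positions."""
    features = {a + b: 0 for a, b in product(ALPHABET, repeat=2)}
    for i in range(len(seq) - gap - 1):
        pair = seq[i] + seq[i + gap + 1]
        if pair in features:
            features[pair] += 1
    total = max(sum(features.values()), 1)
    return np.array([v / total for v in features.values()])

pos = ["UGAGGUAGUAGGUUGUAUAGUU", "UAAGGCACGCGGUGAAUGCCA"]    # toy "real pre-miRNA-like"
neg = ["AAAAAAAAAAAAAAAAAAAAAA", "ACGUACGUACGUACGUACGUAC"]    # toy pseudo hairpins
X = np.array([gapped_kmer_counts(s) for s in pos + neg])
y = np.array([1, 1, 0, 0])

clf = SVC(kernel="rbf", C=1.0).fit(X, y)
print(clf.predict([gapped_kmer_counts("UGAGGUAGUAGGUUGUAUAGUU")]))
```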
Procedia PDF Downloads 165
14553 Impact of Curvatures in the Dike Line on Wave Run-up and Wave Overtopping, ConDike-Project
Authors: Malte Schilling, Mahmoud M. Rabah, Sven Liebisch
Abstract:
Wave run-up and overtopping are the relevant parameters for dimensioning the crest height of dikes. Various experimental as well as numerical studies have investigated these parameters under different boundary conditions (e.g., wave conditions, structure type). Particularly for dike design in Europe, a common approach is formulated in which wave and structure properties are parameterized. This approach, however, assumes equal run-up heights and overtopping discharges along the longitudinal axis, whereas convex dikes have a heterogeneous crest by definition. Hence, local differences in a convex dike line are expected to cause wave-structure interactions different from those at a straight dike. This study aims to assess both run-up and overtopping at convexly curved dikes. To cast light on the relevance of curved dikes for the design approach mentioned above, physical model tests were conducted in a 3D wave basin of the Ludwig-Franzius-Institute Hannover. A dike with a slope of 1:6 (height over length) was tested under both regular waves and TMA wave spectra. Significant wave heights ranged from 7 to 10 cm and peak periods from 1.06 to 1.79 s. Both run-up and overtopping were assessed behind the curved and straight sections of the dike, and both measurements were compared to a dike with a straight line. It was observed that convex curvatures in the longitudinal dike line cause a redirection of incident waves, leading to a concentration around the center point. Measurements show that both run-up heights and overtopping rates are higher than on the straight dike. It can be concluded that deviations from a straight longitudinal dike line have an impact on design parameters and imply uncertainties within the design approach in force. Therefore, it is recommended to consider these influencing factors for such cases.
Keywords: convex dike, longitudinal curvature, overtopping, run-up
Procedia PDF Downloads 295
14552 Bioinformatics Approach to Identify Physicochemical and Structural Properties Associated with Successful Cell-free Protein Synthesis
Authors: Alexander A. Tokmakov
Abstract:
Cell-free protein synthesis is widely used to synthesize recombinant proteins. It allows genome-scale expression of various polypeptides under strictly controlled, uniform conditions. However, only a minor fraction of all proteins can be successfully expressed in the protein synthesis systems currently in use, and the factors determining expression success are poorly understood. At present, a vast volume of data has accumulated in cell-free expression databases, making possible comprehensive bioinformatics analysis and identification of multiple features associated with successful cell-free expression. Here, we describe an approach aimed at identifying multiple physicochemical and structural properties of amino acid sequences associated with protein solubility and aggregation, and highlight the major correlations obtained using this approach. The developed method includes: categorical assessment of the protein expression data, calculation and prediction of multiple properties of expressed amino acid sequences, correlation of the individual properties with the expression scores, and evaluation of the statistical significance of the observed correlations. Using this approach, we revealed a number of statistically significant correlations between calculated and predicted features of protein sequences and their amenability to cell-free expression. It was found that some of the features, such as protein pI, hydrophobicity, presence of signal sequences, etc., are mostly related to protein solubility, whereas others, such as protein length, number of disulfide bonds, content of secondary structure, etc., affect mainly the expression propensity. We also demonstrated that the amenability of polypeptide sequences to cell-free expression correlates with the presence of multiple sites of post-translational modification. The correlations revealed in this study provide a plethora of important insights into protein folding and the rationalization of protein production. The developed bioinformatics approach can be of practical use for predicting expression success and optimizing cell-free protein synthesis.
Keywords: bioinformatics analysis, cell-free protein synthesis, expression success, optimization, recombinant proteins
Procedia PDF Downloads 422
14551 Gamification in Education: A Case Study on the Use of Serious Games
Authors: Maciej Zareba, Pawel Dawid
Abstract:
This article provides a case study exploring the use of serious games in educational settings, indicating their potential to transform conventional teaching methods into interactive and engaging learning experiences. By incorporating game elements such as points, leaderboards, and progress indicators, serious games establish clear goals, provide real-time feedback, and give a sense of progress. These elements enable students to solve complex problems in simulated environments, fostering critical thinking, creativity, and contextual learning. The article presents a case study of the feasibility of using the 4Factory Manager serious game in a selected educational context, demonstrating its effectiveness in increasing student motivation, improving academic performance, and promoting knowledge consolidation. The study and presentation are based on the results of industrial research and development work conducted as part of the project titled (4FM) 4FACTORY Manager – an innovative simulation game for managing real production processes using a novel gameplay model based on the interaction between the virtual and real worlds, applying the Industry 4.0 concept (Project number: POIR.01.02.00-00-0057/19).
Keywords: gamification, serious games, education, elearning
Procedia PDF Downloads 12
14550 Investigate and Control Thermal Spectra in Nanostructures and 2D Van der Waals Materials
Authors: Joon Sang Kang, Ming Ke, Yongjie Hu
Abstract:
Controlling heat transfer and the thermal properties of materials is important to many fields, such as energy efficiency and the thermal management of integrated circuits. Significant progress has been made over the past decade to improve material performance through structuring at the nanoscale; however, a clear relationship between structure dimensions, interfaces, and thermal properties remains to be established. The main challenge comes from the unknown intrinsic spectral contributions of different phonons. Here, we describe our current progress on quantifying and controlling thermal spectra based on our recently developed technical approach using ultrafast optical spectroscopy. Our work further advances the promise of rational material design to achieve high performance through a synergistic experimental-modeling approach, which can be broadly applied to a wide range of materials and energy systems. In particular, we demonstrate in-situ characterization and tunable thermal properties of 2D van der Waals materials through ionic intercalation. The significant impact of this research in improving the efficiency of thermal energy conversion and management will also be illustrated.
Keywords: energy, mean free path, nanoscale heat transfer, nanostructure, phonons, TDTR, thermoelectrics, 2D materials
Procedia PDF Downloads 293
14549 Fiscal Stability Indicators and Public Debt Trajectory in Croatia
Authors: Hrvoje Simovic
Abstract:
This paper analyses the key problems of fiscal sustainability in Croatia. To point out the key challenges, public debt sustainability is analyzed using standard indicators of fiscal stability, accompanied by the identification of regime changes in the public debt trajectory using a switching regression approach. The analysis is conducted for the period from 2001 to 2016. The results show substantial vulnerability in the recession period (2009-14), and the key challenges for current fiscal policy and public debt management are recognized as maturity prolongation, interest rate trends, and credit rating expectations.
Keywords: fiscal sustainability, public debt, Croatia, budget deficit
Procedia PDF Downloads 263
14548 TerraEnhance: High-Resolution Digital Elevation Model Generation using GANs
Authors: Siddharth Sarma, Ayush Majumdar, Nidhi Sabu, Mufaddal Jiruwaala, Shilpa Paygude
Abstract:
Digital Elevation Models (DEMs) are digital representations of the Earth’s topography, which include information about the elevation, slope, aspect, and other terrain attributes. DEMs play a crucial role in various applications, including terrain analysis, urban planning, and environmental modeling. In this paper, TerraEnhance is proposed, a distinct approach for high-resolution DEM generation using Generative Adversarial Networks (GANs) combined with Real-ESRGANs. By learning from a dataset of low-resolution DEMs, the GANs are trained to upscale the data by 10 times, resulting in significantly enhanced DEMs with improved resolution and finer details. The integration of Real-ESRGANs further enhances visual quality, leading to more accurate representations of the terrain. A post-processing layer is introduced, employing high-pass filtering to refine the generated DEMs, preserving important details while reducing noise and artifacts. The results demonstrate that TerraEnhance outperforms existing methods, producing high-fidelity DEMs with intricate terrain features and exceptional accuracy. These advancements make TerraEnhance suitable for various applications, such as terrain analysis and precise environmental modeling.
Keywords: DEM, ESRGAN, image upscaling, super resolution, computer vision
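The post-processing stage can be sketched as follows; a bicubic 10x upscale stands in for the GAN generator, and the Gaussian high-pass filter size and weight are assumed values, not those used by TerraEnhance.

```python
# Hedged sketch of the post-processing only: upscale a DEM and apply high-pass sharpening
# to emphasise fine relief while keeping the base elevations. All parameters are assumed.
import numpy as np
from scipy import ndimage

def upscale_and_sharpen(dem_lr, factor=10, sigma=2.0, amount=0.6):
    dem_up = ndimage.zoom(dem_lr, factor, order=3)       # placeholder for the GAN output
    low_pass = ndimage.gaussian_filter(dem_up, sigma)
    high_pass = dem_up - low_pass                         # fine-scale terrain detail
    return dem_up + amount * high_pass                    # boost detail, preserve heights

dem_lr = np.random.default_rng(0).random((30, 30)) * 500.0   # toy low-resolution elevations (m)
dem_hr = upscale_and_sharpen(dem_lr)
print(dem_lr.shape, "->", dem_hr.shape)
```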
Procedia PDF Downloads 15
14547 An Inverse Heat Transfer Algorithm for Predicting the Thermal Properties of Tumors during Cryosurgery
Authors: Mohamed Hafid, Marcel Lacroix
Abstract:
This study aimed at developing an inverse heat transfer approach for predicting the time-varying freezing front and the temperature distribution of tumors during cryosurgery. Using a temperature probe pressed against the layer of tumor, the inverse approach is able to predict simultaneously the metabolic heat generation and the blood perfusion rate of the tumor. Once these parameters are predicted, the temperature field and time-varying freezing fronts are determined with the direct model. The direct model rests on the one-dimensional Pennes bioheat equation, and the phase change problem is handled with the enthalpy method. The Levenberg-Marquardt Method (LMM) combined with the Broyden Method (BM) is used to solve the inverse model. The effects of (a) the thermal properties of the diseased tissues, (b) the initial guesses for the unknown thermal properties, (c) the data capture frequency, and (d) the noise on the recorded temperatures are examined. It is shown that the proposed inverse approach remains accurate for all the cases investigated.
Keywords: cryosurgery, inverse heat transfer, Levenberg-Marquardt method, thermal properties, Pennes model, enthalpy method
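A minimal sketch of the inverse step, assuming a steady-state 1-D Pennes model without the phase-change (enthalpy) treatment and using SciPy's Levenberg-Marquardt solver in place of the paper's LMM/Broyden implementation; all material constants below are assumed values.

```python
# Hedged sketch: recover metabolic heat generation (q_m) and blood perfusion (w_b) by
# least-squares matching of simulated probe temperatures. Simplified, steady-state model.
import numpy as np
from scipy.optimize import least_squares

RHO_B_C_B, T_ART, K = 4.0e6, 37.0, 0.5       # assumed blood heat capacity (J/m^3.K), degC, W/m.K
x = np.linspace(0, 0.02, 50)                  # 2 cm of tissue under the cold probe

def pennes_steady_temperature(q_m, w_b, t_surface=-20.0):
    """Finite-difference solution of k T'' + rho_b c_b w_b (T_art - T) + q_m = 0."""
    n, dx = x.size, x[1] - x[0]
    A = np.zeros((n, n))
    b = np.full(n, -(q_m + RHO_B_C_B * w_b * T_ART))
    for i in range(1, n - 1):
        A[i, i - 1] = A[i, i + 1] = K / dx**2
        A[i, i] = -2 * K / dx**2 - RHO_B_C_B * w_b
    A[0, 0] = A[-1, -1] = 1.0
    b[0], b[-1] = t_surface, T_ART            # cold probe at x=0, core temperature deep in
    return np.linalg.solve(A, b)

true = np.array([4200.0, 0.002])              # "unknown" q_m (W/m^3) and w_b (1/s)
measured = pennes_steady_temperature(*true) + np.random.normal(0, 0.05, x.size)

res = least_squares(lambda p: pennes_steady_temperature(*p) - measured,
                    x0=[1000.0, 0.001], method="lm", x_scale=[1e3, 1e-3])
print("recovered q_m, w_b:", res.x)
```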
Procedia PDF Downloads 203
14546 Single Valued Neutrosophic Hesitant Fuzzy Rough Set and Its Application
Authors: K. M. Alsager, N. O. Alshehri
Abstract:
In this paper, we propose the notion of the single valued neutrosophic hesitant fuzzy rough set, obtained by combining the single valued neutrosophic hesitant fuzzy set and the rough set. The combination of the single valued neutrosophic hesitant fuzzy set and the rough set is a powerful tool for dealing with uncertainty, granularity, and incompleteness of knowledge in information systems. We present both the definition and some basic properties of the proposed model. Finally, we give a general approach that is applied to a decision making problem in disease diagnosis and demonstrate the effectiveness of the approach with a numerical example.
Keywords: single valued neutrosophic fuzzy set, single valued neutrosophic fuzzy hesitant set, rough set, single valued neutrosophic hesitant fuzzy rough set
Procedia PDF Downloads 280
14545 Polarity Classification of Social Media Comments in Turkish
Authors: Migena Ceyhan, Zeynep Orhan, Dimitrios Karras
Abstract:
People in modern societies continuously share their experiences, emotions, and thoughts in different areas of life. The information reaches almost everyone in real time and can have an important impact in shaping people’s way of living. This phenomenon is well recognized and used to advantage by market representatives trying to benefit the most from it. Given the abundance of information, people and organizations are looking for efficient tools that filter the countless data into important information, ready to analyze. This paper is a modest contribution in this field, describing the process of automatically classifying social media comments in the Turkish language into positive or negative. Once the data are gathered and preprocessed, feature sets of selected single words or groups of words are built according to the characteristics of the language used in the texts. These features are later used to train and test a system with different machine learning algorithms (Naïve Bayes, Sequential Minimal Optimization, J48, and Bayesian Linear Regression). The resulting high accuracies can be important feedback for decision-makers seeking to improve business strategies accordingly.
Keywords: feature selection, machine learning, natural language processing, sentiment analysis, social media reviews
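A minimal sketch of such a pipeline in scikit-learn, shown here with Multinomial Naive Bayes (an SVM or logistic regression plugs into the same pipeline); the tiny Turkish examples are invented placeholders, not the study's dataset or preprocessing.

```python
# Hedged sketch: bag-of-words polarity classifier for short Turkish comments.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

comments = ["ürün harika, çok memnun kaldım",      # positive
            "kargo hızlı ve kaliteli",              # positive
            "berbat bir deneyim, tavsiye etmem",    # negative
            "ürün bozuk geldi, çok kötü"]           # negative
labels = ["pos", "pos", "neg", "neg"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), lowercase=True),
                    MultinomialNB())
clf.fit(comments, labels)
print(clf.predict(["çok kötü bir ürün", "harika ve hızlı"]))
```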
Procedia PDF Downloads 149
14544 Data-Driven Strategies for Enhancing Food Security in Vulnerable Regions: A Multi-Dimensional Analysis of Crop Yield Predictions, Supply Chain Optimization, and Food Distribution Networks
Authors: Sulemana Ibrahim
Abstract:
Food security remains a paramount global challenge, with vulnerable regions grappling with issues of hunger and malnutrition. This study embarks on a comprehensive exploration of data-driven strategies aimed at ameliorating food security in such regions. Our research employs a multifaceted approach, integrating data analytics to predict crop yields, optimize supply chains, and enhance food distribution networks. The study unfolds as a multi-dimensional analysis, commencing with the development of robust machine learning models harnessing remote sensing data, historical crop yield records, and meteorological data to forecast crop yields. These predictive models, underpinned by convolutional and recurrent neural networks, furnish critical insights into anticipated harvests, empowering proactive measures to confront food insecurity. Subsequently, the research scrutinizes supply chain optimization to address food security challenges, capitalizing on linear programming and network optimization techniques. These strategies aim to mitigate loss and wastage while streamlining the distribution of agricultural produce from field to fork. In conjunction, the study investigates food distribution networks with a particular focus on network efficiency, accessibility, and equitable food resource allocation. Network analysis tools, complemented by data-driven simulation methodologies, unveil opportunities for augmenting the efficacy of these critical lifelines. The study also considers the ethical implications and privacy concerns associated with the extensive use of data in the realm of food security, and the proposed methodology outlines guidelines for responsible data acquisition, storage, and usage. The ultimate aspiration of this research is to forge a nexus between data science and food security policy, bestowing actionable insights to mitigate the ordeal of food insecurity. The holistic approach converging data-driven crop yield forecasts, optimized supply chains, and improved distribution networks aspires to revitalize food security in the most vulnerable regions, elevating the quality of life for millions worldwide.
Keywords: data-driven strategies, crop yield prediction, supply chain optimization, food distribution networks
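The supply chain optimization step can be sketched as a transportation-style linear program; all supplies, demands and costs below are invented placeholders, and scipy.optimize.linprog stands in for whatever solver the study used.

```python
# Hedged sketch: allocate predicted harvests from two supply regions to three demand
# centres at minimum shipping cost with a linear program. All quantities are illustrative.
import numpy as np
from scipy.optimize import linprog

supply = np.array([120.0, 80.0])                  # tonnes available at regions S1, S2
demand = np.array([60.0, 70.0, 50.0])             # tonnes needed at centres D1, D2, D3
cost = np.array([[4.0, 6.0, 9.0],                 # cost per tonne from each S to each D
                 [5.0, 3.0, 7.0]])

c = cost.ravel()                                   # decision variables x[s, d], flattened
A_ub = np.zeros((2, 6)); b_ub = supply             # each region ships at most its supply
for s in range(2):
    A_ub[s, s * 3:(s + 1) * 3] = 1.0
A_eq = np.zeros((3, 6)); b_eq = demand             # each centre receives exactly its demand
for d in range(3):
    A_eq[d, d::3] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print(res.x.reshape(2, 3), "total cost:", res.fun)
```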
Procedia PDF Downloads 67
14543 Brand Identity Creation for Thai Halal Brands
Authors: Pibool Waijittragum
Abstract:
The purpose of this paper is to synthesize research results on the brand identities of Thai Halal brands, which relate to the way of life of Thai Muslims. The results will be transformed into Thai Halal brand packaging and label designs. The expected benefit is an alternative marketing strategy for the brand-building process for Halal products in Thailand. Four elements of marketing strategy necessary for brand identity creation form the research framework: Attributes, Benefits, Values, and Personality. The research methodology applied both qualitative and quantitative methods: 19 marketing experts with dynamic roles in Thai consumer products were interviewed, and a field survey of 122 Thai Muslims, selected from 175 Muslim communities in Bangkok, was conducted. Data were analysed according to 5 categories of Thai Halal products: 1) meat, 2) vegetables and fruits, 3) instant foods and garnishing ingredients, 4) beverages, desserts and snacks, and 5) hygienic daily products. The results suggest suitable approaches for the brand identities of Thai Halal brands: 1) the benefit approach, describing the characteristics of the product together with its benefit: the brand identity transformed into the packaging design should be clear and display a fresh product; 2) the value approach, reflecting the value of products that affects consumers’ perception: the packaging design should look simple and use a trustworthy image; 3) the personality approach, reflecting consumers’ thoughts: the packaging design should look sincere, enjoyable, merry and flamboyant, and use a humorous image.
Keywords: marketing strategies, brand identity, packaging and label design, Thai Halal products
Procedia PDF Downloads 439
14542 Multidimensional Approach to Analyse the Environmental Impacts of Mobility
Authors: Andras Gyorfi, Andras Torma, Adrienn Buruzs
Abstract:
Mobility has evolved into a determining field of science. This continuously developing segment involves a variety of affected areas, such as the public and economic sectors. Besides the changes in mobility, the state of the environment has also changed in recent years. Alternative mobility as a separate category, and the idea of its widespread application, is such a new field that it needs to be studied more deeply. Alternative mobility implies finding new types of propulsion, using innovative kinds of power and energy resources, and revolutionizing the approach to vehicular control. Including new resources and excluding others has a complex effect that cannot be unequivocally confirmed by today’s scientific achievements. Changes in specific parameters will most likely reduce the environmental impacts; however, the production of new substances, or even their removal from the system, will probably cause an energy deficit as well. The aim of this research is to elaborate the environmental impact matrix of alternative mobility, identify the factors that are as yet unknown, analyse them, look for alternative solutions, and conclude all the above in a coherent system. To this end, we analyse it with the ‘system of systems’ (SoS) method to model the effects and the dynamics of the system. Part of the research process is to examine its impacts on the environment and to decide whether the newly developed versions of alternative mobility affect the environmental state. As a final result, a complex approach will be used which can supplement current scientific studies. By using the SoS approach, we create a frame of reference containing elements in which we also examine the interactions. In this way, a flexible and modular model can be established which supports the prioritization of effects and the deeper analysis of the complex system.
Keywords: environment, alternative mobility, complex model, element analysis, multidimensional map
Procedia PDF Downloads 330
14541 Effects of External and Internal Focus of Attention in Motor Learning of Children with Cerebral Palsy
Authors: Morteza Pourazar, Fatemeh Mirakhori, Fazlolah Bagherzadeh, Rasool Hemayattalab
Abstract:
The purpose of the study was to examine the effects of an external versus an internal focus of attention on the motor learning of children with cerebral palsy. The study involved 30 boys (7 to 12 years old) with CP type 1 who practiced throwing beanbags. The participants were randomly assigned to the internal focus, external focus, and control groups and performed six blocks of 10 trials with attentional focus reminders during a practice phase, and no reminders during retention and transfer tests. Analysis of variance (ANOVA) with repeated measures on the last factor was used. The results show significant main effects for time and group; however, the interaction of time and group was not significant. Retention scores were significantly higher for the external focus group. The external focus group performed better than the other groups, whereas the performance of the internal focus and control groups did not differ. The study concluded that motor skills in children with Spastic Hemiparetic Cerebral Palsy (SHCP) could be enhanced by external attention.
Keywords: cerebral palsy, external attention, internal attention, throwing task
Procedia PDF Downloads 318