Search results for: isoconversional methods
11233 Word Spotting in Images of Handwritten Historical Documents
Authors: Issam Ben Jami
Abstract:
Information retrieval in digital libraries is very important because famous historical documents hold significant value. Word spotting in historical documents is a very difficult task because such documents are naturally cursive and exhibit wide variability in the scale and translation of words within the same document. We first present a system for automatic recognition based on the extraction of interest points of words from the image model. The key-point extraction phase is based on representing the image as a synthetic description of the shape to be recognized in a multidimensional space. We use advanced methods that can find and describe interest points invariant to scale, rotation, and lighting, which are linked to local configurations of pixels. We test this approach on documents of the 15th century. Our experiments give promising results. Keywords: feature matching, historical documents, pattern recognition, word spotting
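Matching the scale- and rotation-invariant interest points described above typically relies on nearest-neighbour descriptor comparison. Below is a minimal, hypothetical sketch of such matching with Lowe's ratio test, assuming the descriptors have already been extracted; the descriptor vectors and threshold are illustrative, not the authors' actual pipeline:

```python
import math

def ratio_test_match(query_desc, candidate_descs, ratio=0.75):
    """Match one descriptor against candidates using Lowe's ratio test.

    Returns the index of the best candidate, or None if the match is
    ambiguous (best distance not clearly smaller than the second best).
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    order = sorted(range(len(candidate_descs)),
                   key=lambda i: dist(query_desc, candidate_descs[i]))
    best, second = order[0], order[1]
    if dist(query_desc, candidate_descs[best]) < ratio * dist(query_desc, candidate_descs[second]):
        return best
    return None
```

A word image is then "spotted" wherever enough of its keypoints find unambiguous matches in the page.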
Procedia PDF Downloads 274
11232 Self-Supervised Learning for Hate-Speech Identification
Authors: Shrabani Ghosh
Abstract:
Automatic offensive language detection in social media has become a challenging task in today's NLP. Manual offensive language detection is tedious and laborious work, so automatic methods based on machine learning are the only practical alternative. Previous works have performed sentiment analysis over social media in supervised, semi-supervised, and unsupervised manners. Domain adaptation in a semi-supervised setting has also been explored in NLP, where the source domain and the target domain are different. In domain adaptation, the source domain usually has a large amount of labeled data, while only a limited amount of labeled data is available in the target domain. Pretrained transformers such as BERT and RoBERTa are first further pre-trained on a masked language modeling (MLM) task in an unsupervised manner and then fine-tuned to perform text classification. In previous work, hate speech detection has been explored on Gab.ai, a free-speech platform described as hosting extremists of varying degrees in online social media. In the domain adaptation process, Twitter data is used as the source domain, and Gab data is used as the target domain. The performance of domain adaptation also depends on cross-domain similarity. Different distance measures such as L2 distance, cosine distance, Maximum Mean Discrepancy (MMD), Fisher Linear Discriminant (FLD), and CORAL have been used to estimate domain similarity; in-domain distances are expected to be small and between-domain distances large. Previous findings show that a masked language model (MLM) further pre-trained on a mixture of posts from the source and target domains gives higher accuracy. However, the in-domain accuracy of the hate classifier on Twitter data is 71.78%, while its out-of-domain performance on Gab data drops to 56.53%. Recently, self-supervised learning has received a lot of attention, as it is more applicable when labeled data are scarce.
A few works have already explored applying self-supervised learning to NLP tasks such as sentiment classification. The self-supervised language representation model ALBERT focuses on modeling inter-sentence coherence and helps downstream tasks with multi-sentence inputs. A self-supervised attention learning approach shows better performance, as it exploits extracted context words in the training process. In this work, a self-supervised attention mechanism is proposed to detect hate speech on Gab.ai. The framework initially classifies the Gab dataset in an attention-based self-supervised manner. In the next step, a semi-supervised classifier is trained on the combination of labeled data from the first step and unlabeled data. The performance of the proposed framework will be compared with the results described earlier and with optimized outcomes obtained from different optimization techniques. Keywords: attention learning, language model, offensive language detection, self-supervised learning
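As one illustration of the domain-distance measures mentioned above, a squared Maximum Mean Discrepancy (MMD) estimate between two samples can be sketched in a few lines. This is a generic biased estimator with an RBF kernel, not the exact setup of the cited work, and the gamma value is an assumption:

```python
import math

def rbf(x, y, gamma=1.0):
    """RBF kernel between two feature vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def mmd2(X, Y, gamma=1.0):
    """Biased squared-MMD estimate between samples X and Y.

    Small values mean the two samples (domains) look similar under
    the kernel; larger values mean a bigger domain gap.
    """
    kxx = sum(rbf(a, b, gamma) for a in X for b in X) / (len(X) ** 2)
    kyy = sum(rbf(a, b, gamma) for a in Y for b in Y) / (len(Y) ** 2)
    kxy = sum(rbf(a, b, gamma) for a in X for b in Y) / (len(X) * len(Y))
    return kxx + kyy - 2 * kxy
```

In a domain-adaptation setting, X and Y would be embeddings of source-domain (e.g., Twitter) and target-domain (e.g., Gab) posts.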
Procedia PDF Downloads 106
11231 Austrian Secondary School Teachers’ Perspectives on Character Education and Life Skills: First Quantitative Insights from a Mixed Methods Study
Authors: Evelyn Kropfreiter, Roland Bernhard
Abstract:
There has been increased interest in school-based whole-child development in the Austrian education system in the last few years. Although there is a consensus among academics that teachers' beliefs are an essential component of their professional competence, there are hardly any studies in the German-speaking world examining teachers' beliefs about school-based character education. To close this gap, we are conducting a mixed methods study combining qualitative interviews and a questionnaire in Austria (doctoral thesis at the University of Salzburg). In this paper, we present preliminary insights into the quantitative strand of the project. In contrast to German-speaking countries, the Anglo-Saxon world has a long tradition of explicit character education in schools, and in England there has been rising interest in approaches focusing on a neo-Aristotelian form of character education. The Jubilee Centre for Character and Virtues, founded in 2012, strongly influences this "renaissance" of neo-Aristotelian character education. The quantitative questionnaire study (n = 264) is an online survey of teachers and school principals conducted in four different federal states in spring 2023. Most respondents from lower secondary schools (AHS-Unterstufe and Mittelschule) believe that character education in schools for 10-14-year-olds is more important for society than good exam results. Many teachers state that they consider themselves prepared, through their own training, to promote their students' character strengths and life skills, and that they are willing to attend further training courses.
However, there are many obstacles in the education system that prevent a comprehensive education from reaching the students. Among the most frequently cited difficulties, teachers mention the time pressure associated with an overcrowded curriculum and a strong focus on performance, which often leaves them too little time to nurture the whole person. The fact that character education is not a separate subject, and that its implementation is not monitored, also makes it challenging to embed in everyday school life. Austrian teachers prioritize moral virtues such as compassion and honesty as character strengths in everyday school life, followed by resilience and commitment. Our results are similar to those reported in other studies on teachers' beliefs about character education. They indicate that Austrian teachers want to teach character in their schools but see systemic constraints as obstacles to fostering more character development in students: a curriculum in which personal development plays a subordinate role, the focus on performance testing in the school system, and the associated lack of time. Keywords: character education, life skills, teachers' beliefs, virtues
Procedia PDF Downloads 81
11230 Assessing Diagnostic and Evaluation Tools for Use in Urban Immunisation Programming: A Critical Narrative Review and Proposed Framework
Authors: Tim Crocker-Buque, Sandra Mounier-Jack, Natasha Howard
Abstract:
Background: Due to both the increasing scale and speed of urbanisation, urban areas in low- and middle-income countries (LMICs) host increasingly large populations of under-immunised children, with the additional associated risks of rapid disease transmission in high-density living environments. Multiple interdependent factors are associated with these coverage disparities in urban areas, and most evidence comes from relatively few countries, predominantly India, Kenya, and Nigeria, with some from Pakistan, Iran, and Brazil. This study aimed to identify, describe, and assess the main tools used to measure or improve coverage of immunisation services in poor urban areas. Methods: Authors used a qualitative review design, including academic and non-academic literature, to identify tools used to improve coverage of public health interventions in urban areas. Authors selected and extracted sources that provided good examples of specific tools, or categories of tools, used in a context relevant to urban immunisation. Diagnostic tools (e.g., for data collection, analysis, and insight generation), programme tools (e.g., for investigating or improving ongoing programmes), and interventions (e.g., multi-component or stand-alone with evidence) were selected for inclusion to provide a range of types and availability of relevant tools. These were then prioritised using a decision-analysis framework, and a tool selection guide for programme managers was developed. Results: Authors reviewed tools used in urban immunisation contexts and tools designed for (i) non-immunisation and/or non-health interventions in urban areas and (ii) immunisation in rural contexts with relevance for urban areas (e.g., Reaching Every District/Child/Zone). Many approaches combined several tools and methods, which authors categorised as diagnostic, programme, and intervention.
The most common diagnostic tools were cross-sectional surveys, key informant interviews, focus group discussions, secondary analysis of routine data, and geographical mapping of outcomes, resources, and services. Programme tools involved multiple stages of data collection, analysis, insight generation, and intervention planning, and included guidance documents from WHO (World Health Organisation), UNICEF (United Nations Children's Fund), USAID (United States Agency for International Development), and governments, as well as articles reporting on diagnostics, interventions, and/or evaluations to improve urban immunisation. Interventions involved service improvement, education, reminder/recall, incentives, outreach, and mass media, or were multi-component. The main gaps in existing tools were assessment of macro/policy-level factors, exploration of effective immunisation communication channels, and measurement of in/out-migration. The proposed framework uses a problem-tree approach to suggest tools to address five common challenges (i.e., identifying populations, understanding communities, issues with service access and use, improving services, improving coverage) based on context and available data. Conclusion: This study identified many tools relevant to evaluating urban LMIC immunisation programmes, with significant crossover between tools. This was encouraging in terms of supporting the identification of common areas, but problematic in that data volumes, instructions, and activities could overwhelm managers, and tools are not always applied in suitable contexts. Further research is needed on how best to combine tools and methods to suit local contexts. The authors' initial framework can be tested and developed further. Keywords: health equity, immunisation, low and middle-income countries, poverty, urban health
Procedia PDF Downloads 139
11229 High Capacity Reversible Watermarking through Interpolated Error Shifting
Authors: Hae-Yeoun Lee
Abstract:
Reversible watermarking, which not only protects the copyright but also preserves the original quality of the digital content, has been intensively studied, and the demand for it has increased. In this paper, we propose a reversible watermarking scheme based on interpolation-error shifting and error precompensation. The intensity of a pixel is interpolated from the intensities of neighbouring pixels, and the difference histogram between the interpolated and original intensities is obtained and modified to embed the watermark message. By restoring the difference histogram, the embedded watermark is extracted, and the original image is recovered by compensating for the interpolation error. Overflow and underflow are prevented by error precompensation. To show the performance of the method, the proposed algorithm is compared with other methods using various test images. Keywords: reversible watermarking, high capacity, high quality, interpolated error shifting, error precompensation
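The interpolation-error idea can be illustrated on a 1-D signal: predict each odd-index sample from its (unmodified) even-index neighbours, expand the prediction error, and append one payload bit; extraction reverses the expansion and restores the signal exactly. This is a simplified, hypothetical sketch of the general technique, and it omits the paper's error-precompensation step, so intensity overflow/underflow is not handled:

```python
def embed(pixels, bits):
    """Prediction-error expansion on odd-index samples; even-index samples
    stay untouched so the same predictions are available at extraction."""
    out = list(pixels)
    bi = 0
    for i in range(1, len(out) - 1, 2):
        if bi >= len(bits):
            break
        pred = (out[i - 1] + out[i + 1]) // 2
        err = out[i] - pred
        out[i] = pred + 2 * err + bits[bi]   # expand error, append one bit
        bi += 1
    return out

def extract(pixels, nbits):
    """Recover the embedded bits and restore the original samples."""
    out = list(pixels)
    bits = []
    for i in range(1, len(out) - 1, 2):
        if len(bits) >= nbits:
            break
        pred = (out[i - 1] + out[i + 1]) // 2
        err = out[i] - pred
        bits.append(err & 1)                 # the appended bit is the parity
        out[i] = pred + (err >> 1)           # undo expansion, restore sample
    return out, bits
```

The round trip is lossless: extraction returns both the payload and the exact original signal.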
Procedia PDF Downloads 323
11228 How to Get Students' Attention? Little Tricks from 15 English Teachers in Labuan
Authors: Suriani Oxley
Abstract:
All teachers aim to conduct successful and effective teaching. Teachers use a variety of teaching techniques and methods to ensure that students achieve the learning objectives, but the teaching and learning processes are often interrupted by a number of things, such as noisy students, students not paying attention, or students playing. Such disturbances must be addressed to ensure that students can concentrate on their learning activities. This qualitative study observed and video-recorded numerous tricks that teachers in Labuan have implemented to help students pay attention in the classroom. The tricks include Name Calling, Non-Verbal Clues, Body Language, Asking Questions, Offering Assistance, Echo Clapping, Call and Response, and Cues and Clues. All of these tricks are simple yet interesting language learning strategies that help students focus on their learning activities. Keywords: paying attention, observation, tricks, learning strategies, classroom
Procedia PDF Downloads 566
11227 A Survey on Ambient Intelligence in Agricultural Technology
Abstract:
Despite the advances made in various new technologies, applying these technologies to agriculture remains a formidable task, as it involves integrating diverse domains to monitor the different processes involved in agricultural management. Advances in ambient intelligence represent one of the most powerful technologies for increasing the yield of agricultural crops and for mitigating the impact of water scarcity and climatic change, as well as for managing pests, weeds, and diseases. This paper proposes a GPS-assisted, machine-to-machine solution that combines information collected by multiple sensors for the automated management of paddy crops. To maintain the economic viability of paddy cultivation, the various techniques used in agriculture are discussed, and a novel system using ambient intelligence techniques is proposed. The ambient intelligence based agricultural system offers great scope. Keywords: ambient intelligence, agricultural technology, smart agriculture, precise farming
Procedia PDF Downloads 606
11226 Impact of Electronic Guest Relationship Management (e-GRM) on Brand Loyalty: The Case of Croatian Hotels
Authors: Marina Laškarin, Vlado Galičić
Abstract:
The quick adoption of e-business and the emerging influence of electronic word-of-mouth (e-WOM) communication on guests have made leading hotel brands successful examples of electronic guest relationship management. The main reasons behind such success are well-established procedures for the collection, analysis, and usage of the highly valuable data available on the Internet, generated through some form of e-GRM programme. E-GRM is more than just a technology solution. It is a system which balances the respective guest demands, the hotel's technological capabilities, and the organizational culture of its employees, discarding the universal "same for all" approach to guest relations. The purpose of this research derives from the necessity of determining the importance of monitoring and applying e-WOM communication as one of the methods used in managing guest relations. This paper analyses and compares different hoteliers' opinions on e-WOM communication. Keywords: brand loyalty, e-WOM communication, GRM programmes, organizational culture
Procedia PDF Downloads 290
11225 Ensemble Machine Learning Approach for Estimating Missing Data from CO₂ Time Series
Authors: Atbin Mahabbati, Jason Beringer, Matthias Leopold
Abstract:
To address the global challenges of climate and environmental change, there is a need to quantify and reduce uncertainties in environmental data, including observations of carbon, water, and energy. The global eddy covariance flux tower network (FLUXNET) and its regional counterparts (i.e., OzFlux, AmeriFlux, ChinaFlux, etc.) were established in the late 1990s and early 2000s to address this demand. Despite the capability of eddy covariance to validate process modelling analyses, field surveys, and remote sensing assessments, there are serious concerns regarding the challenges associated with the technique, e.g., data gaps and uncertainties. To address these concerns, this research developed an ensemble model to fill the data gaps in CO₂ flux, avoiding the limitations of a single algorithm and therefore reducing error and the uncertainties associated with the gap-filling process. In this study, data from five towers in the OzFlux Network (Alice Springs Mulga, Calperum, Gingin, Howard Springs, and Tumbarumba) during 2013 were used to develop an ensemble machine learning model, using five feedforward neural networks (FFNNs) with different structures combined with an eXtreme Gradient Boosting (XGB) algorithm. The former method, FFNN, provided the primary estimations in the first layer, while the latter, XGB, used the outputs of the first layer as its input to provide the final estimations of CO₂ flux. The introduced model showed slight superiority over each single FFNN and over XGB used individually, with overall RMSEs of 2.64, 2.91, and 3.54 g C m⁻² yr⁻¹, respectively (3.54 provided by the best FFNN). The most significant improvement was in the estimation of extreme diurnal values (during midday and sunrise), as well as nocturnal estimations, which are generally considered among the most challenging parts of CO₂ flux gap-filling.
The towers, as well as the seasons, showed different levels of sensitivity to the improvements provided by the ensemble model. For instance, Tumbarumba showed more sensitivity than Calperum, where the differences between the ensemble model on the one hand and the FFNNs and XGB on the other were the smallest of all five sites. Besides, the performance difference between the ensemble model and its individual components was more significant during the warm season (Jan, Feb, Mar, Oct, Nov, and Dec) than the cold season (Apr, May, Jun, Jul, Aug, and Sep) due to the higher amount of photosynthesis, which led to a larger range of CO₂ exchange. In conclusion, the introduced ensemble model slightly improved the accuracy of CO₂ flux gap-filling and the robustness of the model. Therefore, ensemble machine learning models are potentially capable of improving data estimation and regression outcomes when there seems to be no more room for improvement with a single algorithm. Keywords: carbon flux, eddy covariance, extreme gradient boosting, gap-filling comparison, hybrid model, OzFlux network
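The two-layer structure described above, base learners feeding a meta-learner, is a form of stacking. The sketch below illustrates the principle only: two toy regressors stand in for the five FFNNs, and a simple inverse-error weighting stands in for the XGB meta-learner; none of it reproduces the paper's actual models:

```python
# Base learner 1: predicts the global mean of the training targets.
def fit_mean(xs, ys):
    m = sum(ys) / len(ys)
    return lambda x: m

# Base learner 2: 1-D least-squares line.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return lambda x: a + b * x

def fit_stack(xs, ys):
    """Stacked ensemble: base predictions combined by a meta-weighting."""
    bases = [fit_mean(xs, ys), fit_line(xs, ys)]

    # Toy meta-learner: weight each base by inverse training MSE
    # (a real stacker like XGB would be trained on the base outputs).
    def mse(f):
        return sum((f(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs) or 1e-12

    w = [1 / mse(f) for f in bases]
    s = sum(w)
    return lambda x: sum(wi * f(x) for wi, f in zip(w, bases)) / s
```

On near-linear data the weighting lets the better base learner dominate, which is the effect the ensemble exploits across towers and seasons.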
Procedia PDF Downloads 140
11224 Fluorescence-Based Biosensor for Dopamine Detection Using Quantum Dots
Authors: Sylwia Krawiec, Joanna Cabaj, Karol Malecha
Abstract:
Nowadays, progress in the field of analytical methods is of great interest for reliable biological research and medical diagnostics. Classical techniques of chemical analysis, despite many advantages, do not permit immediate results or automation of measurements. Chemical sensors have displaced the conventional analytical methods; sensors combine precision, sensitivity, fast response, and the possibility of continuous monitoring. A biosensor is a chemical sensor which, in addition to a converter, also possesses a biologically active material, which is the basis for the detection of specific chemicals in the sample. Each biosensor device mainly consists of two elements: a sensitive element, where receptor-analyte recognition occurs, and a transducer element, which receives the signal and converts it into a measurable signal. Through these two elements, biosensors can be divided into two categories: by the recognition element (e.g., immunosensors) and by the transducer (e.g., optical sensors). An optical sensor works by measuring quantitative changes in parameters characterizing light radiation. The most often analyzed parameters include amplitude (intensity), frequency, and polarization. In a direct method, changes in the optical properties of a compound that reacts with the biological material coated on the sensor are analyzed; in an indirect method, indicators are used that change their optical properties due to the transformation of the tested species. The most commonly used labels in this method are small molecules with an aromatic ring, like rhodamine; fluorescent proteins, for example green fluorescent protein (GFP); or nanoparticles such as quantum dots (QDs). Quantum dots have, in comparison with organic dyes, much better photoluminescent properties, better bioavailability, and chemical inertness. They are semiconductor nanocrystals 2-10 nm in size.
This very limited number of atoms and the 'nano' size give QDs their highly fluorescent properties. Rapid and sensitive detection of dopamine is extremely important in modern medicine. Dopamine is a very important neurotransmitter, which mainly occurs in the brain and central nervous system of mammals. Dopamine is responsible for transmitting information about movement through the nervous system and plays an important role in learning and memory. Detection of dopamine is significant for diseases associated with the central nervous system, such as Parkinson's disease or schizophrenia. The developed optical biosensor for the detection of dopamine uses graphene quantum dots (GQDs). In this sensor, dopamine molecules coat the GQD surface; as a result, quenching of fluorescence occurs due to resonance energy transfer (FRET). The changes in fluorescence correspond to specific concentrations of the neurotransmitter in the tested sample, so it is possible to accurately determine the concentration of dopamine in the sample. Keywords: biosensor, dopamine, fluorescence, quantum dots
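Converting the measured quenching into a concentration is commonly done through a Stern-Volmer-type calibration, F₀/F = 1 + K_SV·[Q], where F₀ and F are the fluorescence intensities without and with the quencher. The snippet below is a generic illustration of that relation, not the authors' actual calibration, and the K_SV value in the example is hypothetical:

```python
def quencher_concentration(f0, f, ksv):
    """Stern-Volmer relation: F0/F = 1 + Ksv*[Q]  =>  [Q] = (F0/F - 1)/Ksv.

    f0  - fluorescence intensity of the probe alone
    f   - fluorescence intensity with the quencher (e.g., dopamine) present
    ksv - Stern-Volmer constant from a calibration curve (units 1/conc)
    """
    return (f0 / f - 1.0) / ksv
```

In practice, K_SV would be fitted from a calibration series of known dopamine concentrations before unknown samples are measured.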
Procedia PDF Downloads 365
11223 CsPbBr₃@MOF-5-Based Single Drop Microextraction for in-situ Fluorescence Colorimetric Detection of Dechlorination Reaction
Authors: Yanxue Shang, Jingbin Zeng
Abstract:
Chlorobenzene homologues (CBHs) are a category of environmental pollutants that cannot be ignored. They can persist in the environment for a long period and are potentially carcinogenic. The traditional degradation method for CBHs is dechlorination followed by sample preparation and analysis. This is not only time-consuming and laborious, but the detection and analysis processes also rely on large-scale instruments, so rapid and low-cost detection cannot be achieved. Compared with traditional sensing methods, colorimetric sensing is simpler and more convenient. In recent years, chromaticity sensors based on fluorescence have attracted more and more attention; compared with sensing based on changes in fluorescence intensity, changes in color gradients are easier to recognize by the naked eye. Accordingly, this work proposes to use single drop microextraction (SDME) technology to solve the above problems. After the dechlorination reaction is completed, the organic droplet extracts Cl⁻ and realizes fluorescence colorimetric sensing at the same time. This method integrates sample processing and visual in-situ detection, simplifying the detection process. As the fluorescence colorimetric sensing material, CsPbBr₃ was encapsulated in MOF-5 to construct a CsPbBr₃@MOF-5 fluorescence colorimetric composite, and the sensor was then constructed by dispersing the composite in SDME organic droplets. When the Br⁻ in CsPbBr₃ exchanges with the Cl⁻ produced by the dechlorination reactions, it is converted into CsPbCl₃, and the fluorescence color of the single SDME droplet changes from green to blue emission, thereby enabling visual observation. SDME enhances the concentration and enrichment of Cl⁻ and replaces sample pretreatment, while the fluorescence color change of CsPbBr₃@MOF-5 replaces the detection process of large-scale instruments to achieve real-time rapid detection.
Due to its absorption ability, MOF-5 not only improves the stability of CsPbBr₃ but also induces the adsorption of Cl⁻, simultaneously accelerating the exchange of Br⁻ and Cl⁻ in CsPbBr₃ and the detection of Cl⁻. The absorption process was verified by density functional theory (DFT) calculations. This method exhibits exceptional linearity for Cl⁻ in the range of 10⁻⁶ - 10⁻² M (1 μM - 10000 μM) with a limit of detection of 10⁻⁷ M. Thereafter, the dechlorination reactions of different kinds of CBHs were also carried out with this method, all with satisfactory detection ability. The accuracy was also verified by gas chromatography (GC), which showed that the SDME method developed in this work has high credibility. In summary, the in-situ visualization method for dechlorination reaction detection combines sample processing and fluorescence colorimetric sensing. The strategy researched herein thus represents a promising method for the visual detection of dechlorination reactions and can be extended for applications in environments, chemical industries, and foods. Keywords: chlorobenzene homologues, colorimetric sensor, metal halide perovskite, metal-organic frameworks, single drop microextraction
Procedia PDF Downloads 143
11222 The Biocompatibility and Osteogenic Potential of Experimental Calcium Silicate Based Root Canal Sealer, Capseal
Authors: Seok Woo Chang
Abstract:
Aim: Capseal I and Capseal II are calcium silicate and calcium phosphate based experimental root canal sealers. The aim of this study was to evaluate the biocompatibility and mineralization potential of Capseal I and Capseal II. Materials and Methods: The biocompatibility and mineralization-related gene expression (alkaline phosphatase (ALP), bone sialoprotein (BSP), and osteocalcin (OCN)) of Capseal I and Capseal II were compared using the methylthiazol tetrazolium assay and reverse transcription-polymerase chain reaction analysis, respectively. The results were analyzed by the Kruskal-Wallis test; a p-value of < 0.05 was considered significant. Results: Both Capseal I and Capseal II showed favorable biocompatibility and influenced the messenger RNA expression of ALP and BSP. Conclusion: Within the limitations of this study, Capseal is biocompatible and has mineralization-promoting potential, and thus could be a promising root canal sealer. Keywords: biocompatibility, mineralization-related gene expression, Capseal I, Capseal II
Procedia PDF Downloads 279
11221 Protein-Thiocyanate Composite as a Sensor for Iron III Cations
Authors: Hosam El-Sayed, Amira Abou El-Kheir, Salwa Mowafi, Marwa Abou Taleb
Abstract:
Two proteinic biopolymers, namely keratin and sericin, were extracted from their respective natural resources by simple appropriate methods. The said proteins were dissolved in the appropriate solvents and then regenerated in the form of a film with polyvinyl alcohol. A proteinium thiocyanate (PTC) composite was prepared by reaction of the regenerated film with potassium thiocyanate in an acid medium. In another experiment, the said acidified proteins were reacted with potassium thiocyanate before dissolution and regeneration in the form of a PTC composite. The possibility of using the PTC composite for determining the concentration of iron III ions in domestic as well as industrial water was examined. The concentration of iron III cations in water was determined spectrophotometrically by measuring the intensity of the blood-red colour of iron III thiocyanate obtained by interaction of PTC with iron III cations in the tested water sample. Keywords: iron III cations, protein, sensor, thiocyanate, water
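The spectrophotometric step described above reduces to the Beer-Lambert law, A = εlc, so the iron III concentration follows from the measured absorbance once the molar absorptivity ε of the coloured complex is known. A minimal sketch of that conversion; the ε value in the example is a placeholder, not a constant measured in this work:

```python
def beer_lambert_concentration(absorbance, epsilon, path_cm=1.0):
    """Beer-Lambert law: A = eps * l * c  =>  c = A / (eps * l).

    absorbance - measured absorbance (dimensionless)
    epsilon    - molar absorptivity of the complex, L mol^-1 cm^-1
    path_cm    - cuvette path length in cm (typically 1 cm)
    Returns the concentration in mol/L.
    """
    return absorbance / (epsilon * path_cm)
```

In practice, ε (or a full calibration curve) would be established with standard iron III solutions before unknown water samples are measured.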
Procedia PDF Downloads 429
11220 Risk Management in Healthcare Sector in Turkey: A Dental Hospital Case Study
Authors: Pırıl Tekin, Rızvan Erol
Abstract:
Risk management has become very important and popular in developing countries in recent years. In particular, making patient and employee health and safety issues compulsory in hospitals has raised the number of studies in Turkey. Risk management has also become more important to hospital senior management, from the clinics to the laboratories, because quality matters both to patients choosing where to be treated and to employees choosing where to work. Risk management studies can also guide the hospital management team regarding future work and methods. From this point of view, this study is a risk assessment carried out in the biggest dental hospital in the southern part of Turkey. The study was conducted as a research case study covering two different health care places: a clinic and a laboratory. It identifies the problems in this dental hospital and how they can all be solved. Keywords: risk management, healthcare, dental hospital, quality management
Procedia PDF Downloads 377
11219 Choking among Babies, Toddlers and Children with Special Needs: A Review of Mechanisms, Implications, Incidence, and Recommendations of Professional Prevention Guidelines
Authors: Ella Abaev, Shany Segal, Miri Gabay
Abstract:
Background: Choking is a blockage of the airways that prevents efficient breathing and air flow to the lungs. Choking may be partial or full and is an emergency situation. Complete or prolonged choking leads to apnea and a lack of oxygen in the tissues of the body and brain, and can cause death. There are three mechanisms of choking: obstruction of the internal respiratory tract by aspiration of food or an object, any material that blocks or covers the external air passages, and external pressure on the neck or entrapment between objects. Children's airways are narrower than those of adults, and therefore the risk of choking from the aspiration of food and other foreign bodies into the lungs is greater. The Child Development Center at Safra Children's Hospital, Tel Hashomer, Israel, treats infants, toddlers, and children aged 0-18 years with various developmental disabilities. Due to the increase in reports of 'almost an event' of choking in the past year and the serious consequences of a choking event, it was decided to emphasize the issue. Incidence and methods: The number of reports of 'almost an event' or a choking event at the center during the years 2013-2018 was examined, and thorough research was conducted on the subject in order to build a prevention program. Findings: Between 2013 and 2018 the center reported about ten cases of 'almost choking events'; in the middle of 2018 alone, three such cases were reported. Objective: Providing knowledge raises awareness and leads to changes in perception and behavior, and to prevention. The center employs more than 130 staff members from various sectors, so promoting the quality and safety of treatment is the work of multi-professional teams. The staff's familiarity with risk factors, prevention guidelines, identification of choking signs, and treatment is most important and significant in determining the outcome of a choking event.
Conclusions and recommendations: In-depth research was carried out in cooperation with the Risk Management Unit on the subject of choking, including definitions, mechanisms, risk factors, treatment methods, and extensive recommendations for prevention (e.g., using treatment and stimulation accessories bearing standards-association stamps, and adjusting the type of food and the way it is served to the child's age and ability to swallow). The expected stages of development, and an emphasis on the population of children with special needs, were taken into account. The research findings will be disseminated to the staff and the patients' parents through professional publications and lectures, with the expectation of a decrease in the number of choking events in the coming years. Keywords: children with special needs, choking, educational system, prevention guidelines
Procedia PDF Downloads 179
11218 Umbrella Reinforcement Learning – A Tool for Hard Problems
Authors: Egor E. Nuzhin, Nikolay V. Brilliantov
Abstract:
We propose an approach for addressing Reinforcement Learning (RL) problems. It combines the idea of umbrella sampling, borrowed from the Monte Carlo techniques of computational physics and chemistry, with optimal control methods, and is realized with neural networks. The result is a powerful algorithm designed to solve hard RL problems: problems with long-delayed rewards, sticking in state traps, and a lack of terminal states. It outperforms prominent algorithms such as PPO, RND, iLQR, and VI, which are among the most efficient for hard problems. The new algorithm works with a continuous ensemble of agents and an expected return that includes the ensemble entropy, which yields a quick and efficient search for the optimal policy in terms of the exploration-exploitation trade-off in the state-action space.
Keywords: umbrella sampling, reinforcement learning, policy gradient, dynamic programming
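The ensemble objective sketched in the abstract, an expected return augmented by the ensemble entropy, can be illustrated minimally as follows. The discrete-action setting, the entropy weight `beta`, and all function names are assumptions for illustration, not details taken from the paper:

```python
import math

def policy_entropy(probs):
    """Shannon entropy of a discrete action distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0.0)

def entropy_augmented_return(returns, action_dists, beta=0.1):
    """Ensemble objective: mean expected return plus a beta-weighted mean
    policy entropy, encouraging exploration across the agent ensemble.

    returns      -- expected return of each agent in the ensemble
    action_dists -- one action probability vector per agent
    beta         -- entropy weight (assumed hyperparameter)
    """
    mean_return = sum(returns) / len(returns)
    mean_entropy = sum(policy_entropy(d) for d in action_dists) / len(action_dists)
    return mean_return + beta * mean_entropy
```

Maximizing such an objective, rather than the plain return, is one standard way to realize the exploration-exploitation balance the abstract refers to.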
Procedia PDF Downloads 22
11217 Memetic Algorithm for Solving the One-To-One Shortest Path Problem
Authors: Omar Dib, Alexandre Caminada, Marie-Ange Manier
Abstract:
The purpose of this study is to introduce a novel approach to solving the one-to-one shortest path problem. A directed connected graph is assumed, in which all edge weights are positive. Our method is based on a memetic algorithm that combines a genetic algorithm (GA) with a variable neighborhood search (VNS). We compare our approximate method with two exact algorithms, Dijkstra's algorithm and Integer Programming (IP), using randomly generated, complete, and real graph instances. In most case studies, the numerical results show that our method runs far faster than the exact methods while keeping an average gap to optimality of 5%. On average, our algorithm is 20 times faster than Dijkstra's algorithm and more than 1000 times faster than IP. The experimental results are discussed and presented in detail in the paper.
Keywords: shortest path problem, Dijkstra's algorithm, integer programming, memetic algorithm
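For reference, the exact Dijkstra baseline the authors compare against can be sketched in a few lines; the adjacency-list graph representation used here is an assumption:

```python
import heapq

def dijkstra(graph, source, target):
    """One-to-one shortest path on a directed graph with positive weights.

    graph maps node -> list of (neighbour, weight) pairs.
    Returns (distance, path), or (inf, []) if target is unreachable.
    """
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        if u == target:
            break
        visited.add(u)
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    if target not in dist:
        return float("inf"), []
    path, node = [target], target
    while node != source:
        node = prev[node]
        path.append(node)
    return dist[target], path[::-1]
```

The memetic GA+VNS method trades the optimality guarantee of this procedure for speed on large instances.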
Procedia PDF Downloads 467
11216 Evaluation of Salt Content in Bread and the Amount Intake by Hypertensive Patients in the Algiers Region
Authors: S. Lanasri, A. Boudjerrane, R. Belgherbi, O. Hadjoudj
Abstract:
Introduction: Bread is the most popular food in Algeria. The aim of this study was to examine hypertensive patients' salt intake from bread. Materials and methods: Sixty breads were collected from different artisan bakeries in Algiers; each sample was mixed in warm distilled water until homogeneous and then filtered. The salt content was analyzed by Mohr titration. We calculated the amount of salt in the bread consumed by 100 hypertensive patients using a questionnaire on the average amount of bread eaten per day. Results: The salt content of the bread was 3.4 ± 0.37 g NaCl/100 g. The average amount of salt consumed per day by patients from bread alone was 3.82 ± 3.8 g, with a maximum of 17 g per day. Only 38.18% of patients consume salt-free bread, even though 95% knew that excess salt intake can worsen hypertension. Conclusion: This study showed that bread is a major contributor to the salt intake of Algerian hypertensive patients.
Keywords: salt, bread, hypertensive patients, Algiers
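The arithmetic behind a Mohr titration result can be sketched as follows; the parameter names and the assumption that the entire extract is titrated are illustrative, not taken from the paper:

```python
def nacl_per_100g(v_agno3_ml, c_agno3_mol_l, sample_mass_g, aliquot_fraction=1.0):
    """Grams of NaCl per 100 g of bread from a Mohr titration.

    At the endpoint, moles of AgNO3 consumed equal moles of chloride in the
    titrated aliquot; 58.44 g/mol is the molar mass of NaCl.
    aliquot_fraction is the share of the filtered extract actually titrated
    (assumed 1.0 here).
    """
    mol_cl = (v_agno3_ml / 1000.0) * c_agno3_mol_l / aliquot_fraction
    return mol_cl * 58.44 / sample_mass_g * 100.0
```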
Procedia PDF Downloads 151
11215 A Similarity/Dissimilarity Measure to Biological Sequence Alignment
Authors: Muhammad A. Khan, Waseem Shahzad
Abstract:
Protein sequences are analyzed to discover their structural and ancestral relationships. Sequence similarity points to similar protein structure and function and underpins homology detection. Biological sequences, composed of amino acid residues or nucleotides, provide significant information through sequence alignment. In this paper, we present a new similarity/dissimilarity measure for sequence alignment based on the primary structure of a protein. The approach finds the distance between two given sequences using a novel sequence alignment algorithm and a mathematical model, running in O(n²) time. A distance matrix is generated to construct a phylogenetic tree of different species. The new similarity/dissimilarity measure outperforms other existing methods.
Keywords: alignment, distance, homology, mathematical model, phylogenetic tree
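The paper's novel measure is not reproduced here, but the general shape of an O(n²) pairwise distance computation feeding a distance matrix can be sketched with a standard normalized edit distance:

```python
def edit_distance(a, b):
    """Levenshtein distance via O(len(a)*len(b)) dynamic programming,
    keeping only the previous row of the DP table."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def distance_matrix(seqs):
    """Symmetric pairwise dissimilarity matrix, normalised to [0, 1]
    by the longer sequence length; input to tree construction."""
    n = len(seqs)
    m = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d = edit_distance(seqs[i], seqs[j]) / max(len(seqs[i]), len(seqs[j]))
            m[i][j] = m[j][i] = d
    return m
```

A clustering method such as UPGMA or neighbor joining would then turn this matrix into a phylogenetic tree.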
Procedia PDF Downloads 178
11214 Embracing the Uniqueness and Potential of Each Child: Moving Theory to Practice
Authors: Joy Chadwick
Abstract:
This Study of Teaching and Learning (SoTL) research focused on the experiences of teacher candidates in an inclusive education methods course within a four-year direct-entry Bachelor of Education program. The placement of this course within the final fourteen-week practicum semester is designed to facilitate deeper theory-practice connections between effective inclusive pedagogical knowledge and the realities of classroom teaching. The course focuses on supporting teacher candidates to understand that effective instruction in an inclusive classroom must be intentional, responsive, and relational; diversity is situated not as exceptional but as expected. This interpretive qualitative study involved the analysis of twenty-nine teacher candidates' reflective journals and six individual semi-structured interviews with teacher candidates. The journal entries were completed at the start and end of the semester, with the intent of having teacher candidates reflect on their beliefs about what it means to be an effective inclusive educator and on how the course and practicum experiences shaped their understanding and approaches to teaching in inclusive classrooms. The semi-structured interviews provided further depth and context to the journal data. The journals and interview transcripts were coded and themed using NVivo software. The findings suggest that instructional frameworks such as universal design for learning (UDL), differentiated instruction (DI), response to intervention (RTI), social emotional learning (SEL), and self-regulation supported teacher candidates' ability to meet the needs of their students more effectively. Course content focused on specific exceptionalities also helped teacher candidates be proactive rather than reactive when responding to student learning challenges.
Teacher candidates also articulated the importance of reframing their perspective on students in challenging moments, and that seeing the individual worth of each child was integral to their approach to teaching. A persistent question for teacher educators is what pedagogical knowledge and understanding are most relevant in supporting future teachers to plan for and embrace the diversity of student needs in today's classrooms. This research directs us to consider the critical importance of addressing teacher candidates' personal attributes and mindsets regarding children, as well as instructional frameworks, when designing coursework. Further, aligning an inclusive education course with a teaching practicum allows for an iterative approach to learning: the practical application of course concepts while teaching in a practicum deepens understanding of the instructional frameworks, thus enhancing teacher candidates' confidence. The research findings have implications for teacher education programs with respect to inclusive education methods courses, practicum experiences, and overall program design.
Keywords: inclusion, inclusive education, pre-service teacher education, practicum experiences, teacher education
Procedia PDF Downloads 68
11213 Visualizing Class Metrics and Object Calls for Software Systems
Authors: Mohammad Alnabhan, Awni Hammouri, Mustafa Hammad, Anas Al-Badareen, Omamah Al-Thnebat
Abstract:
Software visualization is one of the main techniques used to simplify the presentation of software systems and enhance their understandability, presenting the system visually using simple, clear, and meaningful symbols. This study proposes a new 2D software visualization approach in which each class is represented by a rectangle, with the class name placed above it and the class size (lines of code) represented by the rectangle's height. Methods and attributes are represented by circles and triangles, respectively, and relationships among classes by arrows. The proposed visualization approach was evaluated in terms of applicability and efficiency. The results confirm a successful implementation of the proposed approach and its ability to provide a simple and effective graphical presentation of the extracted software components and properties.
Keywords: software visualization, software metrics, calling relationships, 2D graphs
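The symbol mapping the abstract describes can be sketched as a small SVG generator; all layout constants (scales, offsets, sizes) are illustrative assumptions, not values from the paper:

```python
def class_to_svg(name, loc, n_methods, n_attrs, x=10, y=20, width=120):
    """Render one class with the paper's symbol mapping:
    rectangle = class (height proportional to lines of code),
    circles = methods, triangles = attributes, class name above the box."""
    height = max(20, loc // 2)  # assumed LOC-to-pixel scale
    parts = ['<text x="%d" y="%d">%s</text>' % (x, y - 5, name),
             '<rect x="%d" y="%d" width="%d" height="%d" fill="none" stroke="black"/>'
             % (x, y, width, height)]
    for k in range(n_methods):   # circles in a row inside the rectangle
        parts.append('<circle cx="%d" cy="%d" r="4"/>' % (x + 10 + 12 * k, y + 10))
    for k in range(n_attrs):     # triangles in a row below the circles
        tx, ty = x + 10 + 12 * k, y + 25
        parts.append('<polygon points="%d,%d %d,%d %d,%d"/>'
                     % (tx - 4, ty + 4, tx + 4, ty + 4, tx, ty - 4))
    return "\n".join(parts)
```

Concatenating such fragments inside an `<svg>` element yields the diagram; the arrows between classes could be added as `<line>` elements with marker heads.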
Procedia PDF Downloads 205
11212 Valorization of the Algerian Plaster and Dune Sand in the Building Sector
Authors: S. Dorbani, F. Kharchi, F. Salem, K. Arroudj, N. Chioukh
Abstract:
The need for thermal comfort in buildings, combined with the aim of saving energy, has always generated great interest in developing methods to improve construction practice. The present paper, concerned with the valorization of locally abundant materials, studies mixtures of plaster and dune sand. To assess the thermal performance of these mixtures, a comparative study was established between this product and the two materials most commonly used in construction, concrete and hollow brick. The results showed that the optimal mixture is one third plaster and two thirds dune sand. This mortar achieved significant increases in mechanical strength, allowing it to be used as a load-bearing element for buildings of up to two storeys. The resulting element offers acceptable thermal insulation while reducing the thickness of the outer walls.
Keywords: local materials, mortar, plaster, dune sand, compaction, mechanical performance, thermal performance
Procedia PDF Downloads 484
11211 Solution of S3 Problem of Deformation Mechanics for a Definite Condition and Resulting Modifications of Important Failure Theories
Authors: Ranajay Bhowmick
Abstract:
Analysis of the stresses on an infinitesimal tetrahedron leads to a cubic equation in terms of the three stress invariants. When solved for a definite condition, this cubic equation gives the principal stresses directly, without requiring any cumbersome and time-consuming trial-and-error or iterative numerical procedures. Since the failure criteria of different materials are generally expressed as functions of the principal stresses, this study incorporates the solutions of the cubic equation, in the form of principal stresses obtained for a definite condition, into some of the established failure theories to determine their modified descriptions. It is observed that the failure theories can be represented using the quadratic stress invariant and the orientation of the principal plane.
Keywords: cubic equation, stress invariant, trigonometric, explicit solution, principal stress, failure criterion
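The direct, non-iterative computation the abstract describes can be illustrated with the standard trigonometric (Viète-style) solution of the characteristic cubic s³ − I₁s² + I₂s − I₃ = 0, whose three roots are real for a symmetric stress tensor; this is a generic closed-form sketch, not the paper's specific "definite condition":

```python
import math

def principal_stresses(i1, i2, i3):
    """Principal stresses as the three real roots of
    s^3 - I1*s^2 + I2*s - I3 = 0, via the trigonometric method.
    Returns the stresses sorted in descending order."""
    # Depressed cubic t^3 + p*t + q = 0 with s = t + I1/3
    p = i2 - i1 * i1 / 3.0
    q = -2.0 * i1 ** 3 / 27.0 + i1 * i2 / 3.0 - i3
    if abs(p) < 1e-12:                 # triple root (hydrostatic state)
        return [i1 / 3.0] * 3
    m = 2.0 * math.sqrt(-p / 3.0)
    arg = max(-1.0, min(1.0, 3.0 * q / (p * m)))  # clamp for rounding
    roots = [m * math.cos((math.acos(arg) - 2.0 * math.pi * k) / 3.0) + i1 / 3.0
             for k in range(3)]
    return sorted(roots, reverse=True)
```

For example, I₁ = 6, I₂ = 11, I₃ = 6 (the invariants of a stress state with principal stresses 3, 2, 1) is recovered exactly, with no iteration.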
Procedia PDF Downloads 137
11210 Modification of Newton Method in Two Point Block Backward Differentiation Formulas
Authors: Khairil I. Othman, Nur N. Kamal, Zarina B. Ibrahim
Abstract:
In this paper, we present a modified Newton method as a new strategy for improving the efficiency of two-point Block Backward Differentiation Formulas (BBDF) when solving stiff systems of ordinary differential equations (ODEs). These methods are constructed to produce two approximate solutions simultaneously at each iteration. The detailed implementation of the predictor-corrector BBDF in PE(CE)² mode with modified Newton iteration is discussed. The proposed modification of BBDF is validated through numerical results on some standard problems from the literature, and comparisons are made with the existing Block Backward Differentiation Formula. The numerical results show the advantage of the new strategy in improving the accuracy of the solution when solving stiff ODEs.
Keywords: Newton method, two point, block, accuracy
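The key economy of a modified Newton iteration, freezing the iteration matrix within a step instead of re-evaluating it at every corrector pass, can be illustrated on a scalar backward Euler step; the paper's actual method is a two-point block BDF, so this is only a simplified sketch under that substitution:

```python
def backward_euler_modified_newton(f, dfdy, y0, t0, t1, n_steps, newton_iters=3):
    """Backward Euler for y' = f(t, y) with a *modified* Newton corrector:
    the Jacobian df/dy is evaluated once per step and reused in every
    corrector iteration (the economy exploited by Newton variants in
    block BDF codes)."""
    h = (t1 - t0) / n_steps
    t, y = t0, y0
    for _ in range(n_steps):
        t_next = t + h
        jac = 1.0 - h * dfdy(t_next, y)   # frozen iteration matrix
        y_new = y                          # predictor: previous value
        for _ in range(newton_iters):
            resid = y_new - y - h * f(t_next, y_new)
            y_new -= resid / jac           # corrector with frozen Jacobian
        t, y = t_next, y_new
    return y
```

On the stiff test problem y' = −5y, y(0) = 1, the frozen Jacobian is exact and the corrector converges in one pass.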
Procedia PDF Downloads 357
11209 Vibration-Based Monitoring of Tensioning Stay Cables of an Extradosed Bridge
Authors: Chun-Chung Chen, Bo-Han Lee, Yu-Chi Sung
Abstract:
Monitoring the tension force of stay cables is a significant issue in assessing the structural safety of extradosed bridges, and it is well known that the existing tension force correlates strongly with the vibration frequencies of the cables. This paper presents the frequency characteristics of the stay cables of an extradosed bridge in the field, obtained with vibration-based monitoring methods. The vibration frequencies of each stay cable were measured in stages from the beginning to the completion of bridge construction. The results show that cables of different lengths exhibit different frequency variation trends across the measured stages. This observed feature can support the application of the bridge's long-term monitoring system and contribute to the assessment of bridge safety.
Keywords: vibration-based method, extradosed bridges, bridge health monitoring, bridge stay cables
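The tension-frequency correlation mentioned above is commonly exploited through the taut-string relation fₙ = (n/2L)·√(T/μ); the sketch below inverts it for T, neglecting bending stiffness and sag (a common first approximation for stay cables), and the numeric values in the test are hypothetical:

```python
def cable_tension(freq_hz, mode_n, length_m, mass_per_m):
    """Cable tension T [N] from the n-th measured natural frequency,
    using the taut-string model f_n = (n / 2L) * sqrt(T / mu).

    freq_hz    -- measured natural frequency of mode n
    mode_n     -- mode number of that frequency
    length_m   -- free cable length L
    mass_per_m -- mass per unit length mu
    """
    return 4.0 * mass_per_m * (length_m * freq_hz / mode_n) ** 2
```

In practice, identified frequencies of several modes are averaged, and refined formulas add corrections for sag and bending stiffness.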
Procedia PDF Downloads 148
11208 Annular Hyperbolic Profile Fins with Variable Thermal Conductivity Using Laplace Adomian Transform and Double Decomposition Methods
Authors: Yinwei Lin, Cha'o-Kuang Chen
Abstract:
In this article, the Laplace Adomian transform method (LADM) and the double decomposition method (DDM) are used to solve annular hyperbolic profile fins with variable thermal conductivity. When the thermal conductivity parameter ε is relatively large, the numerical solution obtained with DDM becomes inaccurate; moreover, when more than seven terms are retained, the DDM solution becomes very complicated to evaluate. The present method, in contrast, remains easy to compute beyond seven terms and yields more precise numerical solutions. For relatively large ε, LADM is also more accurate than DDM.
Keywords: fins, thermal conductivity, Laplace transform, Adomian, nonlinear
Procedia PDF Downloads 334
11207 Multi-Agent TeleRobotic Security Control System: Requirements Definitions of Multi-Agent System Using The Behavioral Patterns Analysis (BPA) Approach
Authors: Assem El-Ansary
Abstract:
This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing a Multi-Agent TeleRobotic Security Control System (MTSCS). The event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the BPA modeling methodology and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE multi-criteria decision-making (MCDM) methods.
Keywords: analysis, multi-agent, TeleRobotics control, security, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases
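The AHP step that a tool such as DECISION would perform internally, extracting priority weights from a pairwise comparison matrix, can be sketched with a power iteration toward the principal eigenvector; this is a generic illustration of AHP, not the DECISION tool's actual code:

```python
def ahp_weights(matrix, iters=100):
    """Priority weights from an AHP pairwise comparison matrix:
    power iteration converging to the normalised principal eigenvector.

    matrix[i][j] holds the judged importance of criterion i over j
    (reciprocal matrix, matrix[j][i] == 1 / matrix[i][j])."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w
```

For a perfectly consistent matrix the iteration converges immediately; real judgment matrices would additionally be checked with Saaty's consistency ratio.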
Procedia PDF Downloads 438
11206 Evaluation of Batch Splitting in the Context of Load Scattering
Authors: S. Wesebaum, S. Willeke
Abstract:
Production companies face an increasingly turbulent business environment that demands very high flexibility in production volumes and delivery dates. If decoupling via storage stages is not possible (e.g., at a contract manufacturer) or is undesirable from a logistical point of view, load scattering affects the production processes. 'Load' characterizes the timing and quantity of production orders (e.g., in work content hours) arriving at workstations, which results in specific capacity requirements. Insufficient coordination between load (capacity demand) and capacity supply results in heavy load scattering, which can be described by deviations and uncertainties in the input behavior of a capacity unit. In order to respond to fluctuating loads, companies try to implement consistent and realizable input behavior using the available capacity supply; for example, a uniform and high level of equipment utilization keeps production costs down. In contrast, strong load scattering at workstations leads to performance loss or disproportionately fluctuating WIP, which affects the logistics objectives negatively. Options for reducing load scattering include shifting order start and end dates, batch splitting, outsourcing of operations, or shifting work to other workstations. This adjusts load to capacity supply and thus reduces load scattering. If the adaptation of load to capacity cannot be achieved completely, flexible capacity may have to be used to ensure that the performance of a workstation does not decrease for a given load. Whereas the use of flexible capacities normally raises costs, adjusting load to capacity supply reduces load scattering and, in consequence, costs. The literature mostly offers qualitative statements describing load scattering; quantitative evaluation methods that describe load mathematically are rare.
In this article, the authors discuss existing approaches for calculating load scattering and their various disadvantages, such as the lack of an opportunity for normalization. These approaches are the basis for the development of our mathematical quantification approach for describing load scattering, which compensates for the disadvantages of the current approaches. After presenting this quantification approach, the method of batch splitting is described. Batch splitting allows load to be adapted to capacity in order to reduce load scattering. The method is then analyzed explicitly in the context of Nyhuis's logistic curve theory, using the stretch factor α1, in order to evaluate the impact of batch splitting on load scattering and on the logistic curves. The article concludes by showing how the methods and approaches presented can help companies in a turbulent environment to quantify the occurring work load scattering accurately and to apply an efficient method for adjusting load to capacity supply, thereby increasing the achievement of the logistical objectives without causing additional costs.
Keywords: batch splitting, production logistics, production planning and control, quantification, load scattering
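One simple, normalizable scattering measure, paired with a naive batch-splitting rule, can be sketched as follows; this illustrates the idea of quantifying and reducing load scattering, and is not the authors' specific quantification approach:

```python
def load_cv(loads):
    """Coefficient of variation of per-period load (work content hours):
    a simple normalised scattering measure, 0 for a perfectly level load."""
    mean = sum(loads) / len(loads)
    var = sum((x - mean) ** 2 for x in loads) / len(loads)
    return var ** 0.5 / mean

def split_batches(loads, max_lot):
    """Split any per-period load above max_lot, pushing the excess into
    the following period (simplified batch-splitting rule)."""
    out = list(loads) + [0.0]
    for i in range(len(loads)):
        if out[i] > max_lot:
            out[i + 1] += out[i] - max_lot
            out[i] = max_lot
    return out
```

Splitting the batches levels the load over the periods, which shows up as a lower coefficient of variation.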
Procedia PDF Downloads 399
11205 Digital Cinema Watermarking State of Art and Comparison
Authors: H. Kelkoul, Y. Zaz
Abstract:
Nowadays, the popularity of video processing techniques has resulted in explosive growth in the illegal use of multimedia data, so watermarking security has received much more attention. The purpose of this paper is to explore watermarking techniques in order to observe their specificities and select the finest methods to apply in the digital cinema domain against movie piracy, by creating an invisible watermark that encodes the date, time, and place where the pirated copy was made. We studied three principal watermarking techniques in the frequency domain: spread spectrum, the wavelet transform domain, and the digital cinema watermarking transform domain. A detailed technique is presented in which embedding is performed with direct-sequence spread spectrum in the DWT domain. Experimental results show that the algorithm provides high robustness and good imperceptibility.
Keywords: digital cinema, watermarking, wavelet DWT, spread spectrum, JPEG2000, MPEG4
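The embedding principle, direct-sequence spread spectrum in the DWT detail band, can be sketched in one dimension; the Haar wavelet, the embedding strength, and the correlation detector are illustrative assumptions, not the paper's exact parameters:

```python
import random

def haar_dwt(signal):
    """One-level Haar DWT: (approximation, detail) coefficient lists.
    The signal length is assumed even."""
    approx = [(signal[i] + signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

def embed(signal, key, strength=0.5):
    """Add a key-seeded +/-1 pseudo-noise sequence to the detail band
    (direct-sequence spread spectrum in the DWT domain)."""
    approx, detail = haar_dwt(signal)
    rng = random.Random(key)
    pn = [rng.choice((-1.0, 1.0)) for _ in detail]
    detail = [d + strength * c for d, c in zip(detail, pn)]
    return haar_idwt(approx, detail)

def detect(signal, key, strength=0.5):
    """Correlate the detail band with the same PN sequence; a score near
    1.0 indicates the watermark for this key is present."""
    _, detail = haar_dwt(signal)
    rng = random.Random(key)
    pn = [rng.choice((-1.0, 1.0)) for _ in detail]
    return sum(d * c for d, c in zip(detail, pn)) / (strength * len(detail))
```

In the 2-D digital cinema setting, the same idea is applied per frame to the wavelet subbands, with the PN key encoding the date, time, and place of the screening.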
Procedia PDF Downloads 251
11204 Lossless Secret Image Sharing Based on Integer Discrete Cosine Transform
Authors: Li Li, Ahmed A. Abd El-Latif, Aya El-Fatyany, Mohamed Amin
Abstract:
This paper proposes a new secret image sharing method based on the integer discrete cosine transform (IntDCT). It first transforms the original image into the frequency domain (DCT coefficients) using IntDCT, applied to each 8×8 block. It then generates shares from the DCT coefficients at the same position in each block: all the DC components are used to generate the DC shares, the i-th AC component of each block is used to generate the i-th AC shares, and so on. The DC and AC share components with the same index are combined to form the DCT shadows. Experimental results and analyses show that the proposed method recovers the original image losslessly, unlike methods based on the traditional DCT, and is more sensitive to tiny changes in both the coefficients and the content of the image.
Keywords: secret image sharing, integer DCT, lossless recovery, sensitivity
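The per-coefficient share generation can be illustrated with a standard Shamir threshold scheme over a small prime field; this is not the paper's exact construction, and the field size and the (k, n) = (2, 3) parameters are assumptions chosen for the sketch:

```python
import random

PRIME = 257  # smallest prime above the 8-bit coefficient range (assumption)

def make_shares(secret, n=3, k=2, seed=0):
    """Shamir-style (k, n) sharing of one integer coefficient: evaluate a
    random degree-(k-1) polynomial, with the secret as constant term,
    at x = 1..n over GF(PRIME)."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(1, PRIME) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over GF(PRIME): any k shares
    reconstruct the coefficient exactly, hence lossless recovery."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        secret = (secret + yj * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

Applying this per coefficient position across blocks, and grouping same-index shares, mirrors the shadow construction the abstract describes; negative AC coefficients would need an offset into the field's range first.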
Procedia PDF Downloads 398