Search results for: search and rescue
1008 Hamlet as the Predecessor of Existentialism - A Study of Quintessential Expression of Existential Pondering
Authors: Phani Kiran, Prabodha Manas Yarlagadda
Abstract:
This paper attempts to treat Shakespeare’s tragic hero, Hamlet, as an existential hero who faces many dilemmas in the process of taking revenge for his father’s murder. Hamlet can be considered a predecessor of existentialism, and Shakespeare, as a pioneer, focused on some serious existential issues in the play long before they were fully developed in the 20th century. Hamlet's internal struggles reflect existential themes such as alienation, despair, and the quest for authenticity. Hamlet’s famous soliloquy, "To be, or not to be," is a quintessential expression of existential pondering, contemplating the choice between life and death and the uncertainty of what lies beyond. Hamlet grapples with existential questions such as the purpose and meaninglessness of life, the nature of morality, the inevitability of death, and the existence of an afterlife. He doubts the authenticity of appearances and the reliability of his own perceptions, highlighting the inherent ambiguity and uncertainty of existence. Overall, "Hamlet" aligns with existential philosophy by exploring the complexities of human existence, the search for meaning, and the individual's struggle to find their place in an inherently uncertain and perplexing world. The character of Hamlet and the play's exploration of existential themes continue to resonate with audiences and provoke contemplation on the nature of life and the human experience.
Keywords: to be or not to be, death, dilemmas, illusion and reality
Procedia PDF Downloads 67
1007 Vibrational Spectroscopic Identification of Beta-Carotene in Usnic Acid and PAHs as a Potential Martian Analogue
Authors: A. I. Alajtal, H. G. M. Edwards, M. A. Elbagermi
Abstract:
Raman spectroscopy is currently part of the instrumentation suite of the ESA ExoMars mission for the remote detection of life signatures in the Martian surface and subsurface. Terrestrial analogues of Martian sites have been identified, and the biogeological modifications incurred as a result of extremophilic activity have been studied. Analytical instrumentation protocols for the unequivocal detection of biomarkers in suitable geological matrices are critical for future unmanned explorations, including the forthcoming ESA ExoMars mission to search for life on Mars scheduled for 2018, whose Pasteur instrumentation suite includes Raman spectroscopy. Here, various concentrations of beta-carotene in admixture with polyaromatic hydrocarbons and usnic acid were investigated by Raman microspectrometry using 785 nm excitation, to determine the lowest levels detectable in simulation of their potential remote identification under geobiological conditions in Martian scenarios. Information from this study will be important for the development of a miniaturized Raman instrument for targeting Martian sites where the biosignatures of relict or extant life could remain in the geological record.
Keywords: Raman spectroscopy, Mars analogue, beta-carotene, PAHs
Procedia PDF Downloads 338
1006 Determination of the Risks of Heart Attack at the First Stage as Well as Their Control and Resource Planning with the Method of Data Mining
Authors: İbrahi̇m Kara, Seher Arslankaya
Abstract:
Frequently preferred in the field of engineering in particular, data mining has now begun to be used in the field of health as well, since the data in the health sector have reached great dimensions. Data mining aims to reveal models from great amounts of raw data in agreement with a given purpose, and to search for the rules and relationships that enable predictions about the future to be made from a large data set. It helps the decision-maker to find the relationships among the data which form at the stage of decision-making. In this study, it is aimed to determine the risk of heart attack at the first stage, to control it, and to plan its resources with the method of data mining. Through the early and correct diagnosis of heart attacks, it is aimed to reveal the factors which affect the disease, to protect health and choose the right treatment methods, to reduce the costs of health expenditures, and to shorten the duration of patients’ stays at hospitals. In this way, the diagnosis and treatment costs of a heart attack will be scrutinized, which will be useful for determining the risk of the disease at the first stage, controlling it, and planning its resources.
Keywords: data mining, decision support systems, heart attack, health sector
Procedia PDF Downloads 356
1005 Heuristic Classification of Hydrophone Recordings
Authors: Daniel M. Wolff, Patricia Gray, Rafael de la Parra Venegas
Abstract:
An unsupervised machine listening system is constructed and applied to a dataset of 17,195 30-second marine hydrophone recordings. The system is then heuristically supplemented with anecdotal listening, contextual recording information, and supervised learning techniques to reduce the number of false positives. Features for classification are assembled by extracting the following data from each of the audio files: the spectral centroid, root-mean-squared values for each frequency band of a 10-octave filter bank, and mel-frequency cepstral coefficients in 5-second frames. In this way both time- and frequency-domain information are contained in the features to be passed to a clustering algorithm. Classification is performed using the k-means algorithm and then a k-nearest neighbors search. Different values of k are experimented with, in addition to different combinations of the available feature sets. Hypothesized class labels are 'primarily anthrophony' and 'primarily biophony', where the best class result conforming to the former label has 104 members after heuristic pruning. This demonstrates how a large audio dataset has been made more tractable with machine learning techniques, forming the foundation of a framework designed to acoustically monitor and gauge biological and anthropogenic activity in a marine environment.
Keywords: anthrophony, hydrophone, k-means, machine learning
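The clustering-then-classification pipeline this abstract describes (k-means over extracted audio features, followed by a k-nearest-neighbors search) can be sketched in miniature. The toy 2-D feature vectors and class labels below are invented stand-ins for the real spectral-centroid/filter-bank/MFCC features; this is an illustration of the technique, not the authors' implementation.

```python
import math

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def mean(pts):
    """Component-wise mean of a list of points."""
    return [sum(c) / len(pts) for c in zip(*pts)]

def kmeans(points, k, iters=50):
    """Plain Lloyd's algorithm: assign each point to its nearest
    centroid, then move each centroid to its cluster mean."""
    centroids = [list(p) for p in points[:k]]  # deterministic init
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: dist(p, centroids[c]))
            clusters[i].append(p)
        centroids = [mean(cl) if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

def knn_label(query, labelled, k=3):
    """k-nearest-neighbors majority vote over (point, label) pairs."""
    nearest = sorted(labelled, key=lambda pl: dist(query, pl[0]))[:k]
    labels = [lab for _, lab in nearest]
    return max(set(labels), key=labels.count)

# Toy feature vectors standing in for two kinds of recordings.
anthro = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25)]
bio = [(0.9, 0.8), (0.8, 0.9), (0.85, 0.75)]
centroids, clusters = kmeans(anthro + bio, k=2)
labelled = [(p, 'anthrophony') for p in anthro] + [(p, 'biophony') for p in bio]
print(knn_label((0.82, 0.84), labelled))  # -> biophony
```

In the paper's setting the points would be the concatenated time- and frequency-domain features per recording, and the hypothesized labels would come from the heuristic pruning step.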
Procedia PDF Downloads 170
1004 A Portable Cognitive Tool for Engagement Level and Activity Identification
Authors: Terry Teo, Sun Woh Lye, Yufei Li, Zainuddin Zakaria
Abstract:
Wearable devices such as electroencephalography (EEG) headsets hold immense potential in the monitoring and assessment of a person’s task engagement. This is especially so in remote or online sites. Research into their use in measuring an individual's cognitive state while performing task activities is therefore expected to increase. Despite the growing body of EEG research into the brain functioning activities of a person, key challenges remain in adopting EEG for real-time operations. These include limited portability, long preparation time, high channel dimensionality, intrusiveness, as well as the level of accuracy in acquiring neurological data. This paper proposes an approach using 4-6 EEG channels to determine the cognitive states of a subject when undertaking a set of passive and active monitoring tasks. Air traffic controller (ATC) dynamic tasks are used as a proxy. The work found that, when using the channel reduction and identifier algorithm, good trend adherence of 89.1% can be obtained between a commercially available 14-channel BCI Emotiv EPOC+ EEG headset and a carefully selected reduced set of 4-6 channels. The approach can also identify different levels of engagement activities, ranging from general monitoring to ad hoc and repeated active monitoring activities involving information search, extraction, and memory.
Keywords: assessment, neurophysiology, monitoring, EEG
Procedia PDF Downloads 75
1003 A New 3D Shape Descriptor Based on Multi-Resolution and Multi-Block CS-LBP
Authors: Nihad Karim Chowdhury, Mohammad Sanaullah Chowdhury, Muhammed Jamshed Alam Patwary, Rubel Biswas
Abstract:
In a content-based 3D shape retrieval system, achieving high search performance has become an important research problem. A challenging aspect of this problem is to find an effective shape descriptor which can discriminate similar shapes adequately. To address this problem, we propose a new shape descriptor for 3D shape models by combining multi-resolution with the multi-block center-symmetric local binary pattern (CS-LBP) operator. Given an arbitrary 3D shape, we first apply pose normalization and generate a set of multi-viewed 2D rendered images. Second, we apply a Gaussian multi-resolution filter to generate several levels of images from each 2D rendered image. Then, overlapped sub-images are computed for each image level of a multi-resolution image. Our unique multi-block CS-LBP comes next. It allows the center to be composed of m-by-n rectangular pixels, instead of a single pixel. This process is repeated for all the 2D rendered images, derived from both ‘depth-buffer’ and ‘silhouette’ rendering. Finally, we concatenate all the feature vectors into a one-dimensional histogram as our proposed 3D shape descriptor. Through several experiments, we demonstrate that our proposed 3D shape descriptor outperforms previous methods on a benchmark dataset.
Keywords: 3D shape retrieval, 3D shape descriptor, CS-LBP, overlapped sub-images
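The base operator behind this descriptor can be sketched as follows: plain CS-LBP compares the four center-symmetric neighbor pairs of a pixel's 8-neighborhood, giving a 4-bit code. This sketch shows the single-pixel operator only; the paper's multi-block variant, where the comparisons are made between m-by-n pixel blocks, and the multi-view rendering are not reproduced here.

```python
def cs_lbp(img, y, x, threshold=0):
    """Center-symmetric LBP code (0-15) at pixel (y, x): compare the
    4 center-symmetric pairs of the 8-neighborhood."""
    # 8 neighbors in circular order starting from the right neighbor
    offsets = [(0, 1), (-1, 1), (-1, 0), (-1, -1),
               (0, -1), (1, -1), (1, 0), (1, 1)]
    n = [img[y + dy][x + dx] for dy, dx in offsets]
    code = 0
    for i in range(4):                 # pair i is (n[i], n[i + 4])
        if n[i] - n[i + 4] > threshold:
            code |= 1 << i
    return code

# 3x3 grey-level patch; only the center (1, 1) has a full neighborhood.
patch = [[10, 90, 10],
         [90, 50, 10],
         [10, 10, 10]]
print(cs_lbp(patch, 1, 1))  # only the top-vs-bottom pair (90 vs 10) fires
```

A full descriptor would histogram these codes over the overlapped sub-images at every resolution level and concatenate the histograms.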
Procedia PDF Downloads 445
1002 Side Effects of Dental Whitening: Published Data from the Literature
Authors: Ilma Robo, Saimir Heta, Emela Dalloshi, Nevila Alliu, Vera Ostreni
Abstract:
The dental whitening process, beyond being a minimally invasive dental treatment, has effects on the dental structure, or on the pulp of the tooth, where it is applied. An electronic search was performed using keywords to find articles published within the last 10 years about the side effects, assessed as such, of minimally invasive dental bleaching treatment. Methodology: In the selected articles, a further aim of the study was to evaluate the side effects of bleaching according to the percentage and type of solution used, the latter evaluated on the basis of the bleaching solution applied. Results: The side effects of bleaching are evaluated in the selected articles depending on the method of bleaching application, whether carried out with recommended solutions or with mixtures of alternative solutions or substances based on information from the Internet. Short conclusion: The dental bleaching process has side effects which have not yet been definitively evaluated experimentally in large samples of individuals or animals (mice or cattle) so as to arrive at accurate numerical conclusions. The trend of publications on this topic has been increasing in recent years, as the demand for aesthetic facial treatments, including dental ones, increases.
Keywords: teeth whitening, side effects, permanent teeth, formed dental apex
Procedia PDF Downloads 63
1001 Internet Optimization by Negotiating Traffic Times
Authors: Carlos Gonzalez
Abstract:
This paper describes a system to optimize the use of the internet by clients requiring the downloading of videos at peak hours. The system consists of a web server belonging to a provider of video contents, a provider of internet communications, and a software application running on a client’s computer. The client using the application software will communicate to the video provider a list of the client’s future video demands. The video provider calculates which videos are going to be more in demand for download in the immediate future, and proceeds to request from the internet provider the most suitable hours to do the downloading. The download times will be sent to the application software, which will use the information on pre-established hours negotiated between the video provider and the internet provider to download those videos. The videos will be saved in a special protected section of the user’s hard disk, which will only be accessed by the application software on the client’s computer. When the client is ready to watch a video, the application will search the list of videos currently existing in that area of the hard disk; if the video exists, it will be used directly without the need for internet access. We found that the best way to optimize the download traffic of videos is by negotiation between the internet communication provider and the video content provider.
Keywords: internet optimization, video download, future demands, secure storage
Procedia PDF Downloads 136
1000 The Use Management of the Knowledge Management and the Information Technologies in the Competitive Strategy of a Self-Propelling Industry
Authors: Guerrero Ramírez Sandra, Ramos Salinas Norma Maricela, Muriel Amezcua Vanesa
Abstract:
This article presents the beginning of a wider study that intends to demonstrate how, within organizations of the automotive industry in the city of Querétaro, knowledge management and technological management are required, as well as people’s initiative and the interaction embedded in their interior, with an appropriate environment that facilitates information conversion with a wide information technologies management (ITM) range. A company was identified for the pilot study of this research, where descriptive and inferential research information was obtained. The results of the pilot suggest that some respondents did not identify the knowledge management topic, even though staff have access to information technology (IT) that serves to enhance access to knowledge (through the internet, email, databases, external and internal company personnel, suppliers, customers, and competitors); this implies that there are knowledge management (KM) problems. The data show that academically well-prepared organizations normally do not recognize the importance of knowledge in the business, nor in its implementation, which in the end greatly influences how it is managed; this should guide the company toward the search for a competitive strategy, given that the company has an excellent technological infrastructure and KM was not exploited. Cultural diversity is another factor that was observed among the staff.
Keywords: Knowledge Management (KM), Technological Knowledge Management (TKM), Technology Information Management (TI), access to knowledge
Procedia PDF Downloads 501
999 Energy Efficient Clustering with Adaptive Particle Swarm Optimization
Authors: Kumar Shashvat, Arshpreet Kaur, Rajesh Kumar, Raman Chadha
Abstract:
Wireless sensor networks have the principal characteristic of restricted energy, with the limitation that the energy of the nodes cannot be replenished. To increase the lifetime in this scenario, the WSN route for data transmission is chosen such that energy utilization along the selected route is minimal. For such an energy-efficient network, a sound infrastructure is needed because it affects the network lifespan. Clustering is a technique in which nodes are grouped into disjoint, non-overlapping sets, with data collected at the cluster head. In this paper, an Adaptive-PSO algorithm is proposed which forms energy-aware clusters by minimizing the cost of locating the cluster head. The main concern is the suitability of the swarms, achieved by adjusting the learning parameters of PSO. Particle Swarm Optimization converges quickly at the beginning of the search, but over the course of time it becomes stable and may be trapped in local optima. In the suggested network model, swarms are given the intelligence of spiders, which makes them capable of avoiding early convergence and also helps them escape from local optima. Comparison analysis with traditional PSO shows that the new algorithm considerably enhances performance when multi-dimensional functions are taken into consideration.
Keywords: Particle Swarm Optimization, adaptive PSO, comparison between PSO and A-PSO, energy efficient clustering
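The core update this abstract builds on can be sketched with a minimal PSO in which the inertia weight decays over time, a simple form of adaptive parameter control. The spider-inspired escape mechanism and the WSN cluster-head cost function are not reproduced here, so a generic test function stands in for the clustering cost.

```python
import random

def pso(objective, dim, n_particles=20, iters=200, seed=1):
    """Particle swarm minimisation with linearly decaying inertia."""
    rnd = random.Random(seed)
    pos = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # personal bests
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters                 # inertia decays 0.9 -> 0.4
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                vel[i][d] = (w * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-2.0, min(2.0, vel[i][d]))  # velocity clamp
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Generic sphere function standing in for the cluster-head placement cost.
best, best_val = pso(lambda p: sum(x * x for x in p), dim=3)
print(best_val)
```

In the paper's setting the objective would be the energy cost of a candidate cluster-head assignment, and the adaptive part would also modulate the acceleration coefficients.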
Procedia PDF Downloads 246
998 Support Vector Regression Combined with Different Optimization Algorithms to Predict Global Solar Radiation on Horizontal Surfaces in Algeria
Authors: Laidi Maamar, Achwak Madani, Abdellah El Ahdj Abdellah
Abstract:
The aim of this work is to use Support Vector Regression (SVR) combined with the dragonfly, firefly, bee colony, and particle swarm optimization algorithms to predict global solar radiation on horizontal surfaces in some cities in Algeria. Combining these optimization algorithms with SVR aims principally to enhance accuracy by fine-tuning the parameters, speeding up the convergence of the SVR model, and exploring a larger search space efficiently; these parameters are the regularization parameter (C), the kernel parameters, and the epsilon parameter. By doing so, the aim is to improve the generalization and predictive accuracy of the SVR model. Overall, the aim is to leverage the strengths of both SVR and optimization algorithms to create a more powerful and effective regression model for various cities and under different climate conditions. Results demonstrate close agreement between predicted and measured data in terms of different metrics. In summary, SVR has proven to be a valuable tool in modeling global solar radiation, offering accurate predictions and demonstrating versatility when combined with other algorithms or used in hybrid forecasting models.
Keywords: support vector regression (SVR), optimization algorithms, global solar radiation prediction, hybrid forecasting models
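The tuning loop those metaheuristics implement can be illustrated with plain random search, sampling the same three hyper-parameters the abstract names. Since neither the solar-radiation data nor a fitted SVR is available here, a synthetic validation-error surface with a known best region stands in for the real cross-validated error; everything in this sketch is therefore an assumption.

```python
import math
import random

def validation_error(C, eps, gamma):
    """Hypothetical error surface: in a real pipeline this would train
    an SVR with (C, epsilon, gamma) and return cross-validated error."""
    return ((math.log10(C) - 1.0) ** 2          # best C around 10
            + (eps - 0.1) ** 2                  # best epsilon around 0.1
            + (math.log10(gamma) + 1.0) ** 2)   # best gamma around 0.1

def random_search(n_trials=2000, seed=0):
    """Sample C and gamma log-uniformly, epsilon uniformly, keeping the
    best setting seen -- the skeleton that any of the swarm optimizers
    refines with guided moves instead of blind sampling."""
    rnd = random.Random(seed)
    best, best_err = None, float('inf')
    for _ in range(n_trials):
        C = 10 ** rnd.uniform(-2, 4)
        eps = rnd.uniform(0.0, 1.0)
        gamma = 10 ** rnd.uniform(-4, 2)
        e = validation_error(C, eps, gamma)
        if e < best_err:
            best, best_err = (C, eps, gamma), e
    return best, best_err

(best_C, best_eps, best_gamma), err = random_search()
print(best_C, best_eps, best_gamma, err)
```

Swapping the sampling loop for dragonfly, firefly, bee-colony, or PSO moves changes how candidate points are proposed, not the structure of the search.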
Procedia PDF Downloads 35
997 Teaching University Students Lateral Reading to Detect Disinformation and Misinformation
Authors: Diane Prorak, Perri Moreno
Abstract:
University students may have been born in the digital age, but they need to be taught the critical thinking skills to detect misinformation and social media manipulation online. In recent years, librarians have been active in designing instructional methods to help students learn information evaluation skills. At the University of Idaho Library (USA), librarians have developed new teaching methods for these skills. Last academic year, when classes were taught via Zoom, librarians taught these skills in an online session of each first-year rhetoric and composition course. In the Zoom sessions, students were placed in breakout groups where they practiced using an evaluation method known as lateral reading. Online collaborative software was used to give each group an evaluative task and break the task into steps. Groups reported back to the full class. Students learned to look at an information source, then search outside the source to find information about the organization, publisher, or author, before evaluating the source itself. Class-level pre- and post-test comparison results showed students learned better evaluation techniques than they knew before instruction.
Keywords: critical thinking, information evaluation, information literacy instruction, lateral reading
Procedia PDF Downloads 181
996 High Order Block Implicit Multi-Step (HOBIM) Methods for the Solution of Stiff Ordinary Differential Equations
Authors: J. P. Chollom, G. M. Kumleng, S. Longwap
Abstract:
The search for higher-order A-stable linear multi-step methods has been the interest of many numerical analysts and has been realized through either higher derivatives of the solution or by inserting additional off-step points, super-future points, and the like. These methods are suitable for the solution of stiff differential equations, which exhibit characteristics that place a severe restriction on the choice of step size. It becomes necessary that only methods with large regions of absolute stability remain suitable for such equations. In this paper, high-order block implicit multi-step methods of the hybrid form up to order twelve have been constructed using the multi-step collocation approach, by inserting one or more off-step points in the multi-step method. The accuracy and stability properties of the new methods are investigated and are shown to yield A-stable methods, a property desirable of methods suitable for the solution of stiff ODEs. The new high-order block implicit multi-step methods, used as block integrators, are tested on stiff differential systems, and the results reveal that the new methods are efficient and compete favourably with the state-of-the-art MATLAB ode23 code.
Keywords: block linear multistep methods, high order, implicit, stiff differential equations
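The step-size restriction that motivates A-stable methods can be seen on the scalar test problem y' = λy with λ = -1000: the explicit (forward) Euler update multiplies y by (1 + hλ), which for h = 0.01 equals -9 and diverges, while the implicit (backward) Euler update multiplies by 1/(1 - hλ) and decays for any h. This is only a first-order illustration of why implicitness matters for stiff systems, not the block multi-step construction of the paper.

```python
# Stiff scalar test equation y' = lam * y; the exact solution decays to 0.
lam, h, steps = -1000.0, 0.01, 50

y_exp = 1.0   # forward Euler:  y_{n+1} = (1 + h*lam) * y_n
y_imp = 1.0   # backward Euler: y_{n+1} = y_n / (1 - h*lam)
for _ in range(steps):
    y_exp *= (1 + h * lam)
    y_imp /= (1 - h * lam)

print(abs(y_exp))  # explodes: |1 + h*lam| = 9 > 1
print(abs(y_imp))  # decays:   |1 / (1 - h*lam)| = 1/11 < 1
```

Higher-order A-stable block methods aim to keep this unconditional stability while recovering the accuracy that backward Euler lacks.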
Procedia PDF Downloads 358
995 Change Detection of Vegetative Areas Using Land Use Land Cover Derived from NDVI of Desert Encroached Areas
Authors: T. Garba, T. O. Quddus, Y. Y. Babanyara, M. A. Modibbo
Abstract:
Desertification is defined as the transformation of productive land into desert as the result of the ruination of land by man-induced soil erosion, which forces farmers in the affected areas to migrate, or to encroach into reserved areas, in search of fertile land for their farming activities. This study therefore used remote sensing imageries to determine the level of changes in the vegetative areas. To achieve that, the Normalized Difference Vegetation Index (NDVI), classified imageries, and image slicing derived from Landsat TM 1986, Landsat ETM 1999, and NigeriaSat-1 2007 were used to determine changes in vegetation. From the classified imageries it was discovered that there is more natural vegetation in the classified image of 1986 than in those of 1999 and 2007. This finding also features in the three NDVI imageries: it was discovered that there is an increase in the high positive pixel value from 0.04 in 1986 to 0.22 in 1999 and to 0.32 in 2007. The figures in the three histograms also indicate that vegetative areas increased from 29.15 km2 in 1986 to 60.58 km2 in 1999 and then to 109 km2 in 2007. The study recommends, among other things, that there is a need to restore natural vegetation by discouraging farming activities in and around the natural vegetation in the study area.
Keywords: vegetative index, classified imageries, change detection, landsat, vegetation
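The NDVI computation and the image-slicing step can be sketched as follows; the 2x2 reflectance bands and the 0.2 vegetation threshold are invented for illustration, standing in for the Landsat/NigeriaSat bands and the thresholds used in the study.

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]; higher values
    indicate denser green vegetation."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def vegetated_area(nir_band, red_band, pixel_km2, threshold=0.2):
    """Slice the NDVI image at a threshold and sum pixel areas."""
    count = sum(1 for nir_row, red_row in zip(nir_band, red_band)
                for n, r in zip(nir_row, red_row)
                if ndvi(n, r) > threshold)
    return count * pixel_km2

# Toy 2x2 reflectance bands for two acquisition dates.
nir_1986 = [[0.5, 0.3], [0.2, 0.2]]
red_1986 = [[0.3, 0.25], [0.19, 0.18]]
nir_2007 = [[0.6, 0.5], [0.5, 0.2]]
red_2007 = [[0.2, 0.2], [0.25, 0.18]]

a86 = vegetated_area(nir_1986, red_1986, pixel_km2=1.0)
a07 = vegetated_area(nir_2007, red_2007, pixel_km2=1.0)
print(a86, a07)  # -> 1.0 3.0 : vegetated area per date, ready for change detection
```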
Procedia PDF Downloads 360
994 Provision of Afterschool Programs: Understanding the Educational Needs and Outcomes of Newcomer and Refugee Students in Canada
Authors: Edward Shizha, Edward Makwarimba
Abstract:
Newcomer and refugee youth feel excluded in the education system in Canada, and the formal education environment does not fully cater to their learning needs. The objective of this study was to build knowledge and understanding of the educational needs and experiences of these youth in Canada and of how available afterschool programs can most effectively support their learning needs and academic outcomes. Employment and Social Development Canada (ESDC), which funded this research, enables and empowers students to advance their educational experience through targeted investments in services that are delivered by youth-serving organizations outside the formal education system through afterschool initiatives. A literature review and a provincial/territorial internet scan were conducted to determine the availability of services and programs that serve the educational needs and academic outcomes of newcomer youth in the 10 provinces and 3 territories of Canada. The goal was to identify intersectional factors (e.g., gender, sexuality, culture, social class, race, etc.) that influence the educational outcomes of newcomer/refugee students and to recommend ways the ESDC could complement settlement services to enhance students’ educational success. First, data were collected through a literature search of various databases, including PubMed, Web of Science, Scopus, Google Docs, and ACADEMIA, as well as grey literature such as government documents, to inform our analysis. Second, a provincial/territorial internet scan was conducted using a template that was created by ESDC staff with the input of the researchers. The objective of the web-search scan was to identify afterschool programs, projects, and initiatives offered to newcomer/refugee youth by service provider organizations. The method for the scan included both qualitative and quantitative data gathering.
Both the literature review and the provincial/territorial scan revealed that there are gender disparities in the educational outcomes of newcomer and refugee youth. High school completion rates by gender show that boys are at higher risk of not graduating than girls and that girls are more likely than boys to have at least a high school diploma and more likely to proceed to postsecondary education. Findings from the literature reveal that afterschool programs are required for refugee youth who experience mental health challenges and miss out on significant periods of schooling, which affect attendance, participation, and graduation from high school. However, some refugee youth use their resilience and ambition to succeed in their educational outcomes. Another finding showed that some immigrant/refugee students, through ethnic organizations and familial affiliation, maintain aspects of their cultural values, parental expectations, and ambitious expectations for their own careers, and succeed in both high school and postsecondary education. The study found a significant combination of afterschool programs that include academic support, scholarships, bursaries, homework support, career readiness, internships, mentorship, tutoring, non-clinical counselling, mental health and social well-being support, language skills, volunteering opportunities, community connections, peer networking, culturally relevant services, etc. These programs assist newcomer youth in developing self-confidence and preparing for academic success and future career development. The study concluded that the advantages of afterschool programs are greatest for youth at risk of poor educational outcomes, such as Latino and Black youth, including 2SLGBTQI+ immigrant youth.
Keywords: afterschool programs, educational outcomes, newcomer youth, refugee youth, youth-serving organizations
Procedia PDF Downloads 74
993 Cellulose Acetate/Polyacrylic Acid Filled with Nano-Hydroxyapatite Composites: Spectroscopic Studies and Search for Biomedical Applications
Authors: E. M. AbdelRazek, G. S. ElBahy, M. A. Allam, A. M. Abdelghany, A. M. Hezma
Abstract:
A polymeric biocomposite of hydroxyapatite/polyacrylic acid was prepared, and its thermal and mechanical properties were improved by the addition of cellulose acetate. FTIR spectroscopy and X-ray diffraction analysis were employed to examine the physical and chemical characteristics of the biocomposites. Two organic/inorganic composite weight ratios (60/40 and 70/30), at which the material crystallinity reaches a considerable value appropriate for the needed applications, were studied; scanning electron microscopy revealed that the HAp nano-particles are uniformly distributed through the polymeric matrix. Kinetic parameters were determined from the weight-loss data using non-isothermal thermogravimetric analysis (TGA). Also, the main degradation steps were described and discussed. The mechanical properties of the composites were evaluated by measuring tensile strength and elastic modulus. The data indicate that the addition of cellulose acetate can make the homogeneous composite scaffold significantly resistant to higher stress. The elastic modulus of the composites was also improved by the addition of cellulose acetate, making them more appropriate for bioapplications.
Keywords: biocomposite, chemical synthesis, infrared spectroscopy, mechanical properties
Procedia PDF Downloads 457
992 An Optimal Algorithm for Finding (R, Q) Policy in a Price-Dependent Order Quantity Inventory System with Soft Budget Constraint
Authors: S. Hamid Mirmohammadi, Shahrazad Tamjidzad
Abstract:
This paper is concerned with a single-item continuous review inventory system in which demand is stochastic and discrete. The budget consumed for purchasing the ordered items is not restricted, but extra cost is incurred when it exceeds a specific value. The unit purchasing price depends on the quantity ordered under the all-units discounts cost structure. In many actual systems, the budget, as a resource occupied by the purchased items, is limited, and the system confronts resource shortage by incurring extra costs. Thus, considering the resource shortage costs as a part of system costs, especially when the amount of resource occupied by the purchased item is influenced by quantity discounts, is well motivated by practical concerns. In this paper, an optimization problem is formulated for finding the optimal (R, Q) policy when the system is influenced by a budget limitation and discount pricing simultaneously. Properties of the cost function are investigated, and then an algorithm based on a one-dimensional search procedure is proposed for finding an optimal (R, Q) policy which minimizes the expected system costs.
Keywords: (R, Q) policy, stochastic demand, backorders, limited resource, quantity discounts
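A one-dimensional search of the kind the algorithm builds on can be sketched with golden-section search over the order quantity. The EOQ-style cost and its parameters below are invented for illustration and are much simpler than the paper's expected-cost function with discounts and budget penalties.

```python
import math

def golden_section_min(f, lo, hi, tol=1e-6):
    """Locate the minimiser of a unimodal function on [lo, hi] by
    shrinking the bracket with the golden ratio."""
    invphi = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):          # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                    # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Illustrative EOQ-style cost in Q: ordering cost plus holding cost.
K, D, h = 100.0, 1000.0, 2.0                  # toy parameters
cost = lambda Q: K * D / Q + h * Q / 2
q_star = golden_section_min(cost, 1.0, 2000.0)
print(q_star)  # analytic optimum is sqrt(2*K*D/h) ~ 316.23
```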
Procedia PDF Downloads 641
991 Social Work Profession in a Mirror of the Russian Immigrant Media in Israel
Authors: Natalia Khvorostianov, Nelly Elias
Abstract:
The present study seeks to analyze the representation of social work in immigrant media, focusing on the case of online newspapers established by immigrants from the Former Soviet Union (FSU) in Israel. This immigrant population is particularly interesting because social work did not exist as a practiced profession in the USSR, and hence most FSU immigrants arrive in Israel without a basic knowledge of the essence of social work, the services it provides, and the logic behind its treatment methods. The sample of 37 items was built through a Google search of Russian online newspapers and portals originating in Israel, using keywords such as “social worker,” “social work services” and the like. All items were analyzed using qualitative content analysis. The principal analytical categories used for the analysis were: assessment of social work services (negative, positive, neutral); social workers’ professionalism and effectiveness; goals and motives underlying their activity; cross-cultural contact with immigrants; and methods used in working with immigrants. On this basis, four dominant images used to portray Israeli social work services and social workers were identified: lack of professionalism, cultural gaps between FSU immigrants and Israeli social workers, the repressive character of social work services, and social workers’ involvement in corruption and crime.
Keywords: FSU immigrants, immigrant media, media images, social workers
Procedia PDF Downloads 357
990 Utilizing Extended Reality in Disaster Risk Reduction Education: A Scoping Review
Authors: Stefano Scippo, Damiana Luzzi, Stefano Cuomo, Maria Ranieri
Abstract:
Background: In response to the rise in natural disasters linked to climate change, numerous studies on Disaster Risk Reduction Education (DRRE) have emerged since the '90s, mainly using a didactic transmission-based approach. Effective DRRE should align with an interactive, experiential, and participatory educational model, which can be costly and risky. A potential solution is using simulations facilitated by eXtended Reality (XR). Research Question: This study aims to conduct a scoping review to explore educational methodologies that use XR to enhance knowledge among teachers, students, and citizens about environmental risks, natural disasters (including climate-related ones), and their management. Method: A search string of 66 keywords was formulated, spanning three domains: 1) education and target audience, 2) environment and natural hazards, and 3) technologies. On June 21st, 2023, the search string was used across five databases: EBSCOhost, IEEE Xplore, PubMed, Scopus, and Web of Science. After deduplication and removing papers without abstracts, 2,152 abstracts (published between 2013 and 2023) were analyzed and 2,062 papers were excluded, followed by the exclusion of 56 papers after full-text scrutiny. Excluded studies focused on unrelated technologies, non-environmental risks, and lacked educational outcomes or accessible texts. Main Results: The 34 reviewed papers were analyzed for context, risk type, research methodology, learning objectives, XR technology use, outcomes, and educational affordances of XR. Notably, since 2016, there has been a rise in scientific publications, focusing mainly on seismic events (12 studies) and floods (9), with a significant contribution from Asia (18 publications), particularly Japan (7 studies). Methodologically, the studies were categorized into empirical (26) and non-empirical (8). 
Empirical studies involved user or expert validation of XR tools, while non-empirical studies included systematic reviews and theoretical proposals without experimental validation. Empirical studies were further classified into quantitative, qualitative, or mixed-method approaches. Six qualitative studies involved small groups of users or experts, while 20 quantitative or mixed-method studies used seven different research designs, with most (17) employing a quasi-experimental, one-group post-test design, focusing on XR technology usability over educational effectiveness. Non-experimental studies had methodological limitations, making their results hypothetical and in need of further empirical validation. Educationally, the learning objectives centered on knowledge and skills for surviving natural disaster emergencies. All studies recommended XR technologies for simulations or serious games but did not develop comprehensive educational frameworks around these tools. XR-based tools showed potential superiority over traditional methods in teaching risk and emergency management skills. However, conclusions were more valid in studies with experimental designs; otherwise, they remained hypothetical without empirical evidence. The educational affordances of XR, mainly user engagement, were confirmed by the studies. Authors’ Conclusions: The analyzed literature lacks specific educational frameworks for XR in DRRE, focusing mainly on survival knowledge and skills. There is a need to expand educational approaches to include uncertainty education, developing competencies that encompass knowledge, skills, and attitudes like risk perception.
Keywords: disaster risk reduction education, educational technologies, scoping review, XR technologies
Procedia PDF Downloads 24
989 System of Quality Automation for Documents (SQAD)
Authors: R. Babi Saraswathi, K. Divya, A. Habeebur Rahman, D. B. Hari Prakash, S. Jayanth, T. Kumar, N. Vijayarangan
Abstract:
Document automation is the design of systems and workflows that assemble repetitive documents to meet specific business needs. In any organization or institution, documenting employees’ information is very important for both employees and management, as it shows an individual’s progress to the management. Many employee documents exist only on paper, which makes them difficult to organize, and retrieving a specific document for future reference takes considerable time. It is also tedious to generate reports on demand, and obtaining approvals makes the process even more difficult and weakens its security. This project overcomes the above-stated issues. By storing the details in a database and maintaining e-documents, the automation system reduces manual work to a large extent. The approval process for important documents can then be carried out in a much more secure manner using digital signatures and encryption techniques. Details are maintained in the database, e-documents are stored in specific folders, and various kinds of reports can be generated. Moreover, an efficient search method is implemented in the database. Automation that supports document maintenance is useful in many respects: it minimizes data entry, reduces the time spent on proofreading, avoids duplication, and reduces the risks associated with manual error.
Keywords: e-documents, automation, digital signature, encryption
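Where the abstract mentions securing approvals with digital signatures, the core idea of binding an approval tag to a document's content can be sketched in a few lines. This is a simplified sketch using Python's standard `hmac`/`hashlib` modules with a hypothetical shared key in place of a full public-key digital signature; the function names and key are illustrative assumptions, not the SQAD implementation.

```python
import hashlib
import hmac

# Hypothetical shared key; a real approval system would use public-key signatures.
SECRET_KEY = b"approval-department-key"

def sign_document(content: bytes, key: bytes = SECRET_KEY) -> str:
    """Return a hex tag binding the approver's key to the document content."""
    return hmac.new(key, content, hashlib.sha256).hexdigest()

def verify_document(content: bytes, tag: str, key: bytes = SECRET_KEY) -> bool:
    """Re-compute the tag and compare in constant time to detect tampering."""
    return hmac.compare_digest(sign_document(content, key), tag)

doc = b"Employee leave request #1042: approved"
tag = sign_document(doc)
assert verify_document(doc, tag)             # untampered document verifies
assert not verify_document(doc + b"!", tag)  # any edit invalidates the tag
```

Any change to the document body produces a different tag, so an approval recorded against the original content cannot be silently transferred to an altered document.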
Procedia PDF Downloads 391
988 A Comparison between Underwater Image Enhancement Techniques
Authors: Ouafa Benaida, Abdelhamid Loukil, Adda Ali Pacha
Abstract:
In recent years, scientists’ growing interest in processing and analyzing underwater images and videos has been reinforced by the emergence of new underwater exploration techniques, such as autonomous underwater vehicles and underwater image sensors, which facilitate the exploration of underwater mineral resources as well as biologists’ search for new species of aquatic life. Underwater images and videos suffer from several defects and must be preprocessed before analysis. Underwater scenes are usually darkened by the interaction of light with the marine environment: light is absorbed as it travels through deep water, depending on its wavelength. Additionally, light does not follow a straight path but is scattered by microparticles in the water, resulting in low contrast, low brightness, color distortion, and restricted visibility. Enhancing underwater images is therefore essential to facilitate their analysis. The research presented in this paper implements and evaluates a set of classical techniques for improving underwater image quality in several color representation spaces. These methods have the advantage of being simple to implement and requiring no prior knowledge of the physical model underlying the degradation.
Keywords: underwater image enhancement, histogram normalization, histogram equalization, contrast limited adaptive histogram equalization, single-scale retinex
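As a concrete instance of the classical techniques named in the keywords, the following is a minimal sketch of global histogram equalization in NumPy. The synthetic low-contrast patch stands in for a dark underwater image and is an assumption for illustration only.

```python
import numpy as np

def equalize_histogram(img: np.ndarray) -> np.ndarray:
    """Global histogram equalization for a single-channel uint8 image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # CDF value at the darkest occupied grey level
    # Map each grey level so the output histogram is approximately uniform.
    lut = np.clip(np.round((cdf - cdf_min) / (img.size - cdf_min) * 255),
                  0, 255).astype(np.uint8)
    return lut[img]

# A dark, low-contrast synthetic patch: values squeezed into [40, 80].
rng = np.random.default_rng(0)
dark = rng.integers(40, 81, size=(64, 64)).astype(np.uint8)
enhanced = equalize_histogram(dark)
print(dark.min(), dark.max(), enhanced.min(), enhanced.max())
```

After the mapping, the occupied grey levels are stretched across the full [0, 255] range, which is exactly the contrast gain these classical methods provide on darkened underwater scenes.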
Procedia PDF Downloads 89
987 Image Inpainting Model with Small-Sample Size Based on Generative Adversary Network and Genetic Algorithm
Authors: Jiawen Wang, Qijun Chen
Abstract:
The performance of most machine-learning methods for image inpainting depends on the quantity and quality of the training samples. However, it is very expensive or even impossible to obtain a large number of training samples in many scenarios. In this paper, an image inpainting model based on a generative adversarial network (GAN) is constructed for cases where the number of training samples is small. Firstly, a feature extraction network (F-net) is incorporated into the GAN to utilize the available information of the inpainting image; the weighted sum of the extracted feature and random noise acts as the input to the generative network (G-net). The proposed network can be trained well even when the sample size is very small. Secondly, in the completion phase for each damaged image, a genetic algorithm is designed to search for an optimized noise input for G-net; based on this optimized input, the parameters of the G-net and F-net are further learned (once the completion of a given damaged image ends, the parameters are restored to the original values obtained in the training phase) to generate an image patch that not only fills the missing part of the damaged image smoothly but also has visual semantics.
Keywords: image inpainting, generative adversarial nets, genetic algorithm, small-sample size
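The genetic search over noise inputs described above can be sketched as follows. The "generator" here is a stand-in fixed random linear map, not a trained G-net, and the mask, population size, and operators are illustrative assumptions; the fitness function scores a candidate's agreement with the observed (undamaged) pixels only, which mirrors how the paper's search is guided by the known part of the image.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a trained G-net: a fixed random linear "generator" mapping a
# 16-dim noise vector to an 8x8 patch (an assumption for illustration only).
W = rng.standard_normal((64, 16)) / 4

def generate(z):
    return np.tanh(W @ z).reshape(8, 8)

target = generate(rng.standard_normal(16))  # pretend this is the true patch
mask = np.ones((8, 8), bool)
mask[2:6, 2:6] = False                      # central hole: pixels are unknown

def fitness(z):
    """Agreement with the observed pixels only (the hole is ignored)."""
    return -np.sum((generate(z)[mask] - target[mask]) ** 2)

# Generational GA over noise vectors: truncation selection, one-point
# crossover, sparse Gaussian mutation, with the best individual kept (elitism).
pop = rng.standard_normal((40, 16))
baseline = max(pop, key=fitness)            # best purely random guess
for _ in range(60):
    scores = np.array([fitness(z) for z in pop])
    parents = pop[np.argsort(scores)[-20:]]
    kids = [parents[-1].copy()]             # elitism
    while len(kids) < 40:
        a, b = parents[rng.integers(20, size=2)]
        cut = int(rng.integers(1, 16))
        child = np.concatenate([a[:cut], b[cut:]])          # one-point crossover
        child += 0.1 * rng.standard_normal(16) * (rng.random(16) < 0.2)  # mutation
        kids.append(child)
    pop = np.array(kids)

best = max(pop, key=fitness)
```

Because only the observed pixels enter the fitness, the evolved noise vector fills the hole with whatever the generator produces for the best-matching input, which is the mechanism the paper relies on.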
Procedia PDF Downloads 130
986 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic
Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi
Abstract:
In the genomics field, huge amounts of data are produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it has been postulated that more than one billion bases will be produced per year by 2020. The growth rate of the produced data is much faster than Moore's law in computer technology, which makes genomics data harder to deal with in terms of storing data, searching information, and finding hidden information; an analysis platform for genomics big data is therefore required. Recently developed cloud computing enables us to deal with big data more efficiently. Hadoop is a distributed computing framework that can serve as the core of Big Data as a Service (BDaaS). Although many services, e.g., Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g., sequencing data. Our algorithm consists of two parts: first, BDaaS is applied to handle the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We discuss our algorithm and its feasibility.
Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing
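The MapReduce half of the proposed hybrid can be illustrated in plain Python, with no Hadoop cluster assumed: a map phase emits (k-mer, 1) pairs for each sequencing read, and a reduce phase sums the counts per key. The toy reads and k = 3 are illustrative assumptions; on Hadoop, the mapper and reducer would run in parallel across data nodes.

```python
from collections import defaultdict
from itertools import chain

def mapper(read, k=3):
    """Map phase: emit (k-mer, 1) pairs for one sequencing read."""
    return [(read[i:i + k], 1) for i in range(len(read) - k + 1)]

def reducer(pairs):
    """Reduce phase: sum the counts emitted for each k-mer key."""
    counts = defaultdict(int)
    for kmer, n in pairs:
        counts[kmer] += n
    return dict(counts)

reads = ["ATGCGT", "GCGTAA", "ATGAAT"]  # toy reads standing in for NGS data
kmer_counts = reducer(chain.from_iterable(mapper(r) for r in reads))
print(kmer_counts["GCG"])  # → 2
```

K-mer counting of this shape underlies both de novo assembly and similarity search, the two applications the abstract targets.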
Procedia PDF Downloads 299
985 A Model for Diagnosis and Prediction of Coronavirus Using Neural Network
Authors: Sajjad Baghernezhad
Abstract:
Metaheuristic and hybrid algorithms have shown strong capability in modeling medical problems. In this study, a neural network was used to predict COVID-19 among high-risk and low-risk patients. Data were collected from a target population of 550 high-risk and low-risk patients at the Kerman University of Medical Sciences medical center. A memetic algorithm, which combines a genetic algorithm with a local search algorithm, was used to update the weights of the neural network and improve its accuracy. The initial evaluation showed that the accuracy of the neural network was 88%; after updating the weights with the memetic algorithm, it increased to 93%. For the proposed model, the sensitivity, specificity, positive predictive value, accuracy, and overall index were 97.4, 92.3, 95.8, 96.2, and 0.918, respectively; for the genetic algorithm model, they were 87.05, 92.07, 89.45, 97.30, and 0.967; and for the logistic regression model, 87.40, 95.20, 93.79, 0.87, and 0.916. Based on the findings of this study, neural network models have a lower error rate than the regression model in diagnosing patients from individual variables and vital signs. These findings can help planners and health care providers in designing programs for the early diagnosis of COVID-19.
Keywords: COVID-19, decision support technique, neural network, genetic algorithm, memetic algorithm
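A memetic weight search of the kind described, i.e., a genetic algorithm whose offspring are additionally refined by local hill climbing, can be sketched on a toy problem. The single logistic neuron, the synthetic two-feature "patient" data, and all hyperparameters below are illustrative assumptions, not the study's 550-patient model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "patient" data (an assumption): two features in [0, 2], label = high
# risk iff the features sum to more than 1, so the classes are separable.
X = rng.random((80, 2)) * 2
y = (X.sum(axis=1) > 1).astype(float)

def accuracy(w):
    """Accuracy of a single logistic neuron with weights w = (w1, w2, bias)."""
    p = 1 / (1 + np.exp(-(X @ w[:2] + w[2])))
    return float(np.mean((p > 0.5) == y))

def local_search(w, step=0.2, iters=20):
    """The 'memetic' step: greedy hill climbing around one individual."""
    best = w
    for _ in range(iters):
        cand = best + step * rng.standard_normal(3)
        if accuracy(cand) >= accuracy(best):
            best = cand
    return best

pop = list(rng.standard_normal((20, 3)))
baseline = max(pop, key=accuracy)  # best purely random weight vector
for _ in range(15):
    pop = sorted(pop, key=accuracy, reverse=True)[:10]          # selection
    children = [local_search((a + b) / 2 + 0.3 * rng.standard_normal(3))
                for a, b in zip(pop, pop[1:] + pop[:1])]        # blend + refine
    pop = pop + children

best = max(pop, key=accuracy)
```

The local refinement of each child is what distinguishes a memetic algorithm from a plain genetic algorithm, and it is the step the study credits for lifting the network's accuracy.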
Procedia PDF Downloads 66
984 A Metaheuristic Approach for Optimizing Perishable Goods Distribution
Authors: Bahare Askarian, Suchithra Rajendran
Abstract:
Maintaining the freshness and quality of perishable goods during distribution is a critical challenge for logistics companies. This study presents a comprehensive framework for optimizing the distribution of perishable goods through a mathematical model of the Transportation Inventory Location Routing Problem (TILRP). The model incorporates the impact of product age on customer demand, addressing the complexities associated with inventory management and routing. To tackle this problem, we develop both simple and hybrid metaheuristic algorithms designed for small- and medium-scale scenarios. The hybrid algorithm combines Biogeography-Based Optimization (BBO) with local search techniques to enhance performance and extend the approach to larger-scale challenges. Through extensive numerical simulations and sensitivity analyses across various scenarios, the performance of the proposed algorithms is evaluated, assessing their effectiveness in achieving optimal solutions. The results demonstrate that our algorithms significantly enhance distribution efficiency, offering valuable insights for logistics companies striving to improve their perishable goods supply chains.
Keywords: perishable goods, meta-heuristic algorithm, vehicle problem, inventory models
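The BBO hybrid itself is beyond a short sketch, but the general pattern it relies on, pairing a constructive routing heuristic with a local search improvement phase, can be illustrated with a nearest-neighbour tour refined by 2-opt. The depot and customer coordinates are assumed for illustration and are not a TILRP instance.

```python
import math
from itertools import combinations

# Hypothetical depot (index 0) and customer coordinates.
points = [(0, 0), (2, 7), (5, 1), (6, 6), (1, 3), (8, 2), (4, 9)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_length(route):
    return sum(dist(points[route[i]], points[route[i + 1]])
               for i in range(len(route) - 1))

# Constructive phase: nearest-neighbour tour starting from the depot.
unvisited = set(range(1, len(points)))
route = [0]
while unvisited:
    nxt = min(unvisited, key=lambda j: dist(points[route[-1]], points[j]))
    route.append(nxt)
    unvisited.remove(nxt)
route.append(0)  # return to the depot

# Improvement phase: 2-opt local search, reversing segments while it helps.
improved = True
while improved:
    improved = False
    for i, j in combinations(range(1, len(route) - 1), 2):
        cand = route[:i] + route[i:j + 1][::-1] + route[j + 1:]
        if route_length(cand) < route_length(route) - 1e-9:
            route, improved = cand, True

print(round(route_length(route), 2))
```

In the hybrid of the paper, the metaheuristic (BBO) plays the role of the constructive/global phase and local search plays the same improvement role as 2-opt does here.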
Procedia PDF Downloads 19
983 Isolation and Chemical Characterization of Residual Lignin from Areca Nut Shells
Authors: Dipti Yadav, Latha Rangan, Pinakeswar Mahanta
Abstract:
Recent fuel-development strategies to reduce oil dependency, mitigate greenhouse gas emissions, and utilize domestic resources have generated interest in the search for alternative fuel supplies. Bioenergy production from lignocellulosic biomass has great potential. Cellulose, hemicellulose, and lignin are the main constituents of wood and agro-waste, and in the related industries there are always leftover or waste products, mainly lignin. Due to the heterogeneous nature of wood and pulp fibers, and the heterogeneity that exists between individual fibers, no method is currently available for the quantitative isolation of native or residual lignin without the risk of structural changes during the isolation. The potential benefits of finding alternative uses for lignin are extensive, and with a double effect: lignin can replace fossil-based raw materials in a wide range of products, from plastics to individual chemical products, activated carbon, motor fuels, and carbon fibers. Furthermore, if there is a market for lignin in such value-added products, the mills will also have an additional economic incentive to take measures for higher energy efficiency. In this study, residual lignin was isolated from areca nut shells by acid hydrolysis, analyzed and characterized by Fourier transform infrared (FTIR) spectroscopy and LC-MS, and the complexity of its structure was investigated by NMR.
Keywords: Areca nut, Lignin, wood, bioenergy
Procedia PDF Downloads 474
982 Data Structure Learning Platform to Aid in Higher Education IT Courses (DSLEP)
Authors: Estevan B. Costa, Armando M. Toda, Marcell A. A. Mesquita, Jacques D. Brancher
Abstract:
The advances in technology over the last five years have allowed improvements in the educational area, such as an increase in the development of educational software. One technique that emerged in this period is Gamification: the use of video game mechanics outside game contexts. Recent studies involving this technique have reported positive results in many areas, such as marketing, health, and education. In education, studies cover everything from elementary to higher education, with many variations to suit educators’ methodologies. In higher education, focusing on IT courses, data structures are an important subject taught in many of these courses, as they are the basis for many systems. Based on the above, this paper presents the development of an interactive web learning environment, called DSLEP (Data Structure Learning Platform), to aid students in higher education IT courses. The system includes the basic concepts of this subject, such as stacks, queues, lists, arrays, and trees, and was implemented to ease the insertion of new structures. It also implements gamification concepts, such as points, levels, and leaderboards, to engage students in the search for knowledge and stimulate self-learning.
Keywords: gamification, interactive learning environment, data structures, e-learning
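Two of the structures the platform covers, stacks and queues, can be summarized in a few lines; this is a generic Python sketch of the concepts the students practice, not DSLEP's web implementation.

```python
from collections import deque

class Stack:
    """LIFO structure: push and pop happen at the same end."""
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        return self._items.pop()      # removes the most recently pushed item
    def is_empty(self):
        return not self._items

class Queue:
    """FIFO structure: enqueue at the back, dequeue from the front."""
    def __init__(self):
        self._items = deque()
    def enqueue(self, item):
        self._items.append(item)
    def dequeue(self):
        return self._items.popleft()  # removes the oldest item
    def is_empty(self):
        return not self._items

s, q = Stack(), Queue()
for x in (1, 2, 3):
    s.push(x)
    q.enqueue(x)
print(s.pop(), q.dequeue())  # → 3 1
```

The contrast in the last line, the stack returning the newest item and the queue the oldest, is exactly the distinction such a platform needs to make vivid to students.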
Procedia PDF Downloads 494
981 Gariep Dam Basin Management for Satisfying Ecological Flow Requirements
Authors: Dimeji Abe, Nonso Okoye, Gideon Ikpimi, Prince Idemudia
Abstract:
Multi-reservoir operation optimization has been a critical issue in river basin management. Water, as a scarce resource, is in high demand, and the problems associated with reservoirs as storage facilities are enormous. The complexity of balancing the supply and demand of this prime resource has created the need to examine how best to solve the problem using optimization techniques. The objective of this study is to evaluate the performance of a multi-objective metaheuristic algorithm for operating the Gariep Dam while satisfying ecological flow requirements. The study uses an evolutionary algorithm called the backtracking search algorithm (BSA) to determine how best to optimize the dam's operations for hydropower production, flood control, and water supply without compromising the environmental flow requirement needed for the survival of aquatic species and for sustaining life downstream of the dam. To achieve this objective, dam operations corresponding to different tradeoffs between the objectives are optimized. The results identify the best solution from the algorithm that satisfies all the objectives without any constraint violation. It is expected that hydropower generation will improve and that more water will be available for ecological flow requirements with the use of the algorithm, which also provides farmers with more irrigation water to improve their businesses.
Keywords: BSA evolutionary algorithm, metaheuristics, optimization, river basin management
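The shape of such a reservoir-operation trade-off can be shown with a much-simplified sketch: a weighted-sum score over monthly releases, improved by random hill climbing rather than the BSA itself. The inflows, storage bounds, ecological minimum, and penalty weights are all assumed for illustration and do not describe the Gariep Dam.

```python
import random

random.seed(0)

INFLOW = [90, 80, 70, 60, 50, 40, 40, 50, 60, 70, 80, 90]  # hypothetical monthly inflows
S0, S_MIN, S_MAX = 500, 200, 900  # assumed initial storage and storage bounds
ECO_MIN = 30                      # assumed minimum ecological release per month

def evaluate(releases):
    """Weighted-sum score: reward energy (proportional to release),
    penalise ecological-flow shortfalls and storage-bound violations."""
    storage, score = S0, 0.0
    for inflow, r in zip(INFLOW, releases):
        storage += inflow - r
        score += r                            # hydropower proxy
        score -= 50 * max(0, ECO_MIN - r)     # ecological shortfall penalty
        score -= 10 * (max(0, S_MIN - storage) + max(0, storage - S_MAX))
    return score

# Hill climbing from a feasible baseline schedule (a stand-in for the BSA).
best = [65.0] * 12
best_score = evaluate(best)
for _ in range(3000):
    cand = [max(0.0, r + random.uniform(-15, 15)) for r in best]
    s = evaluate(cand)
    if s >= best_score:
        best, best_score = cand, s

print(round(best_score, 1))
```

Raising the ecological penalty weight shifts the optimum toward schedules that always meet the minimum flow, which is the trade-off the study resolves with the BSA instead of this toy search.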
Procedia PDF Downloads 245
980 Comparison of Sensitivity and Specificity of Pap Smear and Polymerase Chain Reaction Methods for Detection of Human Papillomavirus: A Review of Literature
Authors: M. Malekian, M. E. Heydari, M. Irani Estyar
Abstract:
Human papillomavirus (HPV) is one of the most common sexually transmitted infections and the main cause of cervical cancer. With early diagnosis and treatment in health care services, cervical cancer and its complications are considered preventable. This study aimed to compare the efficiency, sensitivity, and specificity of the Pap smear and the polymerase chain reaction (PCR) in detecting HPV. A literature search was performed in Google Scholar, PubMed, and SID databases using the keywords 'human papillomavirus', 'pap smear', and 'polymerase chain reaction' to identify studies comparing the two detection methods; no restrictions were applied. Ten studies were included in this review. All samples that were positive by Pap smear were also positive by PCR; however, PCR detected positive samples that the Pap smear had missed, and in all studies many positive samples were missed by the Pap smear technique. Although the Pap smear had high specificity, PCR-based HPV detection was the more sensitive method. Given the Pap smear's high error rate in detection, PCR-based diagnostic methods should be combined with the Pap smear to improve detection quality and achieve the best possible results.
Keywords: human papillomavirus, cervical cancer, pap smear, polymerase chain reaction
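The two metrics being compared are defined directly from confusion-matrix counts: sensitivity is the share of true positives detected, specificity the share of true negatives correctly cleared. The sketch below computes them for hypothetical screening counts; the numbers are illustrative and not taken from the reviewed studies.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Hypothetical counts with PCR as the reference and Pap smear as the index
# test: the Pap smear misses 30 of 100 PCR-positive samples but rarely
# flags negatives, matching the high-specificity/low-sensitivity pattern.
m = diagnostic_metrics(tp=70, fp=5, fn=30, tn=395)
print(m["sensitivity"], m["specificity"])  # → 0.7 0.9875
```

A test can thus be highly specific yet still miss many true infections, which is why the review recommends pairing the Pap smear with PCR.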
Procedia PDF Downloads 131
979 Detecting Port Maritime Communities in Spain with Complex Network Analysis
Authors: Nicanor Garcia Alvarez, Belarmino Adenso-Diaz, Laura Calzada Infante
Abstract:
In recent years, researchers have shown an interest in modelling maritime traffic as a complex network. In this paper, we propose a bipartite weighted network to model maritime traffic and detect port maritime communities. The bipartite weighted network considers two different types of nodes: the first represents Spanish ports, while the second represents the countries with which there is major import/export activity. The flow between the two types of nodes is modeled by weighting the volume of product transported. To illustrate the model, the data is segmented by type of traffic. This allows fine-tuning and the creation of communities for each type of traffic, and therefore the identification of similar ports for a specific type of traffic, providing decision-makers with tools to search for alliances or identify their competitors. The traffic with the greatest impact on the Spanish gross domestic product is selected, and the evolution of the communities formed by the most important ports, and their differences between 2009 and 2019, is analyzed. Finally, the set of communities formed by the ports of the Spanish port system is inspected to determine global similarities between them, analyzing the sum of the memberships of the different ports in the communities formed for each type of traffic.
Keywords: bipartite networks, competition, infomap, maritime traffic, port communities
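Community detection with Infomap is out of scope for a short sketch, but the underlying idea, that ports belong together when their country-traffic profiles are similar, can be illustrated by comparing weighted bipartite neighbourhoods directly. The ports, countries, and volumes below are invented for illustration.

```python
from math import sqrt

# Hypothetical export volumes: port -> {country: tonnes} (illustrative only).
traffic = {
    "Valencia":  {"China": 120, "USA": 80, "Morocco": 40},
    "Barcelona": {"China": 100, "USA": 90, "Italy": 30},
    "Bilbao":    {"UK": 70, "USA": 20, "Morocco": 10},
}

def cosine_similarity(p, q):
    """Similarity of two ports' country-traffic profiles in the bipartite network."""
    countries = set(p) | set(q)
    dot = sum(p.get(c, 0) * q.get(c, 0) for c in countries)
    norm = (sqrt(sum(v * v for v in p.values()))
            * sqrt(sum(v * v for v in q.values())))
    return dot / norm

sim_vb = cosine_similarity(traffic["Valencia"], traffic["Barcelona"])
sim_vx = cosine_similarity(traffic["Valencia"], traffic["Bilbao"])
assert sim_vb > sim_vx  # Valencia trades more like Barcelona than like Bilbao
```

Ports with high pairwise profile similarity are exactly the ones a community-detection algorithm such as Infomap tends to group together, supporting the alliance/competitor reading of the communities.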
Procedia PDF Downloads 148