Search results for: ArcMap mapping
386 Genetic Characterization of a Composite Transposon Carrying armA and Aac(6)-Ib Genes in an Escherichia coli Isolate from Egypt
Authors: Omneya M. Helmy, Mona T. Kashef
Abstract:
Aminoglycosides are used in treating a wide range of infections caused by both Gram-negative and Gram-positive bacteria. The presence of 16S rRNA methyltransferases (16S-RMTases) is among the newly discovered resistance mechanisms that confer high resistance to clinically useful aminoglycosides. Cephalosporins are the most commonly used antimicrobials in Egypt; therefore, this study was conducted to determine the isolation frequency of 16S rRNA methyltransferases among third-generation cephalosporin-resistant clinical isolates in Egypt. One hundred and twenty-three cephalosporin-resistant Gram-negative clinical isolates were screened for aminoglycoside resistance by the Kirby-Bauer disk diffusion method and tested for possible production of 16S-RMTases. PCR testing and sequencing were used to confirm the presence of 16S-RMTases and the associated antimicrobial resistance determinants, as well as the genetic region surrounding the armA gene. Out of 123 isolates, 66 (53.66%) were resistant to at least one aminoglycoside antibiotic. Only one Escherichia coli isolate (E9ECMO), which was resistant to all tested aminoglycosides, was confirmed to carry the armA gene in association with the blaTEM-1, blaCTX-M-15, blaCTX-M-14, and aac(6)-Ib genes. The armA gene was found to be carried on a large A/C plasmid. Genetic mapping of the region surrounding armA revealed, for the first time, the association of armA with aac(6)-Ib on the same transposon. In conclusion, the isolation frequency of 16S-RMTases was low among the tested cephalosporin-resistant clinical samples. However, a novel composite transposon conferring high-level aminoglycoside resistance was detected.
Keywords: aminoglycosides, armA gene, β-lactamases, 16S rRNA methyltransferases
Procedia PDF Downloads 282
385 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis
Authors: Meng Su
Abstract:
High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis
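The embedding step described above can be sketched in a minimal, illustrative form; the kernel bandwidth `epsilon` and diffusion time `t` below are arbitrary choices for the sketch, not values from the article:

```python
import numpy as np

def diffusion_map(X, epsilon=1.0, n_components=2, t=1):
    """Embed points X (n_samples, n_features) with a basic diffusion map."""
    # Pairwise squared distances and Gaussian kernel
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / epsilon)
    # Row-normalize to a Markov transition matrix P = D^{-1} W
    P = W / W.sum(axis=1, keepdims=True)
    # P is similar to a symmetric matrix, so its eigenvalues are real
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial constant eigenvector (eigenvalue 1)
    return vecs[:, 1:n_components + 1] * (vals[1:n_components + 1] ** t)
```

The new low-dimensional coordinates can then serve as the space in which a DPM is trained, as the abstract proposes.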
Procedia PDF Downloads 107
384 Digital Twin Smart Hospital: A Guide for Implementation and Improvements
Authors: Enido Fabiano de Ramos, Ieda Kanashiro Makiya, Francisco I. Giocondo Cesar
Abstract:
This study investigates the application of Digital Twins (DT) in Smart Hospital Environments (SHE) through a bibliometric study and literature review, including a comparison with the principles of Industry 4.0. It aims to analyze the current state of digital twin implementation in clinical and non-clinical healthcare operations, identifying trends and challenges and comparing these practices with Industry 4.0 concepts and technologies, in order to present a basic framework of implementation stages and maturity levels. The bibliometric methodology maps the existing scientific production on the theme, while the literature review synthesizes and critically analyzes the relevant studies, highlighting pertinent methodologies and results. The comparison with Industry 4.0 provides insights into how the principles of automation, interconnectivity, and digitalization can be applied in healthcare environments and operations, aiming at improvements in operational efficiency and quality of care. The results of this study contribute to a deeper understanding of the potential of Digital Twins in Smart Hospitals, and of the future potential of effectively integrating Industry 4.0 concepts in this specific environment, presented through the practical framework. The urgent need for the changes addressed in this article is undeniable, as is their contribution to human sustainability as framed by SDG3 (Health and well-being): ensuring that all citizens have a healthy life and well-being, at all ages and in all situations. The validity of these relationships will be constantly discussed, and technology can always change the rules of the game.
Keywords: digital twin, smart hospital, healthcare operations, industry 4.0, SDG3, technology
Procedia PDF Downloads 53
383 Using Open Source Data and GIS Techniques to Overcome Data Deficiency and Accuracy Issues in the Construction and Validation of Transportation Network: Case of Kinshasa City
Authors: Christian Kapuku, Seung-Young Kho
Abstract:
An accurate representation of the transportation system serving the region is one of the important aspects of transportation modeling. Such representation often requires developing an abstract model of the system elements, which in turn requires a substantial amount of data, surveys, and time. In some cases, however, such as in developing countries, data deficiencies and time and budget constraints do not allow such accurate representation, leaving room for assumptions that may negatively affect the quality of the analysis. With the emergence of Internet open source data, especially in mapping technologies, as well as advances in Geographic Information Systems (GIS), opportunities to tackle these issues have arisen. The objective of this paper is therefore to demonstrate such an application through the practical case of developing the transportation network for the city of Kinshasa. GIS geo-referencing was used to construct the digitized map of Transportation Analysis Zones from available scanned images. Centroids were then dynamically placed at the center of activities using an activity density map. Next, the road network with its characteristics was built using OpenStreetMap data and other official road inventory data by intersecting their layers and cleaning up unnecessary links such as residential streets. The accuracy of the final network was then checked by comparing it with satellite images from Google and Bing. For validation, the final network was exported into Emme3 to check for potential network coding issues. Results show high accuracy between the built network and satellite images, which can mostly be attributed to the use of open source data.
Keywords: geographic information system (GIS), network construction, transportation database, open source data
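The link-cleaning step can be illustrated with a hedged sketch; the edge records and excluded `highway` classes below are illustrative OpenStreetMap-style assumptions, not the paper's actual data or rules:

```python
# Each edge is a dict in OpenStreetMap style; the tag values are
# illustrative assumptions, not taken from the paper's workflow.
edges = [
    {"id": 1, "highway": "primary", "length_m": 1200},
    {"id": 2, "highway": "residential", "length_m": 300},
    {"id": 3, "highway": "secondary", "length_m": 800},
]

# Link classes assumed unnecessary for a zone-level transport model
EXCLUDED = {"residential", "service", "footway"}

def clean_network(edges, excluded=EXCLUDED):
    """Drop link classes not needed for a zone-level transport model."""
    return [e for e in edges if e["highway"] not in excluded]

model_links = clean_network(edges)  # keeps edges 1 and 3
```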
Procedia PDF Downloads 167
382 Evaluation and Possibilities of Valorization of Ecotourism Potentials in the Mbam and Djerem National Park
Authors: Rinyu Shei Mercy
Abstract:
Protected areas are potential areas for the development of ecotourism because of their biodiversity, landscapes, waterfalls, lakes, caves, salt licks, and the cultural heritage of local or indigenous people. These potentials have not yet been valorized, so this study investigates the evaluation and possibilities of valorization of ecotourism potentials in the Mbam and Djerem National Park. This was done by employing a combination of field observations, examination, data collection, and evaluation, using a SWOT analysis. The SWOT analysis determines the strengths, weaknesses, opportunities, and threats, and provides strategic suggestions for ecological planning. The study establishes an ecotouristic inventory and mapping of the park's ecotourism potentials, evaluates the degree of valorization of these potentials, and assesses the possibilities for valorization. The study found that the park has many natural potentials, such as rivers, salt licks, waterfalls and rapids, lakes, caves, and rocks. Regarding the degree of valorization of these ecotourism potentials, 50% of the population visit the salt lick of Pkayere because it is a biodiversity hotspot rich in mineral salt that attracts many animals, while the least visited is Lake Miyere, with 1%, because it is considered sacred. Moreover, the results indicate that these potentials can be valorized and put into use because of their attractive nature, for instance by creating good roads and bridges, good infrastructural facilities, and a good communication network. The study therefore recommends that MINTOUR, WCS, and tour operators interact sufficiently in order to develop these potentials for ecotourism, ecocultural tourism, and scientific tourism.
Keywords: ecotourism, Mbam and Djerem National Park, valorization of biodiversity, protected areas of Cameroon
Procedia PDF Downloads 137
381 Application of the Urban Forest Credit Standard as a Tool for Compensating CO2 Emissions in the Metalworking Industry: A Case Study in Brazil
Authors: Marie Madeleine Sarzi Inacio, Ligiane Carolina Leite Dauzacker, Rodrigo Henriques Lopes Da Silva
Abstract:
The climate changes resulting from human activity have increased interest in more sustainable production practices to reduce and offset pollutant emissions. Brazil, with its vast areas capable of carbon absorption, holds a significant advantage in this context. However, to optimize the country's sustainable potential, it is important to establish a robust carbon market with clear rules for the eligibility and validation of projects aimed at reducing and offsetting Greenhouse Gas (GHG) emissions. In this study, our objective is to conduct a feasibility analysis through a case study evaluating the implementation of an urban forest credit standard in Brazil, using the Urban Forest Credits (UFC) model implemented in the United States as a reference. The city of Ribeirão Preto, Brazil, was selected to assess the availability of green areas. With the CO2 emissions value from the metalworking industry, it was possible to analyze the case study information in the context of that activity. The QGIS software, which can connect to various types of geospatial databases, was used to map potential urban forest areas. Although the chosen municipality has little vegetative coverage, the mapping identified at least eight areas within the delimited urban perimeter that fit the standard's definitions. The outlook was positive: the implementation of projects like Urban Forest Credits (UFC) adapted to the Brazilian reality has great potential to benefit the country in the carbon market and contribute to achieving its Greenhouse Gas (GHG) emission reduction goals.
Keywords: carbon neutrality, metalworking industry, carbon credits, urban forestry credits
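The underlying offset arithmetic can be sketched as follows; the sequestration rate is a placeholder assumption for illustration, not a figure from the study or the UFC standard:

```python
# Illustrative sketch only: the sequestration rate below is an assumed
# placeholder, not a value from the study or the UFC standard.
SEQUESTRATION_T_CO2_PER_HA_YEAR = 5.0

def hectares_to_offset(annual_emissions_t_co2, rate=SEQUESTRATION_T_CO2_PER_HA_YEAR):
    """Urban forest area (ha) needed to offset annual CO2 emissions (t)."""
    return annual_emissions_t_co2 / rate

# e.g. a hypothetical metalworking plant emitting 1,000 t CO2/year
area_ha = hectares_to_offset(1000.0)  # 200.0 ha under the assumed rate
```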
Procedia PDF Downloads 76
380 Importance-Performance Analysis of Volunteer Tourism in Ethiopia: Host and Guest Case Study
Authors: Zita Fomukong Andam
Abstract:
This study's general objective is to evaluate the importance and performance attributes of volunteer tourism in Ethiopia; specifically, it intends to rank the importance of attributes, evaluate Ethiopia's competitive performance in hosting volunteer tourists, lay the attributes out in a four-quadrant grid, and conduct an IPA iso-priority line comparison of volunteer tourism in Ethiopia. From the hosts' and guests' points of view, a deeper research discourse was conducted with randomly selected 384 guests and 165 hosts in Ethiopia. Findings of the discourse, obtained through an exploratory research design covering both hosts and guests, confirm that the attributes of volunteer tourism generally and marginally fall in the south-east quadrant of the matrix, where their importance is relatively higher than their performance, also referred to as the 'Concentrate Here' quadrant. The fact that more items fall in this particular quadrant in both the host and guest studies, where they are highly important but their relative performance is low, sends a message that the country has more to do. Another focus of this study is mapping the attribute scores for guest and host importance and performance against the iso-priority line. Results of the iso-priority line analysis of the IPA of volunteer tourism in Ethiopia from the host's perspective showed that there are no attributes whose importance is exactly the same as their performance. Since this research design has many characteristics of an exploratory nature, these findings are not confirmed research output.
This paper refrains from prescribing anything to the applied world before further confirmatory research is conducted on the issue and instead calls on the scientific community to augment this study through comprehensive, exhaustive, extensive, and extended works of inquiry in order to produce a refined set of recommendations for the applied world.
Keywords: volunteer tourism, competitive performance, importance-performance analysis, Ethiopian tourism
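The four-quadrant classification used in IPA can be sketched as follows; placing the crosshairs at the grand means, and the attribute scores in the usage example, are illustrative assumptions rather than the study's data:

```python
def ipa_quadrants(attributes):
    """Classify attributes by mean importance/performance crosshairs.

    attributes: dict name -> (importance, performance) mean scores.
    High importance with low performance maps to 'Concentrate Here'.
    """
    imp_mid = sum(i for i, _ in attributes.values()) / len(attributes)
    perf_mid = sum(p for _, p in attributes.values()) / len(attributes)
    labels = {}
    for name, (imp, perf) in attributes.items():
        if imp >= imp_mid and perf < perf_mid:
            labels[name] = "Concentrate Here"
        elif imp >= imp_mid:
            labels[name] = "Keep Up the Good Work"
        elif perf < perf_mid:
            labels[name] = "Low Priority"
        else:
            labels[name] = "Possible Overkill"
    return labels

def above_iso_priority(attributes):
    """Attributes lying above the iso-priority line (importance > performance)."""
    return {n for n, (i, p) in attributes.items() if i > p}

# Hypothetical attribute scores for illustration
scores = {"A": (4.5, 2.0), "B": (4.0, 4.5), "C": (2.0, 2.5), "D": (2.5, 4.8)}
```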
Procedia PDF Downloads 233
379 Mapping a Data Governance Framework to the Continuum of Care in the Active Assisted Living Context
Authors: Gaya Bin Noon, Thoko Hanjahanja-Phiri, Laura Xavier Fadrique, Plinio Pelegrini Morita, Hélène Vaillancourt, Jennifer Teague, Tania Donovska
Abstract:
Active Assisted Living (AAL) refers to systems designed to improve the quality of life, aid in independence, and create healthier lifestyles for care recipients. As the population ages, there is a pressing need for non-intrusive, continuous, adaptable, and reliable health monitoring tools to support aging in place. AAL has great potential to support these efforts with the wide variety of solutions currently available, but insufficient efforts have been made to address concerns arising from the integration of AAL into care. The purpose of this research was to (1) explore the integration of AAL technologies and data into the clinical pathway, and (2) map data access and governance for AAL technology in order to develop standards for use by policy-makers, technology manufacturers, and developers of smart communities for seniors. This was done through four successive research phases: (1) literature search to explore existing work in this area and identify lessons learned; (2) modeling of the continuum of care; (3) adapting a framework for data governance into the AAL context; and (4) interviews with stakeholders to explore the applicability of previous work. Opportunities for standards found in these research phases included a need for greater consistency in language and technology requirements, better role definition regarding who can access and who is responsible for taking action based on the gathered data, and understanding of the privacy-utility tradeoff inherent in using AAL technologies in care settings.
Keywords: active assisted living, aging in place, internet of things, standards
Procedia PDF Downloads 131
378 Application of Remote Sensing for Monitoring the Impact of Lapindo Mud Sedimentation for Mangrove Ecosystem, Case Study in Sidoarjo, East Java
Authors: Akbar Cahyadhi Pratama Putra, Tantri Utami Widhaningtyas, M. Randy Aswin
Abstract:
Indonesia is an archipelagic nation with a very long coastline and large potential marine resources, one of which is its mangrove ecosystems. The Lapindo mudflow disaster in Sidoarjo, East Java, required the mudflow to be channeled into the sea through the Brantas and Porong rivers. The mud material transported by the river flow is a concern because it may contain harmful substances such as heavy metals. This study aims to map the mangrove ecosystem in terms of its density, to assess how large an impact the Lapindo mud disaster has had on the mangrove ecosystem, and to support efforts to keep the ecosystem sustainable. The coastal mangrove conditions of Sidoarjo were mapped using remote sensing products, namely Landsat 7 ETM+ images recorded in the dry months of 2002, 2006, 2009, and 2014. Mangrove density was detected using NDVI, which uses band 3 (the red channel) and band 4 (the near-IR channel). NDVI was produced using ENVI 5.1 software; the NDVI range used for detecting mangrove density is 0-1. Both the area and density of the mangrove ecosystems increased significantly from year to year. The growth of the mangrove ecosystems is affected by the deposition of Lapindo mud at the Porong and Brantas river estuaries, where the silt forms a growing medium suitable for mangroves, which are increasingly expanding. The increase in density was also supported by public awareness efforts, with mangrove planting carried out around the ponds to help contain the heavy metals in the material.
Keywords: archipelagic nation, mangrove, Lapindo mudflow disaster, NDVI
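The NDVI computation described above (band 3 as red, band 4 as near-IR for Landsat 7 ETM+) can be sketched as a simple band-arithmetic function; the small epsilon guard is an implementation convenience, not part of the NDVI definition:

```python
import numpy as np

def ndvi(red, nir):
    """NDVI = (NIR - Red) / (NIR + Red); values near 1 indicate dense vegetation."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    # Epsilon avoids division by zero over water or no-data pixels
    return (nir - red) / (nir + red + 1e-12)
```

For Landsat 7 ETM+, `red` would be the band 3 raster and `nir` the band 4 raster; thresholding the positive (0-1) part of the result then isolates vegetated pixels such as mangrove stands.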
Procedia PDF Downloads 438
377 Text Analysis to Support Structuring and Modelling a Public Policy Problem: Outline of an Algorithm to Extract Inferences from Textual Data
Authors: Claudia Ehrentraut, Osama Ibrahim, Hercules Dalianis
Abstract:
Policy making situations are real-world problems that exhibit complexity in that they are composed of many interrelated problems and issues. To be effective, policies must holistically address the complexity of the situation rather than propose solutions to single problems. Formulating and understanding the situation and its complex dynamics, therefore, is a key to finding holistic solutions. Analysis of text-based information on the policy problem, using Natural Language Processing (NLP) and text analysis techniques, can support modelling of public policy problem situations in a more objective way based on domain experts' knowledge and scientific evidence. The objective behind this study is to support modelling of public policy problem situations, using text analysis of verbal descriptions of the problem. We propose a formal methodology for analysis of qualitative data from multiple information sources on a policy problem to construct a causal diagram of the problem. The analysis process aims at identifying key variables, linking them by cause-effect relationships, and mapping that structure into a graphical representation that is adequate for designing action alternatives, i.e., policy options. This study describes the outline of an algorithm used to automate the initial step of a larger methodological approach, which is so far done manually. In this initial step, inferences about key variables and their interrelationships are extracted from textual data to support better problem structuring. A small prototype for this step is also presented.
Keywords: public policy, problem structuring, qualitative analysis, natural language processing, algorithm, inference extraction
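One plausible shape for the inference-extraction step — a hedged sketch using simple regular-expression patterns over causal cue phrases, not the authors' actual algorithm — is:

```python
import re

# Cue-phrase patterns for cause-effect pairs; the phrase list is an
# illustrative assumption, not the paper's pattern inventory.
CAUSAL_PATTERNS = [
    r"(?P<cause>[\w\s]+?)\s+(?:leads to|causes|results in|increases|reduces)\s+(?P<effect>[\w\s]+)",
]

def extract_inferences(text):
    """Return (cause, effect) pairs found in the text, sentence by sentence."""
    pairs = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        for pattern in CAUSAL_PATTERNS:
            match = re.search(pattern, sentence, flags=re.IGNORECASE)
            if match:
                pairs.append((match.group("cause").strip(),
                              match.group("effect").strip()))
    return pairs
```

Each extracted pair would become a directed edge in the causal diagram the methodology constructs.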
Procedia PDF Downloads 589
376 Beyond Informality: Relocation from a Traditional Village 'Mit Oqbah' to Masaken El-Barageel and the Role of ‘Urf in Governing Built Environment, Egypt
Authors: Sarah Eldefrawi, Maike Didero
Abstract:
In Egypt, residents' urban interventions (colloquially named A'hali's interventions) are always tackled by government, scholars, and media as encroachment (taeadiyat), chaotic (a'shwa'i), or informal (gheir mokanan) practices. This paper argues that those interventions cannot simply be described as encroachment on public space or chaotic behaviour. We claim here that they are rooted in traditional governing methods ('Urf) that governed Arab cities for many decades. Through an in-depth field study conducted in a real estate public housing project in the city of Giza called 'Masaken El-Barageel', we traced the urban transformations demonstrated in private and public spaces. To understand those transformations, we used a wide range of qualitative research methods such as semi-guided and informal interviews, observations, and mapping of the built environment and the newly added interventions. This study was further strengthened by the author's contributions in studying nine sectors developed by A'hali in six districts of Greater Cairo. The results of this study indicate that a culturally and socially sensitive framework has to relate individual actions toward the spatial and social structures to the culturally transmitted views and meanings connected with 'Urf. The study could trace three crucial principles in 'Urf that influenced these interventions: the elimination of harm (Al-Marafiq wa Man' al-Darar), the appropriation of space (Haqq el-Intefa'), and the public interest (maslaha a'ma). Our findings open the discussion on the (il)legitimacy of A'hali governing methods in contemporary cities.
Keywords: 'Urf, urban governance, public space, public housing, encroachments, chaotic, Egyptian cities
Procedia PDF Downloads 134
375 A Global Perspective on Neuropsychology: The Multicultural Neuropsychological Scale
Authors: Tünde Tifordiána Simonyi, Tímea Harmath-Tánczos
Abstract:
The primary aim of the current research is to present the significance of a multicultural perspective in clinical neuropsychology and to present the test battery of the Multicultural Neuropsychological Scale (MUNS). The method includes the MUNS screening tool, which involves stimuli common to most cultures in the world. The test battery measures general cognitive functioning, focusing on five cognitive domains (memory, executive function, language, visual construction, and attention) tested with seven subtests, and can be utilized within a wide age range (15-89) and with participants of lower and higher education levels. It is a scale that is sensitive to mild cognitive impairments. Our study presents the first results with the Hungarian translation of the MUNS on a healthy sample. The education range was 4-25 years of schooling. The Hungarian sample was recruited by snowball sampling. Within the investigated population (N=151), cognitive performance follows an inverted U-shaped curve with age, with a high load on memory. Age, reading fluency, and years of education significantly influenced test scores. The sample was tested twice within a 14-49 day interval to determine test-retest reliability, which proved satisfactory. Besides the findings of the study and the introduction of the test battery, the article also highlights its potential benefits for both research and clinical neuropsychological practice. The importance of adapting, validating, and standardizing the test in languages other than Hungarian is also stressed. This test battery could serve as a helpful tool in mapping general cognitive functions in psychiatric and neurological disorders regardless of the cultural background of the patients.
Keywords: general cognitive functioning, multicultural, MUNS, neuropsychological test battery
Procedia PDF Downloads 109
374 A Literature Review on the Effect of Financial Knowledge toward Corporate Growth: The Important Role of Financial Risk Attitude
Authors: Risna Wijayanti, Sumiati, Hanif Iswari
Abstract:
This study aims to analyze the role of financial risk attitude as a mediator between financial knowledge and business growth. The ability of human resources to manage capital (financial literacy) can be a major milestone for a company's business to grow and build its competitive advantage. This study analyzed the important role of financial risk attitude in bringing financial knowledge to bear on corporate growth. There have been many discussions arguing that financial knowledge is one of the main abilities of corporate managers in determining the success of managing a company. However, other scholars have argued to the contrary that financial knowledge does not have a significant influence on corporate growth. This study used a literature review to analyze whether another variable can mediate the effect of financial knowledge on corporate growth. Research mapping was conducted to analyze the concept of risk tolerance. This concept relates to people's risk-aversion effects when making decisions under risk and to the role of financial knowledge in changes in financial income. Understanding and managing risks and investments is complicated, in particular for corporate managers, who are always expected to maintain their corporate growth. Substantial financial knowledge is needed to identify and obtain accurate information for corporate financial decision-making. By reviewing the literature, this study hypothesized that the financial knowledge of corporate managers would be meaningless without the managers' courage to bear risks in taking favorable business opportunities. Therefore, the level of risk aversion of corporate managers will determine corporate action, which is a reflection of corporate-level investment behavior leading to corporate success or failure in achieving the company's expected growth rate.
Keywords: financial knowledge, financial risk attitude, corporate growth, risk tolerance
Procedia PDF Downloads 129
373 Analysis and Mapping of Climate and Spring Yield in Tanahun District, Nepal
Authors: Resham Lal Phuldel
Abstract:
This study is based on a bilateral development cooperation project funded by the governments of Nepal and Finland. The first phase of the project was completed in August 2012; phase II started in September 2013 and will end in September 2018. The project strengthens the capacity of local governments in 14 districts to deliver services in water supply, sanitation, and hygiene in the Western and Mid-Western development regions of Nepal. In recent years, several spring sources across the country have dried out or have slowly decreased in yield due to the changing character of rainfall, increasing evaporative losses, and man-made causes such as land use change and infrastructure development work. To sustain the hilly communities, the sources have to be able to provide sufficient water to serve the population, either on their own or in conjunction with other sources. All water sources in Tanahun district were measured in 2004 and located with GPS. Phase II repeated the exercise to see changes in the district: 3,320 water sources were identified in 2004, and altogether 4,223 sources, including new ones, were identified and measured in 2014. Between 2004 and 2014, a 50% reduction in the average yield (flow rate) of point sources over 10 years was found. Similarly, reductions of 21.6% and 34% in average yield were found for spring and stream water sources, respectively. The rainfall records from 2002 to 2013 show erratic rainfall in the district: the monsoon peak month is not consistent, and the trend shows a decrease in annual rainfall of 16.7 mm/year. Further, the temperature trend between 2002 and 2013 shows warming of +0.041°C/year.
Keywords: climate change, rainfall, source discharge, water sources
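Trend figures like the ones quoted above can be obtained in principle with an ordinary least-squares fit over the annual series; the rainfall series below is synthetic, generated around an assumed baseline, since the project's raw data are not reproduced here:

```python
import numpy as np

years = np.arange(2002, 2014)  # 2002-2013 inclusive

# Synthetic annual rainfall totals (mm) around an assumed 2,200 mm baseline,
# with a built-in -16.7 mm/year trend plus noise; NOT the project's data.
rng = np.random.default_rng(0)
rainfall = 2200.0 - 16.7 * (years - 2002) + rng.normal(0, 30, years.size)

# Least-squares linear trend: slope is the change per year
slope, intercept = np.polyfit(years, rainfall, 1)
```

The same fit applied to the annual mean temperature series would yield the warming rate per year.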
Procedia PDF Downloads 282
372 Long Term Examination of the Profitability Estimation Focused on Benefits
Authors: Stephan Printz, Kristina Lahl, René Vossen, Sabina Jeschke
Abstract:
Strategic investment decisions are characterized by high innovation potential and long-term effects on the competitiveness of enterprises. Due to the uncertainty and risks involved in this complex decision-making process, the need arises for well-structured support activities. A method that considers both cost and long-term added value is cost-benefit effectiveness estimation. One such method is the "profitability estimation focused on benefits" (PEFB) method developed at the Institute of Management Cybernetics at RWTH Aachen University. The method copes with the challenges associated with strategic investment decisions by integrating long-term non-monetary aspects while also mapping the chronological sequence of an investment within the organization's target system. Thus, this method is a holistic approach to the evaluation of the costs and benefits of an investment. This participation-oriented method was applied to business environments in many workshops. The results of the workshops are a library of more than 96 cost aspects and 122 benefit aspects. These aspects are preprocessed and comparatively analyzed with regard to their alignment with a series of risk levels. For the first time, an accumulation and a distribution of cost and benefit aspects regarding their impact and probability of occurrence are given. The results give evidence that the PEFB method combines precise measures of financial accounting with the incorporation of benefits. Finally, the results constitute the basis for using information technology and data science for decision support within the PEFB method.
Keywords: cost-benefit analysis, multi-criteria decision, profitability estimation focused on benefits, risk and uncertainty analysis
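The accumulation of aspects by impact and probability of occurrence can be sketched as an expected-value calculation; the aspect names and figures below are illustrative assumptions, not entries from the PEFB library:

```python
# Illustrative aspect records; names and figures are assumed for the sketch,
# not taken from the PEFB library of cost and benefit aspects.
aspects = [
    {"name": "reduced downtime", "kind": "benefit", "impact": 80000, "probability": 0.7},
    {"name": "training effort", "kind": "cost", "impact": 20000, "probability": 0.9},
    {"name": "license fees", "kind": "cost", "impact": 15000, "probability": 1.0},
]

def expected_net_value(aspects):
    """Accumulate impact x probability, counting cost aspects as negative."""
    total = 0.0
    for a in aspects:
        sign = 1.0 if a["kind"] == "benefit" else -1.0
        total += sign * a["impact"] * a["probability"]
    return total

# 80000*0.7 - 20000*0.9 - 15000*1.0 = 23000.0
```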
Procedia PDF Downloads 445
371 Determination of Optimum Parameters for Thermal Stress Distribution in Composite Plate Containing a Triangular Cutout by Optimization Method
Authors: Mohammad Hossein Bayati Chaleshtari, Hadi Khoramishad
Abstract:
Minimizing the stress concentration around a triangular cutout in infinite perforated plates subjected to a uniform heat flux, which induces thermal stresses, is an important consideration in engineering design. Furthermore, understanding the parameters that affect stress concentration, and selecting them properly, enables the designer to achieve a reliable design. In thermal stress analysis, the parameters affecting the stress distribution around a cutout in orthotropic materials include the fiber angle, flux angle, and the bluntness and rotation angle of the cutout. This paper examines the effect of these parameters on the thermal stress analysis of infinite perforated plates with a central triangular cutout. The least thermal stress around the triangular cutout was sought using a novel swarm intelligence optimization technique called the dragonfly optimizer, inspired by the lifestyle and hunting behavior of dragonflies in nature. In this study, using two-dimensional thermoelastic theory and based on Lekhnitskii's complex variable technique, the stress analysis of an orthotropic infinite plate with a circular cutout under a uniform heat flux was extended to a plate containing a quasi-triangular cutout in the thermal steady-state condition. To achieve this goal, a conformal mapping function was used to map the infinite plate containing a quasi-triangular cutout onto the outside of a unit circle. The plate is under uniform heat flux at infinity, with Neumann boundary conditions and a thermally insulated condition at the edge of the cutout.
Keywords: infinite perforated plate, complex variable method, thermal stress, optimization method
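The optimization loop can be roughly illustrated as follows, using plain random search as a stand-in for the dragonfly optimizer and a toy surrogate objective instead of the paper's Lekhnitskii-based stress solution; every function and bound here is an assumption for illustration:

```python
import math
import random

def surrogate_stress(fiber_angle, flux_angle, bluntness, rotation):
    """Toy surrogate objective: penalizes sharp corners and misaligned angles.

    Illustrative only; the real objective would be the stress concentration
    factor from the complex-variable thermoelastic solution.
    """
    return (1.0 / max(bluntness, 1e-3)
            + abs(math.sin(math.radians(fiber_angle - flux_angle)))
            + 0.1 * abs(math.sin(math.radians(rotation))))

def random_search(objective, n_iter=2000, seed=42):
    """Sample the design space (fiber/flux angle, bluntness, rotation) at random."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_iter):
        x = (rng.uniform(0, 90),      # fiber angle (deg)
             rng.uniform(0, 90),      # flux angle (deg)
             rng.uniform(0.05, 0.5),  # bluntness parameter
             rng.uniform(0, 60))      # cutout rotation (deg)
        f = objective(*x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f
```

A population-based method such as the dragonfly optimizer would replace the independent random samples with interacting search agents, but the evaluate-and-keep-best structure is the same.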
Procedia PDF Downloads 147370 Evaluation of Railway Network and Service Performance Based on Transportation Sustainability in DKI Jakarta
Authors: Nur Bella Octoria Bella, Ayomi Dita Rarasati
Abstract:
DKI Jakarta is Indonesia's capital city, with the 10th highest congestion rate in the world according to the 2019 traffic index. In addition, the 2019 World Air Quality Report showed an air pollutant concentration of 49.4 µg/m³ in DKI Jakarta, the 5th highest in the world. In urban cities today the mobility rate is high, and efficiency in the sustainability assessment of transport infrastructure development is needed; this efficiency is the key to sustainable infrastructure development. DKI Jakarta is currently constructing railway infrastructure to support its transportation system. The open question is whether the railway networks and services in DKI Jakarta have been planned on the basis of sustainability factors. Therefore, the aim of this research is to evaluate the performance of railway infrastructure networks and services in DKI Jakarta with regard to key railway sustainability factors. This evaluation will then be used to build a railway sustainability assessment framework and to offer alternative solutions for improving railway transportation sustainability in DKI Jakarta. First, a detailed literature review was conducted of papers focusing on railway sustainability factors and improvements to railway sustainability, published in scientific journals between 2011 and 2021. The sustainability factors drawn from the literature review are then used to assess the current condition of railway infrastructure in DKI Jakarta. The evaluation uses a Likert-scale questionnaire directed at railway transportation experts and passengers. Furthermore, the mapping and evaluation ratings based on the sustainability factors are compared to the effect factors using the Analytic Hierarchy Process (AHP).
This research offers the impact of the networks' performance and service rate on the sustainability aspect and on passengers' willingness to use rail public transportation in DKI Jakarta.Keywords: transportation sustainability, railway transportation, sustainability, DKI Jakarta
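The AHP weighting step can be sketched as follows: expert judgments on Saaty's 1-9 scale form a pairwise comparison matrix, whose principal eigenvector gives the priority weights, checked by a consistency ratio. The three factors and the judgments below are illustrative assumptions, not the study's survey data.

```python
import numpy as np

# Pairwise comparison matrix (Saaty scale) for three hypothetical
# sustainability factors, e.g. safety, emissions, cost.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                       # priority vector, sums to 1

n = A.shape[0]
lam_max = eigvals[k].real
ci = (lam_max - n) / (n - 1)       # consistency index
cr = ci / 0.58                     # random index RI = 0.58 for n = 3
```

A consistency ratio below 0.1 is conventionally taken to mean the judgments are acceptably consistent.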
Procedia PDF Downloads 163369 Using Visualization Techniques to Support Common Clinical Tasks in Clinical Documentation
Authors: Jonah Kenei, Elisha Opiyo
Abstract:
Electronic health records, as repositories of patient information, are nowadays the most commonly used technology to record, store and review patient clinical records and perform other clinical tasks. However, the accurate identification and retrieval of relevant information from clinical records is difficult because clinical documents are unstructured and lack a clear organization. Medical practice therefore faces a challenge from the rapid growth of health information in electronic health records (EHRs), mostly in narrative text form, and it is becoming important to effectively manage the growing amount of data for a single patient. There is consequently a requirement to visualize EHRs in a way that aids physicians in clinical tasks and medical decision-making. Applying text visualization techniques to unstructured clinical narratives is a new area of research that aims to provide better information extraction and retrieval for clinical decision support in scenarios where the volume of generated data continues to grow. Clinical datasets in EHRs offer great potential for training accurate statistical models to classify facets of information, which can then be used to improve patient care and outcomes; however, the unstructured nature of clinical texts is a common problem in many clinical note datasets. This paper examines the very issue of taking raw clinical texts and mapping them into meaningful structures that can support healthcare professionals working with narrative texts. Our work is the result of a collaborative design process that was aided by empirical data collected through formal usability testing.Keywords: classification, electronic health records, narrative texts, visualization
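A minimal sketch of mapping raw clinical text into a meaningful structure: split a narrative note on its ALL-CAPS section headers into a dictionary that a visualization layer could render. The note and the header convention are invented for illustration; real notes vary widely in format.

```python
import re

note = """CHIEF COMPLAINT: chest pain for two days.
MEDICATIONS: aspirin 81 mg daily.
ASSESSMENT: likely musculoskeletal pain."""

def sectionize(text):
    # Splitting on a capturing group yields ['', header1, body1, header2, body2, ...],
    # which pairs up into a {section: body} mapping.
    parts = re.split(r"^([A-Z][A-Z ]+):", text, flags=re.M)
    return {h.strip(): b.strip() for h, b in zip(parts[1::2], parts[2::2])}

sections = sectionize(note)
```

In practice this rule-based step is only a first pass; statistical classifiers then label the facets of information within each section.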
Procedia PDF Downloads 118368 The Design Method of Artificial Intelligence Learning Picture: A Case Study of DCAI's New Teaching
Authors: Weichen Chang
Abstract:
To create a guided teaching method for AI generative drawing design, this paper develops a set of teaching models for AI generative drawing (DCAI) that combines learning modes such as problem-solving, thematic inquiry, phenomenon-based, task-oriented, and DFC learning. Through a guided information-security AI picture book program and its content, participatory action research (PAR) and interviews were applied to explore how the dual knowledge of Context and ChatGPT (DCAI) guides the development of students' AI learning skills. In the interviews, the students highlighted five main learning outcomes (self-study, critical thinking, knowledge generation, cognitive development, and presentation of work) as well as the challenges of implementing the model. Through the use of DCAI, students enhance their shared awareness of generative drawing analysis and group cooperation, and they gain knowledge that can enhance AI capabilities in DCAI inquiry and in future life. This paper finds that (1) good use of DCAI can assist students in exploring the value of their knowledge through the power of stories and in finding the meaning of knowledge communication; (2) analyzing the integrity and coherence of a story through its context achieves the tension of 'starting and ending'; and (3) ChatGPT can be used to extract inspiration, arrange story compositions, and craft prompts that communicate with people and convey emotions. New methods of knowledge construction will therefore be among the effective approaches to AI learning in the face of artificial intelligence, providing new thinking and new expression for interdisciplinary design and design-education practice.Keywords: artificial intelligence, task-oriented, contextualization, design education
Procedia PDF Downloads 29367 EcoMush: Mapping Sustainable Mushroom Production in Bangladesh
Authors: A. A. Sadia, A. Emdad, E. Hossain
Abstract:
The increasing importance of mushrooms as a source of nutrition, health benefits, and even potential cancer treatment has raised awareness of the impact of climate-sensitive variables on their cultivation. Factors like temperature, relative humidity, air quality, and substrate composition play pivotal roles in shaping mushroom growth, especially in Bangladesh. Oyster mushrooms, a commonly cultivated variety in this region, are particularly vulnerable to climate fluctuations. This research explores the climatic dynamics affecting oyster mushroom cultivation, presents an approach to address these challenges, and provides tangible solutions to fortify the agro-economy, ensure food security, and promote the sustainability of this crucial food source. Using climate and production data, this study evaluates the performance of three clustering algorithms (KMeans, OPTICS, and BIRCH) based on various quality metrics. While each algorithm demonstrates specific strengths, the findings provide insights into their effectiveness for this specific dataset. The results yield essential information, pinpointing the optimal temperature range of 13°C-22°C, the unfavorable temperature threshold of 28°C and above, and the ideal relative humidity range of 75-85%, along with the suitable production regions in three different seasons: Kharif-1, Kharif-2, and Robi. Additionally, a user-friendly web application is developed to support mushroom farmers in making well-informed decisions about their cultivation practices. This platform offers valuable insights into the most advantageous periods for oyster mushroom farming, with the overarching goal of enhancing the efficiency and profitability of mushroom farming.Keywords: climate variability, mushroom cultivation, clustering techniques, food security, sustainability, web application
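The three algorithms compared in the study can be run side by side with scikit-learn. The data below are synthetic stand-ins for the climate/production table (two regimes around the favourable and unfavourable temperature/humidity ranges the abstract reports), and the silhouette score is just one of the several quality metrics the study uses.

```python
import numpy as np
from sklearn.cluster import KMeans, OPTICS, Birch
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Columns: temperature (°C), relative humidity (%).
X = np.vstack([
    rng.normal([17, 80], [2, 3], (50, 2)),   # favourable: 13-22 °C, 75-85 % RH
    rng.normal([29, 60], [2, 3], (50, 2)),   # unfavourable: >= 28 °C
])

models = {
    "KMeans": KMeans(n_clusters=2, n_init=10, random_state=0),
    "OPTICS": OPTICS(min_samples=10),
    "BIRCH": Birch(n_clusters=2),
}
scores = {}
for name, model in models.items():
    labels = model.fit_predict(X)
    mask = labels != -1                      # OPTICS marks noise points as -1
    if len(set(labels[mask])) > 1:           # silhouette needs >= 2 clusters
        scores[name] = silhouette_score(X[mask], labels[mask])
```

On well-separated regimes like these, all three algorithms should recover the two seasons; differences emerge on noisier, overlapping real data.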
Procedia PDF Downloads 68366 Roof and Road Network Detection through Object Oriented SVM Approach Using Low Density LiDAR and Optical Imagery in Misamis Oriental, Philippines
Authors: Jigg L. Pelayo, Ricardo G. Villar, Einstine M. Opiso
Abstract:
Advances in aerial laser scanning in the Philippines have opened up entire fields of research in remote sensing and machine vision that aspire to provide accurate, timely information for the government and the public. Rapid mapping of polygonal road and roof boundaries is one such use, with applications in disaster risk reduction, mitigation and development. The study uses low density LiDAR data and high resolution aerial imagery in an object-oriented approach, applying machine learning to the data analysis in order to minimize the constraints of feature extraction. Since separating one class from another requires locating distinct regions of a multi-dimensional feature space, non-trivial computation for fitting distributions was implemented to formulate the learned ideal hyperplane. Customized hybrid features were generated and used to improve the classifier results. Supplemental algorithms for filtering and reshaping object features were developed in the rule set to enhance the final product. The methodology offers several advantages in terms of simplicity, applicability, and process transferability. The algorithm was tested at random locations across Misamis Oriental province in the Philippines, demonstrating robust performance with overall accuracy greater than 89% and potential for semi-automation. The extracted results will become a vital requirement for decision makers, urban planners and even the commercial sector in various assessment processes.Keywords: feature extraction, machine learning, OBIA, remote sensing
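The hyperplane-fitting step described above can be sketched with a support vector machine on per-segment hybrid features. The feature set (mean height above ground, segment elongation, mean NDVI) and the synthetic values are illustrative assumptions, not the study's actual LiDAR/imagery features.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Hypothetical hybrid features per segment:
# [mean height above ground (m), elongation, mean NDVI]
roofs = rng.normal([6.0, 1.5, 0.05], [1.5, 0.4, 0.05], (200, 3))
roads = rng.normal([0.2, 8.0, 0.10], [0.2, 2.0, 0.05], (200, 3))
X = np.vstack([roofs, roads])
y = np.array([0] * 200 + [1] * 200)          # 0 = roof, 1 = road

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25,
                                      random_state=1, stratify=y)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(Xtr, ytr)
acc = clf.score(Xte, yte)
```

With features this separable the accuracy is near perfect; the study's reported >89% reflects the harder, real feature space.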
Procedia PDF Downloads 361365 Employing Visual Culture to Enhance Initial Adult Maltese Language Acquisition
Authors: Jacqueline Żammit
Abstract:
Recent research indicates that the utilization of right-brain strategies holds significant implications for the acquisition of language skills. Nevertheless, the utilization of visual culture as a means to stimulate these strategies and amplify language retention among adults engaging in second language (L2) learning remains a relatively unexplored area. This investigation delves into the impact of visual culture on activating right-brain processes during the initial stages of language acquisition, particularly in the context of teaching Maltese as a second language (ML2) to adult learners. By employing a qualitative research approach, this study convenes a focus group comprising twenty-seven educators to delve into a range of visual culture techniques integrated within language instruction. The collected data is subjected to thematic analysis using NVivo software. The findings underscore a variety of impactful visual culture techniques, encompassing activities such as drawing, sketching, interactive matching games, orthographic mapping, memory palace strategies, wordless picture books, picture-centered learning methodologies, infographics, Face Memory Game, Spot the Difference, Word Search Puzzles, the Hidden Object Game, educational videos, the Shadow Matching technique, Find the Differences exercises, and color-coded methodologies. These identified techniques hold potential for application within ML2 classes for adult learners. Consequently, this study not only provides insights into optimizing language learning through specific visual culture strategies but also furnishes practical recommendations for enhancing language competencies and skills.Keywords: visual culture, right-brain strategies, second language acquisition, maltese as a second language, visual aids, language-based activities
Procedia PDF Downloads 61364 Recession Rate of Gangotri and Its Tributary Glacier, Garhwal Himalaya, India through Kinematic GPS Survey and Satellite Data
Authors: Harish Bisht, Bahadur Singh Kotlia, Kireet Kumar
Abstract:
In order to reconstruct past retreat rates, the total area loss, volume change and shift in snout position were measured using multi-temporal satellite data from 1989 to 2016 and a kinematic GPS survey from 2015 to 2016. The satellite data indicate that in the last 27 years the Chaturangi glacier snout has retreated 1172.57 ± 38.3 m (average 45.07 ± 4.31 m/year), with a total area loss of 0.626 ± 0.001 sq. km and a volume loss of 0.139 km³. Field measurements with a differential global positioning system survey revealed an annual retreat rate of 22.84 ± 0.05 m/year. The large variation between the results of the two methods is probably due to the difference in their accuracy. Snout monitoring of the Gangotri glacier during the ablation seasons (May to September) of 2005 and 2015 reveals that the retreat rate has declined compared with that reported in earlier studies; the GPS dataset shows an average recession rate of 10.26 ± 0.05 m/year. In order to determine the possible causes of the decreased retreat rate, a relationship between debris thickness and melt rate was also established using ablation stakes. The present study concludes that the remote sensing method is suitable for large-area and long-term study, while kinematic GPS is more appropriate for annual monitoring of the retreat rate of a glacier snout. The study also emphasizes the mapping of all tributary glaciers in order to assess the overall changes in the main glacier system and its health.Keywords: Chaturangi glacier, Gangotri glacier, glacier snout, kinematic global positioning system, retreat rate
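The snout retreat rate from two kinematic GPS fixes reduces to planimetric distance over elapsed time. The easting/northing pairs below are invented for illustration (chosen so the result lands near the reported 22.84 m/year), not the surveyed coordinates.

```python
import math

def retreat_rate(p_start, p_end, years):
    """Mean snout retreat rate (m/year) from two planimetric UTM fixes."""
    dx = p_end[0] - p_start[0]
    dy = p_end[1] - p_start[1]
    return math.hypot(dx, dy) / years

# Hypothetical UTM easting/northing of the snout, one year apart:
rate = retreat_rate((315000.0, 3424500.0), (315020.0, 3424511.0), 1.0)
```

The same function applied to satellite-derived snout positions over the full 1989-2016 window gives the long-term average rate.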
Procedia PDF Downloads 145363 Application of Space Technology at Cadestral Level and Land Resources Management with Special Reference to Bhoomi Sena Project of Uttar Pradesh, India
Authors: A. K. Srivastava, Sandeep K. Singh, A. K. Kulshetra
Abstract:
Agriculture is the backbone of developing countries of the Asian subcontinent, such as India. Uttar Pradesh is the most populous and fifth largest state of India. The total population of the state is 19.95 crore, 16.49% of the country's population and more than that of many countries of the world, yet Uttar Pradesh occupies only 7.36% of the total area of India. It is a well-established fact that agriculture has virtually been the lifeline of the state's economy for a long time, and its predominance is likely to continue for a fairly long time in the future. The total geographical area of the state is 242.01 lakh hectares, out of which 120.44 lakh hectares face various land degradation problems. This area needs to be put under conservation and reclamation measures at a much faster pace in order to enhance agricultural productivity in the state. Keeping this scenario in view, the Department of Agriculture, Government of Uttar Pradesh, has formulated a multi-purpose project, namely Bhoomi Sena, for the entire state. The main objective of the project is to remedy land degradation using low-cost technology available at the village level. The total outlay of the project is Rs. 39,643.75 lakh for an area of about 226,000 ha included in the 12th Five Year Plan (2012-13 to 2016-17), with an expected total of 310.60 lakh man-days. An attempt has been made to use space technologies such as remote sensing and geographical information systems at the cadastral level for the overall management of the agricultural engineering work required for the treatment of land degradation. After integration of thematic maps, a proposed action plan map has been prepared for future work.Keywords: GPS, GIS, remote sensing, topographic survey, cadastral mapping
Procedia PDF Downloads 309362 A Methodology to Integrate Data in the Company Based on the Semantic Standard in the Context of Industry 4.0
Authors: Chang Qin, Daham Mustafa, Abderrahmane Khiat, Pierre Bienert, Paulo Zanini
Abstract:
Nowadays, companies face many challenges in the process of digital transformation, which can be a complex and costly undertaking. Digital transformation involves the collection and analysis of large amounts of data, which creates challenges around data management and governance; companies are also challenged to integrate data from multiple systems and technologies. Despite these pains, companies still pursue digitalization because, by embracing advanced technologies, they can improve efficiency, quality, decision-making, and customer experience while also creating new business models and revenue streams. This paper focuses on the issue of data stored in silos with different schemas and structures. Conventional approaches to this issue rely on data warehousing, data integration tools, data standardization, and business intelligence tools. However, these approaches primarily address the grammar and structure of the data and neglect the importance of semantic modeling and semantic standardization, which are essential for achieving data interoperability. Here, the challenge of data silos in Industry 4.0 is addressed by developing a semantic modeling approach compliant with Asset Administration Shell (AAS) models, an efficient standard for communication in Industry 4.0. The paper highlights how this approach facilitates the data mapping process and semantic lifting according to existing industry standards such as ECLASS and other industrial dictionaries. It also incorporates Asset Administration Shell technology to model and map the company's data and uses a knowledge graph for data storage and exploration.Keywords: data interoperability in industry 4.0, digital integration, industrial dictionary, semantic modeling
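A minimal sketch of the semantic-lifting step: a row from a siloed legacy table is wrapped into an AAS-style submodel whose properties carry semantic IDs pointing into an industrial dictionary. The structure is heavily simplified and the ECLASS IRDI is a placeholder, not a real dictionary entry.

```python
import json

# Hypothetical row from a legacy machine database (a data silo):
legacy_row = {"motor_id": "M-117", "rated_power_kw": 7.5}

# AAS-style submodel: each element is annotated with a semanticId so that
# consumers can interpret the value independently of the source schema.
submodel = {
    "idShort": "TechnicalData",
    "submodelElements": [
        {
            "idShort": "RatedPower",
            "semanticId": "0173-1#02-XXXXXXX#001",  # placeholder ECLASS IRDI
            "value": legacy_row["rated_power_kw"],
            "unit": "kW",
        }
    ],
}

payload = json.dumps(submodel)  # ready to load into an AAS server or knowledge graph
```

The same mapping, applied per table and per column, is what makes the silos interoperable: consumers match on the semantic ID rather than on column names.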
Procedia PDF Downloads 94361 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record
Authors: Raghavi C. Janaswamy
Abstract:
In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structural restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions are predicted as a node classification task using graph-based open-source EHR data from the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because of its close representation of real-world data and its volume. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch-Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually performing the node classification using those embeddings. The model predicts patient conditions ranging from common to rare. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.Keywords: electronic health record, graph neural network, heterogeneous data, prediction
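The core GNN operation can be sketched in plain numpy for a toy four-node patient/condition graph: one graph-convolution step H = ReLU(D⁻¹ᐟ² (A+I) D⁻¹ᐟ² X W). In the study this runs in PyTorch-Geometric over the Synthea graph; the adjacency, features, and weights below are toy stand-ins.

```python
import numpy as np

# Toy adjacency: a chain of patient/condition nodes.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                              # add self-loops
d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
P = d_inv_sqrt @ A_hat @ d_inv_sqrt                # symmetric normalization

X = np.eye(4)                                      # one-hot node features
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 2))                        # learnable weights (random here)
H = np.maximum(P @ X @ W, 0.0)                     # ReLU; the node embeddings
```

Stacking such layers (and training W) is what PyG's `GCNConv` does; the embeddings H then feed the node classifier.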
Procedia PDF Downloads 86360 MRCP as a Pre-Operative Tool for Predicting Variant Biliary Anatomy in Living Related Liver Donors
Authors: Awais Ahmed, Atif Rana, Haseeb Zia, Maham Jahangir, Rashed Nazir, Faisal Dar
Abstract:
Purpose: Biliary complications represent the most common cause of morbidity in living related liver donor transplantation, and detailed preoperative evaluation of biliary anatomic variants is crucial for safe patient selection and improved surgical outcomes. The purpose of this study is to determine the accuracy of preoperative MRCP in predicting biliary variations when compared to intraoperative cholangiography (IOC) in living related liver donors. Materials and Methods: From 44 potential donors, 40 consecutive living related liver donors (13 females and 28 males) underwent donor hepatectomy at our centre from April 2012 to August 2013. The MRCP and IOC of all patients were retrospectively reviewed separately by two radiologists and a transplant surgeon. MRCP was performed on 1.5 Tesla MR magnets using a breath-hold, heavily T2-weighted radial slab technique. One patient was excluded due to a suboptimal MRCP. The accuracy of MRCP for variant biliary anatomy was calculated. Results: MRCP accurately predicted the biliary anatomy in 38 of 39 cases (97%). Standard biliary anatomy was predicted by MRCP in 25 (64%) donors (100% sensitivity). Variant biliary anatomy was noted in 14 (36%) IOCs, of which MRCP predicted the precise anatomy of 13 variants (93% sensitivity). The two most common variations were drainage of the RPSD into the LHD (50%) and the triple confluence of the RASD, RPSD and LHD (21%). Conclusion: MRCP is a sensitive imaging tool for precise preoperative mapping of biliary variations, which is critical to surgical decision-making in living related liver transplantation.Keywords: intraoperative cholangiogram, liver transplantation, living related donors, magnetic resonance cholangiopancreatography (MRCP)
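The abstract's accuracy figures follow directly from the raw counts it reports, which can be recomputed as a check:

```python
def pct(correct, total):
    # Percentage rounded to the nearest integer, as reported in the abstract.
    return round(100 * correct / total)

overall_accuracy = pct(38, 39)      # MRCP matched IOC in 38 of 39 donors
standard_sensitivity = pct(25, 25)  # all 25 standard anatomies predicted
variant_sensitivity = pct(13, 14)   # 13 of 14 variants predicted
```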
Procedia PDF Downloads 397359 Electrochemical APEX for Genotyping MYH7 Gene: A Low Cost Strategy for Minisequencing of Disease Causing Mutations
Authors: Ahmed M. Debela, Mayreli Ortiz , Ciara K. O´Sullivan
Abstract:
The completion of the Human Genome Project (HGP) has paved the way for mapping the diversity of the overall genome sequence, which helps in understanding the genetic causes of inherited diseases and susceptibility to drugs or environmental toxins. Arrayed primer extension (APEX) is a microarray-based minisequencing strategy for screening disease-causing mutations. It is derived from Sanger DNA sequencing and uses fluorescently labelled dideoxynucleotides (ddNTPs) for termination of a growing DNA strand from a primer whose 3´ end is designed to lie immediately upstream of the site of a single nucleotide polymorphism (SNP). The use of DNA polymerase gives APEX very high accuracy and specificity, which in turn makes it a method of choice for multiplex SNP detection. Coupling the high specificity of this method with the high sensitivity, low cost and compatibility with miniaturization of electrochemical techniques offers an excellent platform for mutation detection as well as sequencing of DNA templates. We are developing an electrochemical APEX for the analysis of SNPs found in the MYH7 gene for a group of cardiomyopathy patients. The ddNTPs were labelled with four different redox-active compounds with four distinct potentials. Thiolated oligonucleotide probes were immobilised on gold and glassy carbon substrates, followed by hybridisation with complementary target DNA just adjacent to the base to be extended by the polymerase. Electrochemical interrogation was performed after the incorporation of the redox-labelled dideoxynucleotide. The work involved the synthesis and characterisation of the redox-labelled ddNTPs, the optimisation and characterisation of surface functionalisation strategies, and the nucleotide incorporation assays.Keywords: arrayed primer extension, labelled ddNTPs, electrochemical, mutations
Procedia PDF Downloads 246358 Impact of Map Generalization in Spatial Analysis
Authors: Lin Li, P. G. R. N. I. Pussella
Abstract:
When representing spatial data and their attributes on different types of maps, the scale plays a key role in the process of map generalization. The process consists of two main operators: selection and omission. Once data are selected, they undergo several geometric transformations such as elimination, simplification, smoothing, exaggeration, displacement, aggregation and size reduction. As a result of these operations, at different levels of data, the geometry of spatial features such as length, sinuosity, orientation, perimeter and area is altered. This is most severe in the preparation of small-scale maps, since the cartographer does not have enough space to represent all the features on the map. Yet when GIS users want to analyze a set of spatial data, they retrieve a data set and perform the analysis without considering very important characteristics such as the scale, the purpose of the map and the degree of generalization. Further, GIS users use and compare different maps with different degrees of generalization. Sometimes GIS users go beyond the scale of the source map using the zoom-in facility, violating the basic cartographic rule that it is not suitable to create a larger-scale map from a smaller-scale map. The main objective of this study is to discuss the effect of map generalization on GIS analysis. Three digital maps at different scales (1:10,000, 1:50,000 and 1:250,000), prepared by the Survey Department of Sri Lanka, the national mapping agency of Sri Lanka, were used. Features common to all three maps were used, and an overlay analysis was carried out by repeating the data in different combinations; road, river and land use data sets were used for the study. A simple model, finding the best place for a wildlife park, was used to identify the effects.
The results show remarkable effects of the different degrees of generalization: different locations with different geometries were received as outputs from the analysis. The study suggests that there should be reasonable methods to overcome this effect. As a solution, it is recommended to bring all the data sets to a common scale before performing the analysis.Keywords: generalization, GIS, scales, spatial analysis
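The geometric effect of generalization on feature length can be demonstrated with a standard simplification operator, Ramer-Douglas-Peucker, applied to a sinuous line (the coordinates are an invented river, not the Sri Lankan data):

```python
import math

def rdp(points, eps):
    """Ramer-Douglas-Peucker simplification, a standard generalization operator."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        x0, y0 = points[i]
        # Perpendicular distance from point i to the chord between the endpoints.
        num = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1)
        d = num / math.hypot(x2 - x1, y2 - y1)
        if d > dmax:
            dmax, idx = d, i
    if dmax > eps:
        return rdp(points[:idx + 1], eps)[:-1] + rdp(points[idx:], eps)
    return [points[0], points[-1]]

def length(points):
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

river = [(0, 0), (1, 0.8), (2, -0.6), (3, 0.9), (4, -0.7), (5, 0)]
simplified = rdp(river, eps=1.0)
# The simplified (generalized) line is shorter: sinuosity is lost,
# which is exactly the distortion the study measures in overlay analysis.
```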
Procedia PDF Downloads 328357 A Neuroscience-Based Learning Technique: Framework and Application to STEM
Authors: Dante J. Dorantes-González, Aldrin Balsa-Yepes
Abstract:
Existing learning techniques such as problem-based learning, project-based learning, or case-study learning focus mainly on technical details but give no specific guidelines on the learner's experience and emotional learning aspects such as arousal salience and valence, emotional states being important factors affecting engagement and retention. Some approaches involving emotion in educational settings, such as social and emotional learning, lack neuroscientific rigor and the use of specific neurobiological mechanisms; neurobiology approaches, on the other hand, lack educational applicability, and educational approaches mainly focus on cognitive aspects and disregard conditioning learning. The authors first explain the reasons why it is hard to learn thoughtfully, then use the method of neurobiological mapping to track the main limbic system functions, such as the reward circuit, and their relations with perception, memories, motivations, sympathetic and parasympathetic reactions, and sensations, as well as the brain cortex. The authors conclude by explaining the major finding: the mechanisms of nonconscious learning and the triggers that guarantee long-term memory potentiation. Afterward, the educational framework for practical application and the instructors' guidelines are established. An implementation example in engineering education is given, namely the study of tuned-mass dampers for the attenuation of earthquake-induced oscillations in skyscrapers. This work represents an original learning technique based on nonconscious learning mechanisms to enhance long-term memories, complementing existing cognitive learning methods.Keywords: emotion, emotion-enhanced memory, learning technique, STEM
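For the tuned-mass-damper case study, a worked example students could compute is Den Hartog's classical optimum tuning for an undamped main structure; this is the textbook formula, not necessarily the design targets used in the course, and the mass ratio below is illustrative.

```python
import math

def den_hartog(mass_ratio):
    # Den Hartog's optimum for a TMD on an undamped primary structure:
    # tuning ratio f = omega_damper / omega_structure = 1 / (1 + mu),
    # damper damping ratio zeta = sqrt(3 mu / (8 (1 + mu)^3)).
    f_opt = 1.0 / (1.0 + mass_ratio)
    zeta_opt = math.sqrt(3 * mass_ratio / (8 * (1 + mass_ratio) ** 3))
    return f_opt, zeta_opt

# A 5 % damper-to-structure mass ratio, a common order of magnitude:
f, z = den_hartog(0.05)
```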
Procedia PDF Downloads 91