Search results for: insurance research database
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24968

24728 Alphabet Recognition Using Pixel Probability Distribution

Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay

Abstract:

Our project topic is “Alphabet Recognition Using Pixel Probability Distribution”. The project uses techniques of image processing and machine learning in computer vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten, or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR is sometimes used for signature recognition in banks and other high-security buildings. One popular mobile application reads a visiting card and stores it directly to the contacts. OCR is also used in radar systems for reading speeding drivers' license plates, among many other applications. Our implementation was done using Visual Studio and OpenCV (Open Source Computer Vision), and our algorithm is based on neural networks (machine learning). The project was implemented in three modules. (1) Training: this module handles database generation. The database was generated using two methods: (a) run-time generation, in which the database is generated at run time using the inbuilt fonts of the OpenCV library, with no human intervention necessary; (b) contour detection, in which a 'jpeg' template containing different fonts of an alphabet is converted to a weighted matrix using specialized OpenCV functions (contour detection and blob detection). The main advantage of this type of database generation is that the algorithm becomes self-learning and the final database requires little storage (119 KB, precisely). (2) Preprocessing: the input image is pre-processed using image processing operations such as adaptive thresholding, binarization, and dilation, and is made ready for segmentation. Segmentation includes the extraction of lines, words, and letters from the processed text image. (3) Testing and prediction: the extracted letters are classified and predicted using the neural network algorithm. The algorithm recognizes an alphabet based on mathematical parameters calculated from the database and the weight matrix of the segmented image.
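
As a rough illustration of the preprocessing and segmentation modules described above, the following Python/OpenCV sketch applies adaptive thresholding and dilation and then extracts letter candidates via contour detection. The original project was built in Visual Studio; the file path, kernel sizes, area filter, and the 20x20 normalization below are illustrative assumptions, not the paper's exact settings:

```python
import cv2
import numpy as np

def segment_letters(image_path):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Adaptive thresholding handles uneven illumination in scanned pages.
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY_INV, 31, 10)
    # Dilation joins broken strokes before contour detection.
    dilated = cv2.dilate(binary, np.ones((3, 3), np.uint8), iterations=1)
    contours, _ = cv2.findContours(dilated, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    letters = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h > 50:                         # drop speckle noise
            # Normalize each candidate letter to a fixed-size matrix.
            roi = cv2.resize(binary[y:y + h, x:x + w], (20, 20))
            letters.append(((x, y), roi))
    # Sort left-to-right so letters come out in reading order.
    return [roi for _, roi in sorted(letters, key=lambda t: t[0][0])]
```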

Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix

Procedia PDF Downloads 357
24727 Development of a Data-Driven Method for Diagnosing the State of Health of Battery Cells, Based on the Use of an Electrochemical Aging Model, with a View to Their Use in Second Life

Authors: Desplanches Maxime

Abstract:

Accurate estimation of the remaining useful life of lithium-ion batteries for electronic devices is crucial. Data-driven methodologies encounter challenges related to data volume and acquisition protocols, particularly in capturing a comprehensive range of aging indicators. To address these limitations, we propose a hybrid approach that integrates an electrochemical model with state-of-the-art data analysis techniques, yielding a comprehensive database. Our methodology involves infusing an aging phenomenon into a Newman model, leading to the creation of an extensive database capturing various aging states based on non-destructive parameters. This database serves as a robust foundation for subsequent analysis. Leveraging advanced data analysis techniques, notably principal component analysis (PCA) and t-Distributed Stochastic Neighbor Embedding (t-SNE), we extract pivotal information from the data. This information is harnessed to construct a regression function using either random forest or support vector machine algorithms. The resulting predictor demonstrates a 5% error margin in estimating remaining battery life, providing actionable insights for optimizing usage. The Newman model was calibrated for aging and performance using data from a European project called Teesmat and was then initialized numerous times with different aging values, for instance, with varying thicknesses of the SEI (Solid Electrolyte Interphase). This comprehensive approach ensures a thorough exploration of battery aging dynamics, enhancing the accuracy and reliability of the predictive model. Beyond enabling precise remaining-life predictions, this database-driven approach offers valuable insights for optimizing battery usage and adapting the predictor to various scenarios, underscoring the practical significance of our method for better decision-making in lithium-ion battery management.
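
A condensed sketch of the kind of pipeline the abstract describes (dimensionality reduction followed by a regression function) is shown below; synthetic data stands in for the Newman-model aging database, and all sizes, column meanings, and effect strengths are invented:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 12))                   # non-destructive aging indicators
y = 100 - 5 * X[:, 0] + rng.normal(0, 2, 2000)    # remaining life, % (toy)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=5),                          # keep the dominant modes
    RandomForestRegressor(n_estimators=300, random_state=0))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("mean absolute error (%):", np.abs(pred - y_te).mean())
```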

Keywords: Li-ion battery, aging, diagnostics, data analysis, prediction, machine learning, electrochemical model, regression

Procedia PDF Downloads 33
24726 CE Method for Development of Japan's Stochastic Earthquake Catalogue

Authors: Babak Kamrani, Nozar Kishi

Abstract:

A stochastic catalogue represents the event module of earthquake loss estimation models. It comprises a series of events with different magnitudes and corresponding frequencies/probabilities. To develop a stochastic catalogue, random or uniform sampling methods are typically used to sample events from the seismicity model; covering the full Magnitude-Frequency Distribution (MFD) with these methods requires generating a huge number of events. The Characteristic Event (CE) method instead chooses events according to the interests of the insurance industry. We divide the MFD of each source into bins, chosen according to the probabilities of interest to the insurance industry. First, we collected the information for the available seismic sources, divided into fault sources, subduction sources, and events without a specific fault source. We developed the MFD for each individual and areal source based on the seismicity of the sources. Afterward, we calculated the CE magnitudes corresponding to the desired probabilities. To develop the stochastic catalogue, we also introduced uncertainty in the locations of the events.
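
The binning step can be illustrated with a toy Gutenberg-Richter MFD; the a- and b-values, the bin edges, and the midpoint rule for the CE magnitude below are illustrative assumptions, not the paper's actual choices:

```python
import numpy as np

a_val, b_val = 4.0, 1.0              # hypothetical GR parameters for a source

def annual_rate_ge(m):               # N(M >= m) = 10**(a - b*m)
    return 10 ** (a_val - b_val * m)

bin_edges = np.array([5.0, 5.5, 6.0, 6.5, 7.0, 7.5])   # insurer-driven bins
for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
    rate = annual_rate_ge(lo) - annual_rate_ge(hi)      # events/yr in the bin
    ce_mag = 0.5 * (lo + hi)         # simple midpoint as the CE magnitude
    print(f"CE M{ce_mag:.2f}: rate {rate:.4f}/yr, "
          f"return period {1 / rate:.0f} yr")
```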

Keywords: stochastic catalogue, earthquake loss, uncertainty, characteristic event

Procedia PDF Downloads 268
24725 A Study on the Assessment of Prosthetic Infection after Total Knee Replacement Surgery

Authors: Chun-Lang Chang, Chun-Kai Liu

Abstract:

In this study, patients who underwent total knee replacement surgery, drawn from the 2010 National Health Insurance database, were adopted as the study participants. The important factors were screened and selected through a literature review and interviews with physicians. The weights of these factors were obtained through the Cross Entropy Method (CE), Genetic Algorithm Logistic Regression (GALR), and Particle Swarm Optimization (PSO). The weights from the respective algorithms, coupled with Excel VBA, were then used to construct a Case-Based Reasoning (CBR) system. Statistical tests show that GALR and PSO produced no significant differences; the accuracy of both models was above 97%, and the area under the ROC curve for both models exceeded 0.87. This study shall serve as a reference for medical staff in the clinical assessment of infections, in order to effectively enhance medical service quality and efficiency, avoid unnecessary medical waste, and substantially contribute to resource allocation in medical institutions.
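
For illustration, the sketch below fits a plain logistic regression (standing in for the GA- and PSO-tuned variants the paper implements) to simulated infection data and reports the factor weights and area under the ROC curve; all feature names, sizes, and effect strengths are invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1500
X = rng.normal(size=(n, 6))          # e.g. age, BMI, diabetes, op time, ...
logit = 0.8 * X[:, 0] + 1.2 * X[:, 3] - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # infection flag

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("factor weights:", clf.coef_.round(2))
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]).round(3))
```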

Keywords: Case Based Reasoning, Cross Entropy Method, Genetic Algorithm Logistic Regression, Particle Swarm Optimization, Total Knee Replacement Surgery

Procedia PDF Downloads 296
24724 Bibliometric Analysis of the Research Progress on Graphene Inks from 2008 to 2018

Authors: Jean C. A. Sousa, Julio Cesar Maciel Santos, Andressa J. Rubio, Edneia A. S. Paccola, Natália U. Yamaguchi

Abstract:

A bibliometric analysis of the Web of Science database was used to identify the overall scientific output on graphene inks to date (2008 to 2018). The objective of this study was to evaluate the evolutionary tendency of graphene ink research and to identify its main aspects, aiming to provide data that can guide future work. The contributions of different researchers, languages, thematic categories, periodicals, places of publication, institutes, funding agencies, cited articles, and applications were analyzed. The results revealed a growing number of annual publications; of the 258 papers found, 107 met the inclusion criteria. Three main applications were identified: synthesis and characterization, electronics, and surfaces. The most relevant research on graphene inks is summarized in this article; graphene inks for electronic devices was the most prevalent theme in the research trends during the studied period. It is estimated that this theme will remain prominent and will help direct future research in this area.

Keywords: bibliometric, coating, nanomaterials, scientometrics

Procedia PDF Downloads 139
24723 Developing an Exhaustive and Objective Definition of Social Enterprise through Computer Aided Text Analysis

Authors: Deepika Verma, Runa Sarkar

Abstract:

One of the prominent debates in the social entrepreneurship literature has been whether entrepreneurial work for social well-being by for-profit organizations can be classified as social entrepreneurship. Of late, the scholarship has reached a consensus: there seems little sense in confining social entrepreneurship to non-profit organizations. Encouraged by this research, a growing number of businesses engaged in filling the social infrastructure gaps in developing countries are calling themselves social enterprises. These organizations are diverse in their ownership, size, objectives, operations, and business models. The lack of a comprehensive definition of social enterprise leads to three issues. Firstly, researchers may face difficulty in creating a database of social enterprises because the choice of an entity as a social enterprise becomes subjective, or rests on pre-defined parameters set by the researcher that are not replicable. Secondly, practitioners who use 'social enterprise' in their vision/mission statement(s) may find it difficult to adjust their business models accordingly, especially when they face the dilemma of choosing social well-being over business viability. Thirdly, social enterprise and social entrepreneurship attract a lot of donor funding and venture capital; in the absence of a comprehensive definitional guide, donors and investors may find it difficult to assign grants and investments. It therefore becomes necessary to develop an exhaustive and objective definition of social enterprise and to examine whether the understandings of academics and practitioners about social enterprise match. This paper develops a dictionary of words often associated with social enterprise and/or social entrepreneurship. It further compares two lexicographic definitions of social enterprise derived from the abstracts of academic journal papers and trade publications extracted from the EBSCO database, using the 'tm' package in R.
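
A rough Python analogue of the dictionary-building step (the paper itself uses the 'tm' package in R) might look like the following, applied here to invented stand-in snippets rather than the real EBSCO corpus:

```python
from sklearn.feature_extraction.text import CountVectorizer

abstracts = [
    "social enterprise blends business viability with social well-being",
    "non-profit and for-profit models pursue social infrastructure goals",
    "donor funding flows to ventures with a social mission",
]
vec = CountVectorizer(stop_words="english")
counts = vec.fit_transform(abstracts).sum(axis=0).A1   # term totals
freq = sorted(zip(vec.get_feature_names_out(), counts),
              key=lambda t: -t[1])
print(freq[:10])   # most frequent terms associated with 'social enterprise'
```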

Keywords: EBSCO database, lexicographic definition, social enterprise, text mining

Procedia PDF Downloads 359
24722 Local Texture and Global Color Descriptors for Content Based Image Retrieval

Authors: Tajinder Kaur, Anu Bala

Abstract:

An image retrieval system is a computer system for browsing, searching, and retrieving images from a large database of digital images. A new algorithm for content-based image retrieval (CBIR) is presented in this paper. The proposed method combines color and texture features that capture the global and local information of the image. The local texture feature is extracted using local binary patterns (LBP), computed from the local differences between the center pixel and its neighbors. For the global color feature, the color histogram (CH) is used, calculated over the RGB (red, green, and blue) channels separately. The performance of the proposed method is tested on the Corel-1000 database of natural images. The results show a significant improvement in the evaluation measures compared to LBP and CH used individually.
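
A sketch of the combined feature vector is shown below, using scikit-image's uniform LBP for local texture and per-channel histograms for global color; the neighborhood size, radius, and bin counts are illustrative, not the paper's exact settings:

```python
import numpy as np
from skimage.feature import local_binary_pattern

def cbir_features(rgb_image, P=8, R=1, color_bins=16):
    gray = rgb_image.mean(axis=2)
    # Uniform LBP yields P + 2 distinct codes; histogram them for texture.
    lbp = local_binary_pattern(gray, P, R, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2),
                               density=True)
    # Per-channel color histograms capture the global color distribution.
    color_hist = [np.histogram(rgb_image[..., c], bins=color_bins,
                               range=(0, 256), density=True)[0]
                  for c in range(3)]
    return np.concatenate([lbp_hist] + color_hist)

# Retrieval then ranks database images by distance between feature vectors,
# e.g. np.abs(f_query - f_db).sum() (L1 distance).
```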

Keywords: color, texture, feature extraction, local binary patterns, image retrieval

Procedia PDF Downloads 330
24721 Public Preferences for Lung Cancer Screening in China: A Discrete Choice Experiment

Authors: Zixuan Zhao, Lingbin Du, Le Wang, Youqing Wang, Yi Yang, Jingjun Chen, Hengjin Dong

Abstract:

Objectives: Few results on public attitudes toward lung cancer screening are available, either in China or abroad. This study aimed to identify preferred lung cancer screening modalities in a Chinese population and to predict the uptake rates of different modalities. Materials and Methods: A discrete choice experiment questionnaire was administered to 392 Chinese individuals aged 50–74 years who were at high risk for lung cancer. Each choice set had two lung screening options and an opt-out option, and respondents were asked to choose the one they preferred most. Both mixed logit analysis and stepwise logistic analysis were conducted to explore whether preferences were related to respondent characteristics and to identify which respondents were more likely to opt out of any screening. Results: In the mixed logit analysis, the attributes predictive of choice at the 1% level of statistical significance were the screening interval, the screening venue, and out-of-pocket costs. The preferred screening modality was screening by low-dose computed tomography (LDCT) plus a blood test once a year in a general hospital at a cost of RMB 50; this could increase the uptake rate by 0.40 compared to the baseline setting. In the stepwise logistic regression, those with no endowment insurance were more likely to opt out, while those who were older, were housewives/househusbands, had a health check habit, or had commercial endowment insurance were less likely to opt out of a screening programme. Conclusions: There was considerable variance between the real risk and the self-perceived risk of lung cancer among respondents, and further research is required in this area. Lung cancer screening uptake can be increased by offering various screening modalities, which can help policymakers further design the screening modality.

Keywords: lung cancer, screening, China, discrete choice experiment

Procedia PDF Downloads 219
24720 A Geometric Based Hybrid Approach for Facial Feature Localization

Authors: Priya Saha, Sourav Dey Roy Jr., Debotosh Bhattacharjee, Mita Nasipuri, Barin Kumar De, Mrinal Kanti Bhowmik

Abstract:

Biometric face recognition technology (FRT) has gained a lot of attention due to its extensive variety of applications from both security and non-security perspectives. It has emerged as a secure solution for the identification and verification of personal identity. Although other biometric methods such as fingerprint and iris scans are available, FRT has proven an efficient technology because of its user-friendliness and contactless operation. Accurate facial feature localization plays an important role in many facial analysis applications, including biometrics and emotion recognition, but certain factors make it a challenging task. On the human face, expressions appear through subtle movements of facial muscles and are influenced by internal emotional states. These non-rigid facial movements cause noticeable alterations in the locations and usual shapes of facial landmarks, and sometimes create occlusions in facial feature areas, making face recognition a difficult problem. The paper proposes a new hybrid technique for automatic landmark detection in both neutral and expressive frontal and near-frontal face images. The method uses thresholding, sequential searching, and other image processing techniques to locate the landmark points on the face. A Graphical User Interface (GUI) based software tool was also designed that automatically detects 16 landmark points around the eyes, nose, and mouth, the regions most affected by changes in facial muscles. The proposed system has been tested on the widely used JAFFE and Cohn-Kanade databases, as well as on the DeitY-TU face database, which was created in the Biometrics Laboratory of Tripura University under a research project funded by the Department of Electronics & Information Technology, Govt. of India. The performance of the proposed method has been evaluated in terms of error measure and accuracy: detection rates of 98.82% on the JAFFE database, 91.27% on the Cohn-Kanade database, and 93.05% on the DeitY-TU database. We have also conducted a comparative study of our proposed method against techniques developed by other researchers. Based on the located features, future work will focus on emotion-oriented systems through Action Unit (AU) detection.

Keywords: biometrics, face recognition, facial landmarks, image processing

Procedia PDF Downloads 380
24719 Analysis of the Effect of Increased Self-Awareness on the Amount of Food Thrown Away

Authors: Agnieszka Dubiel, Artur Grabowski, Tomasz Przerywacz, Mateusz Roganowicz, Patrycja Zioty

Abstract:

Food waste is one of the most significant challenges humanity faces nowadays. Every year, reports from global organizations show the scale of the phenomenon, yet society's awareness is still insufficient. One-third of the food produced in the world is wasted at various points in the food supply chain, from delivery through food preparation and distribution to final sale and consumption. The first step in understanding and resisting the phenomenon is a thorough analysis of everyday human behavior, understood here as finding the correlation between the type of food and the reason it is thrown out and wasted. This analysis was identified as a critical first step toward developing technology to prevent food waste. In this paper, the problem was analyzed by focusing on inhabitants of Central Europe, especially Poland, aged 20-30. The paper describes how data were collected through dedicated software and an organized database; the proposed database contains information on the amount, type, and reasons for wasting food in households. A literature review supported the work in answering the research questions, comparing the situation in Poland with the problem as analyzed in other countries, and finding research gaps. The article examines the causes and quantity of food waste in detail, complementing previous reviews by emphasizing social and economic innovation in Poland's food waste management. The paper recommends a course of action for future research on food waste management and prevention related to the handling and disposal of food, with an emphasis on households, the last link in the supply chain.

Keywords: food waste, food waste reduction, consumer food waste, human-food interaction

Procedia PDF Downloads 76
24718 SIPTOX: Spider Toxin Database Information Repository System of Protein Toxins from Spiders by Using MySQL Method

Authors: Iftikhar Tayubi, Tabrej Khan, Rayan Alsulmi, Abdulrahman Labban

Abstract:

Spiders produce a special kind of substance called a toxin, composed of many types of protein that differ from species to species. Spider toxin consists of several proteins and non-proteins, including various categories of toxins such as myotoxins, neurotoxins, cardiotoxins, dendrotoxins, haemorrhagins, and fibrinolytic enzymes. Protein sequence information for the toxins, with references, was derived from the literature and public databases. Previous findings suggest that spider toxins are a strong candidate for treating different types of tumors and cancers. Many therapeutic regimens cause more side effects than therapeutic benefit; hence, a different approach must be adopted for the treatment of cancer. Combinations of drugs are being encouraged, and dramatic outcomes have been reported. Spider toxin is a natural cytotoxic compound and is therefore being used to treat different types of tumors; in particular, its positive effect on breast cancer has been reported over the last few decades. SIPTOXD provides a single source of information about spider toxins, which will be useful for pharmacologists, neuroscientists, toxicologists, and medicinal chemists. The well-ordered and accessible web interface allows users to explore detailed information about spiders and toxin proteins, including the common name, scientific name, entry ID, entry name, protein name, and length of the protein sequence. The utility of this database is that it provides a user-friendly interface for retrieving information about the spiders, toxins, and toxin proteins of different spider species, satisfying the demands of the scientific community by providing in-depth knowledge about spiders and their toxins. The database was implemented using MySQL and PHP, and SmartDraw was used for the design. Users can thus navigate from one section to another, depending on their field of interest. The database contains a wealth of information on species, toxins, clinical data, etc., and will be useful to the scientific community, basic researchers, and the pharmaceutical industry.
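
A minimal sketch of a toxin-entry table with the fields listed above might look like the following, using Python's built-in sqlite3 in place of the paper's MySQL/PHP stack; the column names and the sample row are illustrative:

```python
import sqlite3

conn = sqlite3.connect("siptoxd_demo.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS toxin_protein (
        entry_id     TEXT PRIMARY KEY,
        entry_name   TEXT,
        common_name  TEXT,    -- spider common name
        sci_name     TEXT,    -- spider scientific name
        protein_name TEXT,
        seq_length   INTEGER, -- length of the protein sequence
        category     TEXT     -- e.g. neurotoxin, myotoxin
    )""")
conn.execute("INSERT OR REPLACE INTO toxin_protein VALUES (?,?,?,?,?,?,?)",
             ("TX0001", "demo_entry", "Black widow",
              "Latrodectus mactans", "alpha-latrotoxin", 1401, "neurotoxin"))
print(conn.execute("SELECT sci_name, protein_name, seq_length "
                   "FROM toxin_protein").fetchall())
```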

Keywords: siptoxd, php, mysql, toxin

Procedia PDF Downloads 147
24717 3D Objects Indexing Using Spherical Harmonic for Optimum Measurement Similarity

Authors: S. Hellam, Y. Oulahrir, F. El Mounchid, A. Sadiq, S. Mbarki

Abstract:

In this paper, we propose a method for three-dimensional (3-D) model indexing based on a new descriptor defined using spherical harmonics. The purpose of the method is to minimize both the processing time on the database of object models and the time needed to search for objects similar to a query object. First, we define the new descriptor using a new division of the 3-D object within a sphere. We then define a new distance that is used to search for similar objects in the database.

Keywords: 3D indexation, spherical harmonic, similarity of 3D objects, measurement similarity

Procedia PDF Downloads 401
24716 Research Activity in Computational Science Using High Performance Computing: Co-Authorship Network Analysis

Authors: Sul-Ah Ahn, Youngim Jung

Abstract:

The research activities of computational scientists using high-performance computing are analyzed using bibliometric approaches. This study aims to provide computational scientists using high-performance computing, and relevant policy planners, with useful bibliometric results for assessing research activities. To this end, we carried out a co-authorship network analysis of journal articles as a case study, using articles in Elsevier's Scopus database covering the period 2006-2015. We ranked authors in the computational science field using high-performance computing by the number of papers published during the ten years from 2006. Finally, we drew the co-authorship network for the top 50 authors and their co-authors and described some features of the network in relation to author rank. Suggestions for further studies are discussed.
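
The network construction itself is straightforward: each paper contributes a clique of its authors, with edge weights counting joint papers. A toy sketch with networkx follows (author names invented; the study's data come from Scopus):

```python
from itertools import combinations
import networkx as nx

papers = [["Kim", "Lee", "Park"], ["Kim", "Lee"], ["Ahn", "Jung", "Kim"]]
G = nx.Graph()
for authors in papers:
    for a, b in combinations(authors, 2):
        # Increment the edge weight for each additional joint paper.
        w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
        G.add_edge(a, b, weight=w)

# Rank authors by weighted degree, then inspect overall network structure.
degree = dict(G.degree(weight="weight"))
print(sorted(degree.items(), key=lambda t: -t[1]))
print("density:", nx.density(G))
```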

Keywords: co-authorship network analysis, computational science, high performance computing, research activity

Procedia PDF Downloads 280
24715 Residual Analysis and Ground Motion Prediction Equation Ranking Metrics for Western Balkan Strong Motion Database

Authors: Manuela Villani, Anila Xhahysa, Christopher Brooks, Marco Pagani

Abstract:

The geological structure of the Western Balkans is strongly affected by the collision between the Adria microplate and the southwestern margin of Eurasia, resulting in a considerably active seismic region. The NATO-supported Harmonization of Seismic Hazard Maps in the Western Balkan Countries Project (BSHAP) (2007-2011, 2012-2015) enabled the preparation of new seismic hazard maps of the Western Balkans, but when inspecting the seismic hazard models later produced by these countries at the national scale, significant differences in design PGA values are observed at the borders, for instance between northern Albania and Montenegro, or southern Albania and Greece. Given that the catalogues were unified and the seismic sources were defined within the BSHAP framework, the differences evidently arise from the selection of Ground Motion Prediction Equations (GMPEs), generally the component with the highest impact on seismic hazard assessment. At the time of the project, only a modest database was available, namely 672 three-component records, whereas this strong motion database has since grown considerably, to 20,939 records with Mw ranging from 3.7 to 7 and epicentral distances from 0.47 km to 490 km. Statistical analysis of the strong motion database showed a lack of recordings in the moderate-to-large magnitude and short-distance ranges; there is therefore a need to re-evaluate the GMPEs in light of the recently updated database and the new generations of ground motion models. In some cases, an event was more extensively documented in one database than another; the 1979 Montenegro earthquake, for example, has a considerably larger number of records in the BSHAP analogue strong motion database than in ESM23. Therefore, the strong motion flat-file provided by the BSHAP project was merged with the ESM23 database for the polygon studied in this project. After performing the preliminary residual analysis, the candidate GMPEs were identified. This was done using the GMPE performance metrics available within the SMT in the OpenQuake platform; the Likelihood Model and Euclidean Distance-Based Ranking (EDR) were used. Finally, a GMPE logic tree was selected for this study, and following the selection of candidate GMPEs, model weights were assigned using the average sample log-likelihood approach of Scherbaum.
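
The average sample log-likelihood (LLH) score of Scherbaum et al. can be sketched as follows: each GMPE's normal predictive distribution (in log ground-motion space) is evaluated at the observations, and model weights follow as 2**(-LLH), normalized. The observations, predictions, and sigma below are toy values; in practice the SMT computes this from the residuals of the merged flat-file:

```python
import numpy as np
from scipy.stats import norm

def llh(obs_ln_y, pred_mean_ln_y, sigma_ln_y):
    # LLH = -(1/N) * sum(log2 f(obs)), f being the GMPE's normal pdf in ln-space
    logpdf = norm.logpdf(obs_ln_y, pred_mean_ln_y, sigma_ln_y)
    return -np.mean(logpdf / np.log(2))

obs = np.log([0.12, 0.30, 0.08])                    # observed PGA (g), ln units
gmpe_a = llh(obs, np.log([0.10, 0.25, 0.09]), 0.6)  # model A fits well
gmpe_b = llh(obs, np.log([0.05, 0.60, 0.02]), 0.6)  # model B fits poorly
# Weights from LLH (lower LLH -> higher weight): w_i proportional to 2**(-LLH_i)
w = np.array([2 ** -gmpe_a, 2 ** -gmpe_b])
w /= w.sum()
print("LLH:", round(gmpe_a, 2), round(gmpe_b, 2), "weights:", w.round(2))
```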

Keywords: residual analysis, GMPE, western balkan, strong motion, openquake

Procedia PDF Downloads 45
24714 Analysis of the Relationship between Length of Hospital Stay and Economic Loss for the Top Ten Productive-Age Diseases among Inpatients of Inche Abdul Moeis Hospital, Samarinda, Indonesia

Authors: Tri Murti Tugiman, Awalyya Fasha

Abstract:

This research analyzes the magnitude of the economic losses incurred when a person suffers from one of the ten most prevalent productive-age diseases at Inche Abdul Moeis Hospital, Samarinda. It was a descriptive survey based on secondary data analysis. The population for the analysis of economic losses comprised all inpatients suffering from the ten most prevalent productive-age diseases at the hospital in 2011, and stratified random sampling was used to select a sample of 77 people. The results indicate that the direct costs the community incurred to obtain medical services at the hospital were IDR 74,437,520; the indirect costs incurred during care were IDR 10,562,000; and the amount lost due to sickness was IDR 5,377,800, giving a total economic loss of IDR 90,377,320. The total length of hospitalization across the respondents was 171 days. This study suggests that such economic losses could be prevented through clean and healthy lifestyle changes, together with insurance coverage.

Keywords: hospitalized, economic lost, productive age diseases, secondary data analysis

Procedia PDF Downloads 448
24713 Indoor Localization by Pattern Matching Method Based on Extended Database

Authors: Gyumin Hwang, Jihong Lee

Abstract:

This paper studied a Chirp Spread Spectrum (CSS)-based indoor localization system, which is easy to implement, inexpensive to build, and covers a larger area than other systems. However, this system is affected by reflected distance data, a localization problem caused by the multi-path effect. Error caused by multi-path is difficult to correct because the indoor environment cannot be fully described. In this paper, to solve the multi-path problem, we supplement the localization system with a pattern-matching method based on an extended database, which improves the precision of the estimated position. The method was verified by experiments in a gymnasium. The database was constructed at 1 m intervals, and 16 samples were collected from random positions inside the region of the database points. The results, presented in graphs and tables, show higher accuracy than the existing method.
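
The pattern-matching step can be sketched as a nearest-fingerprint search: each database point stores the statistics of its ranging samples, and a query is assigned to the point with the smallest Mahalanobis distance (the metric named in the keywords below). The grid positions, means, and covariances here are invented:

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

db = {   # position (x, y) in m -> (mean ranging vector, covariance)
    (0.0, 0.0): (np.array([3.1, 4.2, 5.0]), np.eye(3) * 0.20),
    (1.0, 0.0): (np.array([2.4, 4.6, 5.3]), np.eye(3) * 0.25),
    (0.0, 1.0): (np.array([3.5, 3.8, 5.6]), np.eye(3) * 0.30),
}

def localize(measurement):
    best, best_d = None, np.inf
    for pos, (mu, cov) in db.items():
        # Mahalanobis distance accounts for per-point ranging noise.
        d = mahalanobis(measurement, mu, np.linalg.inv(cov))
        if d < best_d:
            best, best_d = pos, d
    return best, best_d

print(localize(np.array([2.5, 4.5, 5.2])))   # -> nearest DB point (1.0, 0.0)
```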

Keywords: chirp spread spectrum, indoor localization, pattern-matching, time of arrival, multi-path, mahalanobis distance, reception rate, simultaneous localization and mapping, laser range finder

Procedia PDF Downloads 217
24712 Structure-Based Virtual Screening to Identify CLDN4 Inhibitors

Authors: Jayanthi Sivaraman

Abstract:

Claudins are important components of the tight junctions that play a key role in paracellular permeability. Among the various members of the Claudin family, Claudin-4 (CLDN4) is found to be overexpressed in ovarian and pancreatic carcinomas and other epithelial malignancies. Therefore, in this study, an attempt has been made to identify potent inhibitors of CLDN4 from the ZINC database using virtual screening, molecular docking, and molecular dynamics simulations. A well-refined molecular model of CLDN4 was built using Prime of Schrödinger v10.2 (template: PDB ID 4P79). Approximately 6 million compounds from the ZINC database were subjected to high-throughput virtual screening (HTVS) against the active site of CLDN4. Molecular docking using GLIDE predicted ARG31, ASN142, ASP146, and ARG158 as critically important residues. Furthermore, three compounds from the ZINC database (ZINC96331839, ZINC36533519, and ZINC75819394) showed highly promising ADME properties and binding affinity with stable conformations. The therapeutic efficacy of these lead compounds is to be evaluated and confirmed by in-vitro and in-vivo studies, which could lead to the development of novel anti-cancer drugs.

Keywords: ADME property, inhibitors, molecular docking, virtual screening

Procedia PDF Downloads 304
24711 Challenges in Environmental Governance: A Case Study of Risk Perceptions of Environmental Agencies Involved in Flood Management in the Hawkesbury-Nepean Region, Australia

Authors: S. Masud, J. Merson, D. F. Robinson

Abstract:

The management of environmental resources requires the engagement of a range of stakeholders, including public/private agencies and different community groups, to implement sustainable conservation practices. A challenge that is often ignored is the analysis of the agencies involved and their power relations. One identified barrier is the difference in risk perceptions among the agencies involved, which leads to disjointed efforts in assessing and managing risks. As Wood et al. (2012) explain, it is important to have an integrated approach to risk management in which decision makers address stakeholder perspectives; this is critical for an effective risk management policy. This abstract is part of PhD research that looks into barriers to flood management under a changing climate and intends to identify bottlenecks that create maladaptation. Experiences are drawn from international practice in the UK and examined in the Australian context by exploring flood governance in a highly flood-prone region, the Hawkesbury-Nepean catchment, as a case study. Several aspects of governance and management are explored: (i) the complexities created by the way different agencies are involved in assessing flood risks; (ii) different perceptions of the acceptable flood risk level; (iii) perceptions of community engagement in defining the acceptable flood risk level; (iv) views on a holistic flood risk management approach; and (v) challenges of a centralised information system. The study concludes that the complexity of managing a large catchment is exacerbated by differences in how professionals perceive the problem. This has led to: (a) different standards for acceptable risks; (b) inconsistent attempts to set up a regional-scale flood management plan beyond jurisdictional boundaries; (c) the absence of a regional-scale agency licensed to share and update information; and (d) a lack of forums for dialogue with insurance companies to ensure an integrated approach to flood management. The research takes the Hawkesbury-Nepean catchment as a case example and draws on literary evidence from around the world. In addition, conclusions were extrapolated from eighteen semi-structured interviews with agencies involved in flood risk management in the Hawkesbury-Nepean catchment of NSW, Australia. The outcome of this research is a better understanding of the complexity of assessing risks against a rapidly changing climate, contributing to the development of effective risk communication strategies and thus enabling better management of floods and an increased level of support from insurance companies, real-estate agencies, state and regional risk managers, and the affected communities.

Keywords: adaptive governance, flood management, flood risk communication, stakeholder risk perceptions

Procedia PDF Downloads 256
24710 Research Trends in Using Virtual Reality for the Analysis and Treatment of Lower-Limb Musculoskeletal Injury of Athletes: A Literature Review

Authors: Hannah K. M. Tang, Muhammad Ateeq, Mark J. Lake, Badr Abdullah, Frederic A. Bezombes

Abstract:

There is little research applying virtual reality (VR) to the treatment of musculoskeletal injury in athletes, despite the prevalence of such injuries and their implications for physical and psychological health. Nevertheless, developments in wireless VR headsets better facilitate dynamic movement in VR environments (VREs), and more research is expected in this emerging field. This systematic review identified publications that used VR interventions for the analysis or treatment of lower-limb musculoskeletal injury in athletes. It established a search protocol and, through narrative discussion, identified existing trends. Database searches encompassed four term sets: 1) VR systems; 2) musculoskeletal injuries; 3) sporting population; 4) movement outcome analysis. Overall, 126 publications were identified through database searching, and twelve were included in the final analysis and discussion. Many of the studies were pilot or proof-of-concept work, and seven of the twelve publications were observational studies; however, these may provide preliminary data from which clinical trials will branch. Where specified, the focus of the literature was very narrow, with very similar population demographics and injuries. The trends in the literature emphasised the role of VR in attentional focus, the strategic manipulation of movement outcomes, and the transfer of skill to the real world. Causal inferences may have been undermined by flaws, as most studies were limited by the practicality of conducting a two-factor clinical VR-based study. In conclusion, by assessing the exploratory studies, and combining this with numerous developments, techniques, and tools, a novel application could be established to utilise VR with dynamic movement for the effective treatment of specific musculoskeletal injuries in athletes.

Keywords: athletes, lower-limb musculoskeletal injury, rehabilitation, return-to-sport, virtual reality

Procedia PDF Downloads 207
24709 Design of an Automated Deep Learning Recurrent Neural Networks System Integrated with IoT for Anomaly Detection in Residential Electric Vehicle Charging in Smart Cities

Authors: Wanchalerm Patanacharoenwong, Panaya Sudta, Prachya Bumrungkun

Abstract:

The paper focuses on the development of a system that combines Internet of Things (IoT) technologies and deep learning algorithms for anomaly detection in residential Electric Vehicle (EV) charging in smart cities. With the increasing number of EVs, ensuring efficient and reliable charging systems has become crucial. The aim of this research is to develop an integrated IoT and deep learning system for detecting anomalies in residential EV charging and enhancing EV load profiling and event detection in smart cities. IoT devices equipped with infrared cameras collect thermal images, household EV charging profiles are drawn from the database of the Thai utility, and these data are transmitted to a cloud database for comprehensive analysis. The methodology uses advanced deep learning techniques, namely Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) algorithms, together with feature-based Gaussian mixture models for EV load profiling and event detection. This combined approach helps identify unique power consumption patterns among EV owners and outperforms existing models in event detection accuracy. The findings demonstrate the effectiveness of the developed system in detecting anomalies and critical profiles in EV charging behavior: the system provides timely alarms to users regarding potential issues and categorizes the severity of detected problems based on a health index for each charging device. This research contributes to the field by showcasing the potential of integrating IoT and deep learning techniques in managing residential EV charging in smart cities, ensuring operational safety and efficiency while promoting sustainable energy management.
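
A compact sketch of the anomaly-detection idea follows: an LSTM autoencoder is trained to reconstruct normal charging-load windows, and windows with high reconstruction error are flagged. The architecture sizes, threshold, and synthetic sine-wave "load" are illustrative assumptions, not the paper's configuration:

```python
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    def __init__(self, n_features=1, hidden=32):
        super().__init__()
        self.enc = nn.LSTM(n_features, hidden, batch_first=True)
        self.dec = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x):                       # x: (batch, time, features)
        _, (h, _) = self.enc(x)
        # Repeat the final hidden state as the decoder input at every step.
        z = h[-1].unsqueeze(1).repeat(1, x.size(1), 1)
        y, _ = self.dec(z)
        return self.out(y)

model = LSTMAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
normal = torch.sin(torch.linspace(0, 50, 1000)).reshape(-1, 20, 1)  # toy load
for _ in range(50):                             # train on normal windows only
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(normal), normal)
    loss.backward()
    opt.step()

window = normal[:1] + 0.8 * torch.randn(1, 20, 1)   # corrupted charging window
err = nn.functional.mse_loss(model(window), window).item()
print("anomaly" if err > 0.1 else "normal", round(err, 3))
```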

Keywords: cloud computing framework, recurrent neural networks, long short-term memory, IoT, EV charging, smart grids

Procedia PDF Downloads 27
24708 Decision Support System for Examination Selection

Authors: Katejarinporn Chaiya, Jarumon Nookong, Nutthapat Kaewrattanapat

Abstract:

The purposes of this research were to develop a Decision Support System for Examination Selection and to measure users' satisfaction after using it. The paper presents the design of the information system, which draws on statistics from candidates' past examinations and applies item difficulty statistics when composing a test. In addition, performance appraisals from experts and user satisfaction were collected. The analysis showed that the experts' appraisal of the system as a whole was at a good level (mean 3.44, S.D. 0.55), and overall user satisfaction was also at a good level (mean 3.37, S.D. 0.42); it can therefore be concluded that the system is effective. The work was completed in accordance with the defined scope. The website for this project was developed in PHP, with MySQL 5.0.45 as the database.

Keywords: decision support system, examination, PHP, information systems

Procedia PDF Downloads 420
24707 Complex Technology of Virtual Reconstruction: The Case of Kazan Imperial University of XIX-Early XX Centuries

Authors: L. K. Karimova, K. I. Shariukova, A. A. Kirpichnikova, E. A. Razuvalova

Abstract:

This article deals with the technology of virtual reconstruction of Kazan Imperial University in the XIX to early XX centuries. The paper describes technologies for 3D visualization of high-resolution models of objects in the university space, the creation of a multi-agent system and an organized database of historical sources connected with these objects, and options for using immersion technologies in the virtual environment.

Keywords: 3D-reconstruction, multi-agent system, database, university space, virtual reconstruction, virtual heritage

Procedia PDF Downloads 239
24706 A Retrospective Study: Correlation between Enterococcus Infections and Bone Carcinoma Incidence

Authors: Sonia A. Stoica, Lexi Frankel, Amalia Ardeljan, Selena Rashid, Ali Yasback, Omar Rashid

Abstract:

Introduction: Enterococcus is a large genus of gram-positive, lactic acid cocci. They are common commensal organisms in the intestines of humans: E. faecalis (90–95%) and E. faecium (5–10%). Rarer infections can occur with other species, including E. casseliflavus, E. gallinarum, and E. raffinosus. The most common infections caused by Enterococcus include urinary tract infections, biliary tract infections, subacute endocarditis, diverticulitis, meningitis, septicemia, and spontaneous bacterial peritonitis. Treatment of sensitive strains includes ampicillin, penicillin, cephalosporins, or vancomycin, while treatment of resistant strains includes daptomycin, linezolid, tigecycline, or streptogramins. Enterococcus faecalis CECT7121 is an encouraging candidate probiotic strain. E. faecalis CECT7121 enhances and skews the cytokine profile toward the Th1 phenotype in situations such as vaccination, anti-tumoral immunity, and allergic reactions, and it enhances the secretion of high levels of IL-12, IL-6, TNF-alpha, and IL-10. Cytokines have previously been associated with the development of cancer. The intention of this study was therefore to evaluate the correlation between Enterococcus infections and the incidence of bone carcinoma. Methods: A retrospective cohort study (2010-2019) was conducted through a Health Insurance Portability and Accountability Act (HIPAA) compliant national database, using International Classification of Disease (ICD) 9th and 10th codes for bone carcinoma diagnosis in a previously Enterococcus-infected population. Patients were matched for age range and Charlson Comorbidity Index (CCI). Access to the database was granted by Holy Cross Health for academic research. The chi-squared test was used to assess statistical significance. Results: A total of 17,056 patients was obtained in the Enterococcus-infected group as well as in the control population (matched by age range and CCI score). Subsequent bone carcinoma development was seen at a rate of 1.07% (184) in the Enterococcus-infected group and 3.42% (584) in the control group, respectively. The difference was statistically significant (p < 2.2×10⁻¹⁶), Odds Ratio = 0.355 (95% CI 0.311-0.404). Treatment for Enterococcus infection was analyzed and controlled for in both the infected and non-infected populations: 78 of 6,624 (1.17%) patients with a prior Enterococcus infection treated with antibiotics were compared to 202 of 6,624 (3.04%) patients with no history of Enterococcus infection (control) who received antibiotic treatment, both groups subsequently developing bone carcinoma. The results remained statistically significant (p < 2.2×10⁻¹⁶), Odds Ratio = 0.456 (95% CI 0.396-0.525). Conclusion: This study shows a statistically significant correlation between Enterococcus infection and a decreased incidence of bone carcinoma. The immunologic response to Enterococcus infection may exert a protective effect against developing bone carcinoma. Further exploration is needed to identify the potential mechanism by which Enterococcus reduces bone carcinoma incidence.
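
The 2x2 analysis can be reproduced in outline from the counts reported above; note that this simple unadjusted calculation yields an odds ratio of about 0.31 rather than the reported matched estimate of 0.355, presumably because the paper's analysis accounts for the matched design:

```python
import numpy as np
from scipy.stats import chi2_contingency

cases_exp, n_exp = 184, 17056        # bone carcinoma after Enterococcus
cases_ctl, n_ctl = 584, 17056        # bone carcinoma in matched controls
table = np.array([[cases_exp, n_exp - cases_exp],
                  [cases_ctl, n_ctl - cases_ctl]])

chi2, p, _, _ = chi2_contingency(table)
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
se = np.sqrt((1 / table.astype(float)).sum())    # standard error of log(OR)
lo, hi = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se)
print(f"chi2={chi2:.1f}, p={p:.2e}, OR={odds_ratio:.3f} "
      f"(95% CI {lo:.3f}-{hi:.3f})")
```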

Keywords: anti-tumoral immunity, bone carcinoma, enterococcus, immunologic response

Procedia PDF Downloads 153
24705 Research Trends on Magnetic Graphene for Water Treatment: A Bibliometric Analysis

Authors: J. C. M. Santos, J. C. A. Sousa, A. J. Rubio, L. S. Soletti, F. Gasparotto, N. U. Yamaguchi

Abstract:

Magnetic graphene has received widespread attention for its capability in water and wastewater treatment, attracting many researchers to this field. A bibliometric analysis based on the Web of Science database was employed to analyze the global scientific output on magnetic graphene for water treatment to date (2012 to 2017), to improve the understanding of the research trends. The publication year, place of publication, institutes, funding agencies, journals, most-cited articles, distribution of outputs across thematic categories, and applications were analyzed. Three further aspects (pollutant type, treatment process, and composite composition) contributed to revealing the research trends. The most relevant aspects of the main technologies using magnetic graphene for water treatment are summarized in this paper. The results showed that research on magnetic graphene for water treatment is going through a period of decline, which might be related to saturation of the field and a lack of bibliometric studies. The results of the present work will thus help researchers establish future directions for further studies using magnetic graphene for water treatment.

Keywords: composite, graphene oxide, nanomaterials, scientometrics

Procedia PDF Downloads 221
24704 The Search for an Alternative to Tabarru` in Takaful Models

Authors: Abu Umar Faruq Ahmad, Muhammad Ayub

Abstract:

Tabarru` (unilateral gratuitous contribution) is thought to be the basic concept that distinguishes Takaful from conventional, non-Sharīʿah-compliant insurance. The Sharīʿah compliance of its current practice has been questioned on the premises that: a) it is a form of commutative contract; and b) it is akin to the commercial corporate structure of insurance companies, since Takaful operators follow the same marketing strategies, allocate to reserves, share the underwriting surplus one way or the other, provide loans to the Takaful funds, and consequently absorb the underwriting losses. Sharīʿah scholars are of the view that the relationship between participants in Takaful should take the form of a commitment to donate, under which a contributor commits to donate a sum of money for mutual help and cooperation on the condition that the balance, if any, is returned to him. With the aim of finding solutions to the above-mentioned concerns and other Sharīʿah-related issues, the study investigates whether Takaful companies are functioning in accordance with the Islamic principles of brotherhood, solidarity, and cooperative risk sharing. To that end, it discusses the cooperative model of Takaful to address current and future Sharīʿah-related and legal concerns. The study proposes an alternative model, operating on the basis of ta`awun (mutual cooperation), and considers it to best serve the objectives of Takaful.

Keywords: hibah, musharakah ta`awuniyyah, Tabarru`, Takaful

Procedia PDF Downloads 415
24703 Wireless Sensor Network to Help Low-Income Farmers Face Drought Impacts

Authors: Fantazi Walid, Ezzedine Tahar, Bargaoui Zoubeida

Abstract:

This research presents the main ideas for implementing an intelligent system composed of communicating wireless sensors that measure environmental data linked to drought indicators (such as air temperature, soil moisture, etc.). In addition, a spatio-temporal database communicating with a web mapping application is proposed for real-time monitoring, 24 hours a day, 7 days a week, allowing the time evolution of the drought parameters to be screened and extracted. The system thus helps detect areas affected by drought. Spatio-temporal conceptual models serve users who need to manage soil water content for irrigation, fertilization, or other activities aimed at increasing crop yield; they enable users to obtain a readable, easily apprehended diagram of the data. Combined with socio-economic information, the system helps identify the people impacted by the phenomenon and the corresponding severity, especially since this information is accessible to farmers and stakeholders themselves. The study will be applied in the Siliana watershed, northern Tunisia.

Keywords: WSN, spatio-temporal database, GIS, web mapping, drought indicator

Procedia PDF Downloads 461
24702 Calculation of Methane Emissions from Wetlands in Slovakia via IPCC Methodology

Authors: Jozef Mindas, Jana Skvareninova

Abstract:

Wetlands are a major natural source of methane emissions, but they also represent important biodiversity reservoirs in the landscape. About 26 thousand hectares of wetlands in Slovakia have been identified via the wetlands monitoring program. The resulting database of wetlands in Slovakia makes it possible to analyze several ecological processes, including the estimation of methane emissions, and based on this information, the first estimate of methane emissions from wetlands in Slovakia has been made. The IPCC methodology (Tier 1 approach) was used, with proposed emission factors for the ice-free period derived from climatic data. The highest methane emissions, of nearly 550 Gg, are associated with the category of fens; almost 11 Gg of methane is emitted from bogs, and emissions from flooded lands represent less than 8 Gg.
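
The Tier 1 arithmetic has the form: emissions = area × daily emission factor × number of ice-free days, summed per wetland category. A worked sketch follows; the areas, emission factors, and day count are placeholders, not the study's values:

```python
# 1 Gg = 1e6 kg, so kilograms of CH4 are divided by 1e6 to report gigagrams.
AREAS_HA = {"fens": 20000, "bogs": 4000, "flooded lands": 2000}
EF_KG_PER_HA_DAY = {"fens": 1.5, "bogs": 0.4, "flooded lands": 0.6}
ICE_FREE_DAYS = 220        # in the study, derived from climatic data

for cat, area in AREAS_HA.items():
    kg_ch4 = area * EF_KG_PER_HA_DAY[cat] * ICE_FREE_DAYS
    print(f"{cat}: {kg_ch4 / 1e6:.2f} Gg CH4 per year")
```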

Keywords: bogs, methane emissions, Slovakia, wetlands

Procedia PDF Downloads 257
24701 Digital Development of Cultural Heritage: Construction of Traditional Chinese Pattern Database

Authors: Shaojian Li

Abstract:

Traditional Chinese patterns, as an integral part of Chinese culture, possess unique historical, cultural, and artistic value. However, with the passage of time and societal change, many of these traditional patterns are at risk of being lost, damaged, or forgotten. To undertake their digital preservation and protection, this paper collects and organizes images of traditional Chinese patterns and provides exhaustive, comprehensive semantic annotations, creating a resource library of traditional Chinese pattern images that supports the digital preservation and application of these patterns.

Keywords: digitization of cultural heritage, traditional Chinese patterns, digital humanities, database construction

Procedia PDF Downloads 25
24700 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria

Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov

Abstract:

This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.
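
The CIT step can be sketched as a stratified chi-square test: X and Y are tested for independence within each stratum of the conditioning variable Z, and the per-stratum statistics and degrees of freedom are pooled. The feature names and simulated data below are invented stand-ins for the Benpoly admission database:

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2, chi2_contingency

def ci_test(df, x, y, z):
    stat, dof = 0.0, 0
    for _, g in df.groupby(z):                 # test within each stratum of Z
        tab = pd.crosstab(g[x], g[y])
        if tab.shape[0] > 1 and tab.shape[1] > 1:
            s, _, d, _ = chi2_contingency(tab)
            stat, dof = stat + s, dof + d
    return chi2.sf(stat, dof)                  # small p -> reject X indep Y | Z

rng = np.random.default_rng(0)
df = pd.DataFrame({"mode_of_entry": rng.integers(0, 2, 800),
                   "dept": rng.integers(0, 3, 800)})
df["admitted"] = (rng.random(800) < 0.3 + 0.3 * df["mode_of_entry"]).astype(int)
print("p =", ci_test(df, "mode_of_entry", "admitted", "dept"))   # expect small
```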

Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model

Procedia PDF Downloads 23
24699 Prioritization in Modern Portfolio Management - An Action Design Research Approach to Method Development for Scaled Agility

Authors: Jan-Philipp Schiele, Karsten Schlinkmeier

Abstract:

Allocation of scarce resources is a core process of traditional project portfolio management. However, with the popularity of agile methodology, established concepts and methods of portfolio management are reaching their limits and need to be adapted. Consequently, the question arises of how the process of resource allocation can be managed appropriately in scaled agile environments. The prevailing framework SAFe offers Weighted Shortest Job First (WSJF) as a prioritization technique, but established companies are still looking for methodical adaptations to apply WSJF for prioritization in portfolios in a more goal-oriented way, aligned with their needs in practice. In this paper, the relevant problem of prioritization in portfolios is conceptualized from the perspective of coordination and the related mechanisms that support resource allocation. Further, an Action Design Research (ADR) project with case studies in a finance company is outlined, to develop a practically applicable yet scientifically sound prioritization method based on coordination theory. The ADR project will be flanked by consortium research with various practitioners from the financial and insurance industry. Preliminary design requirements indicate that the use of a feedback loop leads to better team- and executive-level coordination in the prioritization process.
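
For reference, SAFe defines WSJF as the cost of delay (user/business value + time criticality + risk reduction/opportunity enablement) divided by job size, with the highest score scheduled first. A minimal sketch, with invented epic names and scores:

```python
def wsjf(value, criticality, risk_opportunity, job_size):
    # Cost of delay divided by job size, per the SAFe definition.
    return (value + criticality + risk_opportunity) / job_size

backlog = [   # (epic, value, time criticality, risk/opportunity, job size)
    ("claims portal revamp", 8, 5, 3, 8),
    ("fraud scoring model",  5, 8, 8, 5),
    ("policy data cleanup",  3, 2, 5, 2),
]
ranked = sorted(backlog, key=lambda e: wsjf(*e[1:]), reverse=True)
for epic, *scores in ranked:
    print(f"{epic}: WSJF = {wsjf(*scores):.2f}")
```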

Keywords: scaled agility, portfolio management, prioritization, business-IT alignment

Procedia PDF Downloads 167