Search results for: algorithm integration
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5930


1190 An Electrocardiography Deep Learning Model to Detect Atrial Fibrillation on Clinical Application

Authors: Jui-Chien Hsieh

Abstract:

Background: 12-lead electrocardiography (ECG) is one of the most frequently used tools in clinical practice to detect atrial fibrillation (AF), which can degenerate into life-threatening stroke. In this study, AF detection by a clinically used 12-lead ECG device achieved only a 0.73~0.77 positive predictive value (PPV). Objective: There is great demand for a new algorithm to improve the precision of AF detection using 12-lead ECG. Drawing on progress in artificial intelligence (AI), we developed an ECG deep learning model that recognizes AF patterns and reduces false-positive errors. Methods: (1) 570 12-lead ECG reports that the ECG device's computer interpretation labeled as AF were collected as the training dataset. The reports were interpreted by two senior cardiologists, who confirmed that the precision of AF detection by the ECG device was 0.73. (2) 88 12-lead ECG reports whose computer interpretation was AF were used as the test dataset. Cardiologists confirmed that 68 of the 88 reports were AF and the others were not; the precision of AF detection by the ECG device was about 0.77. (3) A parallel four-layer one-dimensional convolutional neural network (CNN) was developed to identify AF from limb-lead and chest-lead ECGs. Results: On the 88 test samples, this model outperformed the traditional computer interpretation of the ECG device, with 0.94 PPV, 0.98 sensitivity, and 0.80 specificity. Conclusions: Compared with the clinical ECG device, this AI ECG model improves the precision of AF detection from 0.77 to 0.94 and can have an impact on clinical applications.
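The parallel two-branch idea, separate convolutional paths for limb leads and chest leads merged before classification, can be sketched at the numpy level. The layer count, kernel sizes, and random weights below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(signal, kernels):
    """Valid-mode 1-D convolution of a (leads, samples) signal with a
    bank of (n_kernels, leads, width) kernels, followed by ReLU."""
    n_k, n_leads, width = kernels.shape
    out_len = signal.shape[1] - width + 1
    out = np.empty((n_k, out_len))
    for k in range(n_k):
        for t in range(out_len):
            out[k, t] = np.sum(kernels[k] * signal[:, t:t + width])
    return np.maximum(out, 0.0)  # ReLU

# Hypothetical shapes: 6 limb leads and 6 chest leads, 500 samples each.
limb = rng.standard_normal((6, 500))
chest = rng.standard_normal((6, 500))

# Two parallel branches, each with its own kernel bank.
limb_feat = conv1d(limb, rng.standard_normal((8, 6, 9)) * 0.1)
chest_feat = conv1d(chest, rng.standard_normal((8, 6, 9)) * 0.1)

# Global average pooling per branch, then concatenate and classify.
features = np.concatenate([limb_feat.mean(axis=1), chest_feat.mean(axis=1)])
w, b = rng.standard_normal(16) * 0.1, 0.0
p_af = 1.0 / (1.0 + np.exp(-(features @ w + b)))  # probability of AF
print(features.shape, float(p_af))
```

A trained model would learn the kernel banks and classifier weights from the 570-report training set; the sketch only shows the data flow.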

Keywords: 12-lead ECG, atrial fibrillation, deep learning, convolutional neural network

Procedia PDF Downloads 104
1189 Integrating a Universal Forensic DNA Database: Anticipated Deterrent Effects

Authors: Karen Fang

Abstract:

Investigative genetic genealogy has attracted much interest in both the field of ethics and the public eye due to its global application in criminal cases. Arguments have been made regarding privacy and informed consent, especially with law enforcement using consumer genetic testing results to convict individuals. In the case of public interest, DNA databases have strong potential to significantly reduce crime, which in turn leads to safer communities and better futures. With the advancement of genetic technologies, the integration of a universal forensic DNA database for violent crimes, crimes against children, and missing-person cases is expected to deter crime while protecting privacy. Rather than collecting whole genomes from the whole population, STR profiles can be used to identify unrelated individuals without compromising personal information such as physical appearance, disease risk, and geographical origin, and they additionally reduce cost and storage space. STR DNA profiling is already used in forensic science, and extending it benefits several areas, including reduced recidivism, improved criminal court case turnaround time, and just punishment. Furthermore, adding individuals to the database as early as possible prevents young offenders and first-time offenders from participating in criminal activity. It is important to highlight that DNA databases should be inclusive and tightly governed, and that misconceptions about the use of DNA arising from crime television series and other media should be addressed. Nonetheless, deterrent effects have been observed in countries like the US and Denmark, whose DNA databases consist of serious violent offenders: fewer crimes were reported, and fewer people were convicted of those crimes, a favorable outcome that not even the death penalty could provide. Currently, there is no better alternative than a universal forensic DNA database made up of STR profiles. It can open doors for investigative genetic genealogy and foster better communities. Expanding the appropriate use of DNA databases is ethically acceptable and positively impacts the public.

Keywords: bioethics, deterrent effects, DNA database, investigative genetic genealogy, privacy, public interest

Procedia PDF Downloads 139
1188 Energy Management Method in DC Microgrid Based on the Equivalent Hydrogen Consumption Minimum Strategy

Authors: Ying Han, Weirong Chen, Qi Li

Abstract:

An energy management method based on the equivalent hydrogen consumption minimum strategy is proposed in this paper for a direct-current (DC) microgrid consisting of photovoltaic cells, fuel cells, energy storage devices, converters, and DC loads. Rational allocation between the fuel cells and battery devices is achieved by adopting the equivalent minimum hydrogen consumption strategy while making full use of the power generated by the photovoltaic cells. Considering the balance of the battery's state of charge (SOC), the optimal battery power under different SOC conditions is obtained and the reference output power of the fuel cell is calculated. A droop control method based on a time-varying droop coefficient is then proposed to realize automatic charge and discharge control of the battery, balance the system power, and maintain the bus voltage. The proposed control strategy is verified on an RT-LAB hardware-in-the-loop simulation platform. The simulation results show that the designed control algorithm can realize rational allocation of DC microgrid energy and improve the stability of the system.
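The time-varying droop idea can be illustrated with a minimal sketch: the droop coefficient stiffens near the SOC limits so the battery resists further charge or discharge, and softens in the middle of the SOC range. The coefficient shape and all numeric bounds below are assumptions for illustration, not the paper's actual parameters:

```python
def droop_coefficient(soc, k_min=0.5, k_max=5.0, soc_low=0.2, soc_high=0.8):
    """Time-varying droop coefficient: large (stiff) near the SOC limits
    so the battery resists further charge/discharge, small (soft) in the
    middle of the SOC range."""
    if soc <= soc_low or soc >= soc_high:
        return k_max
    mid = 0.5 * (soc_low + soc_high)
    half = 0.5 * (soc_high - soc_low)
    # Quadratic bowl: k_min at mid-SOC, rising toward the limits.
    return k_min + (k_max - k_min) * ((soc - mid) / half) ** 2

def battery_power(v_bus, v_ref, soc):
    """Droop law: battery power is proportional to the bus-voltage error,
    scaled by the SOC-dependent droop coefficient (>0 means discharge)."""
    return (v_ref - v_bus) / droop_coefficient(soc)

# Bus voltage sags below a 400 V reference: the battery discharges,
# but far less aggressively when the SOC is already low.
print(battery_power(395.0, 400.0, soc=0.5))   # mid SOC, soft droop
print(battery_power(395.0, 400.0, soc=0.15))  # low SOC, stiff droop
```

In the paper's full scheme, the fuel cell reference power from the equivalent hydrogen consumption optimization would sit alongside this droop law; the sketch covers only the battery side.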

Keywords: DC microgrid, equivalent minimum hydrogen consumption strategy, energy management, time-varying droop coefficient, droop control

Procedia PDF Downloads 291
1187 Two-stage Robust Optimization for Collaborative Distribution Network Design Under Uncertainty

Authors: Reza Alikhani

Abstract:

This research focuses on the establishment of horizontal cooperation among companies to enhance their operational efficiency and competitiveness. The study proposes an approach to horizontal collaboration, called coalition configuration, in which partnering companies share distribution centers in a network design problem. The paper investigates which coalition should be formed at each distribution center to minimize the total cost of the network. Moreover, potential uncertainties, such as operational and disruption risks, are considered during the collaborative design phase. To address this problem, a two-stage robust optimization model for collaborative distribution network design under surging demand and facility disruptions is presented, along with a column-and-constraint generation algorithm that obtains exact solutions tailored to the proposed formulation. Extensive numerical experiments are conducted to analyze the solutions obtained by the model in various scenarios, including decisions ranging from fully centralized to fully decentralized settings, collaborative versus non-collaborative approaches, and different uncertainty budgets. The results show that the coalition formation mechanism yields solutions whose savings are competitive with those of the grand coalition. The research also highlights that collaboration increases network flexibility and resilience while reducing the costs associated with demand and capacity uncertainties.
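A column-and-constraint generation loop of the kind described alternates between a master problem over the scenarios generated so far and a subproblem that finds the worst-case scenario for the current design, tightening lower and upper bounds until they meet. The toy capacity-sizing instance below (costs, demands, penalty) is invented for illustration and is far simpler than the paper's network design model:

```python
def recourse(x, d, penalty=5.0):
    """Second-stage cost: penalty for demand d not covered by capacity x."""
    return penalty * max(d - x, 0.0)

def master(scenarios, demands, c=2.0):
    """Master problem: pick capacity x minimizing build cost plus the
    worst recourse cost over the scenarios generated so far. For this
    piecewise-linear toy, an optimum lies at 0 or one of the demands."""
    best = None
    for x in [0.0] + demands:
        cost = c * x + max((recourse(x, d) for d in scenarios), default=0.0)
        if best is None or cost < best[1]:
            best = (x, cost)
    return best

demands = [80.0, 100.0, 120.0]        # toy uncertainty set
scenarios, ub, lb = [], float("inf"), -float("inf")
while ub - lb > 1e-6:
    x, lb = master(scenarios, demands)                   # lower bound
    worst = max(demands, key=lambda d: recourse(x, d))   # subproblem
    ub = min(ub, 2.0 * x + recourse(x, worst))           # upper bound
    if worst not in scenarios:
        scenarios.append(worst)       # add worst case and its constraints
print(x, lb, ub)
```

In the real formulation the master and subproblem are mixed-integer/linear programs solved by an optimizer; the loop structure, bounds, and scenario accumulation are the same.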

Keywords: logistics, warehouse sharing, robust facility location, collaboration for resilience

Procedia PDF Downloads 53
1186 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance

Authors: Loai AbdAllah, Mahmoud Kaiyal

Abstract:

Missing values in real-world datasets are a common problem. Many algorithms have been developed to deal with this problem; most of them replace the missing values with a fixed value computed from the observed values. In our work, we use a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, while if one of them is missing, the distance is computed from the distribution of the known values for the coordinate containing the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps improve the prevention of chronic diseases such as diabetes and cancer. For Wikaya's recommendation system to work, distances between users need to be measured; since the collected data contain missing values, a distance function for incomplete user profiles is required. To evaluate how accurately the proposed distance function reflects the actual similarity between objects when some of them contain missing values, we integrated it within the framework of a k-nearest-neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using our proposed distance function outperforms kNN using other existing methods.
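One plausible reading of the proposed idea, where known-known coordinates use a scaled difference and a coordinate missing on one side is scored against that column's observed distribution, can be sketched as follows. This is an illustrative simplification, not the paper's exact Mahalanobis/Bhattacharyya formulation:

```python
import numpy as np

def column_stats(X):
    """Per-column observed values (NaNs dropped) for the distance below."""
    return [X[~np.isnan(X[:, j]), j] for j in range(X.shape[1])]

def mv_distance(a, b, cols):
    """Distance that distinguishes known from unknown values: known pairs
    use a scaled absolute difference; a coordinate missing in one object
    is scored by the expected difference against the observed distribution
    of that coordinate."""
    total = 0.0
    for j, col in enumerate(cols):
        s = col.std() + 1e-12
        if not np.isnan(a[j]) and not np.isnan(b[j]):
            total += abs(a[j] - b[j]) / s
        elif np.isnan(a[j]) and np.isnan(b[j]):
            total += np.abs(col[:, None] - col[None, :]).mean() / s
        else:
            known = b[j] if np.isnan(a[j]) else a[j]
            total += np.abs(col - known).mean() / s
    return total

def knn_predict(X, y, q, k=3):
    """kNN majority vote using the missing-value-aware distance."""
    cols = column_stats(X)
    d = np.array([mv_distance(x, q, cols) for x in X])
    top = y[np.argsort(d)[:k]]
    return np.bincount(top).argmax()

X = np.array([[1.0, 2.0], [1.1, 2.1], [8.0, 9.0], [8.2, np.nan]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([8.1, 8.9])))
```

Plugging such a distance into kNN requires no other change to the classifier, which is why kNN is a clean testbed for distance functions.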

Keywords: missing values, incomplete data, distance, incomplete diabetes data

Procedia PDF Downloads 204
1185 Between Buddha and Tsar: Kalmyk Buddhist Sangha in Late Russian Empire

Authors: Elzyata Kuberlinova

Abstract:

This study explores how the Kalmyk Buddhist sangha responded to the Russian empire's administrative integration and how Buddhist clerical institutions were shaped in the process of interaction with representatives of the predominantly Orthodox state. The eighteenth- and nineteenth-century Russian imperial regime adhered to a religion-centred framework to govern its diverse subjects. Within this framework, any form of religious authority was considered a useful tool in the imperial quest for legibility. As such, rather than imposing religious homogeneity, the Russian administration engineered a framework of religious toleration and integrated non-Orthodox clerical institutions into the empire's administration. In its attempt to govern the large body of Kalmyk Buddhist sangha, the Russian government had to incorporate the sangha into the imperial institutional establishment. To this end, the Russian government founded the Lamaist Spiritual Governing Board in 1834, which became part of the civil administration, where Kalmyk Buddhist affairs were managed under the supervision of the Russian secular authorities. In 1847 the Lamaist Spiritual Governing Board was abolished and Buddhist religious authority was transferred to the Lama of the Kalmyk people. From 1847 until the end of the empire in 1917, the Lama was the manager and intermediary figure between the Russian authorities and the Kalmyks where religious affairs were concerned. Substantial evidence collected in archives in Elista, Astrakhan, Stavropol and St. Petersburg shows that, despite being on the government's payroll, first the Lamaist Spiritual Governing Board and later the Lama did not always serve the interests of the state or comply with the Russian authorities' orders. Although incorporated into the state administrative system, the Lama often found ways to manoeuvre the web of the Russian imperial bureaucracy in order to achieve his own goals. The Lama often used 'everyday forms of resistance', such as feigned misinterpretation, evasion, false compliance, feigned ignorance, and sabotage, to resist without directly confronting or challenging state orders.

Keywords: Buddhist Sangha, intermediary, Kalmyks, Lama, legibility, resistance, reform, Russian empire

Procedia PDF Downloads 206
1184 Parkinson’s Disease Detection Analysis through Machine Learning Approaches

Authors: Muhtasim Shafi Kader, Fizar Ahmed, Annesha Acharjee

Abstract:

Machine learning and data mining are crucial in health care, including medical informatics and disease detection. Machine learning approaches are now being utilized to improve detection of a variety of critical health issues, including diabetes, neural cell tumors, COVID-19, and so on. In Bangladesh, Parkinson's disease primarily affects senior citizens. Its symptoms are progressive and worsen with time: patients have trouble walking and communicating as the condition advances, and can also experience psychological and social changes, sleep problems, depression, memory loss, and fatigue. Parkinson's disease occurs in both men and women, although men are affected somewhat more often than women. In this research, we aim to identify the most accurate ML algorithm for detecting the disease from a publicly available dataset by comparing the following machine learning classifiers. Nine ML classifiers are used in this comparative study: Naive Bayes, Adaptive Boosting, Bagging Classifier, Decision Tree Classifier, Random Forest Classifier, XGB Classifier, K Nearest Neighbor Classifier, Support Vector Machine Classifier, and Gradient Boosting Classifier.
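A comparison of this kind can be set up with scikit-learn. The sketch below uses synthetic stand-in data rather than the Parkinson's dataset, and it omits XGBoost (the paper's ninth classifier) since that requires the external xgboost package:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier, GradientBoostingClassifier)
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Stand-in data; the paper uses a Parkinson's dataset instead.
X, y = make_classification(n_samples=400, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)

models = {
    "Naive Bayes": GaussianNB(),
    "AdaBoost": AdaBoostClassifier(random_state=42),
    "Bagging": BaggingClassifier(random_state=42),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Random Forest": RandomForestClassifier(random_state=42),
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(random_state=42),
    "Gradient Boosting": GradientBoostingClassifier(random_state=42),
}

# Fit each model and rank by held-out accuracy.
scores = {name: accuracy_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:18s} {acc:.3f}")
```

The same loop works unchanged on a real feature table once it is split into X and y.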

Keywords: naive bayes, adaptive boosting, bagging classifier, decision tree classifier, random forest classifier, XGB classifier, k nearest neighbor classifier, support vector classifier, gradient boosting classifier

Procedia PDF Downloads 119
1183 Embodied Spirituality in Gestalt Therapy

Authors: Silvia Alaimo

Abstract:

This lecture brings to our attention the theme of spirituality within Gestalt therapy's theoretical and clinical perspectives, closely connected to the experiences of 'fertile emptiness' and 'creative indifference'. The necessary premise is overcoming traditional Western culture's philosophical and religious misunderstandings, such as the dichotomy between spirituality and practical, material daily life, as well as the widespread secular perspective of classical psychology. Even fullness and emptiness have traditionally been associated with the concepts of being and not being. "There is only one way through which we can contact the deepest layers of our existence, rejuvenate our thinking and reach intuition (the harmony of thought and being): inner silence" (Perls). The "fertile void", therefore, does not mean empty in itself, but rather a useful condition of every creative and responsible act, making room for a deeper dimension close to spirituality. Spirituality concerns questions about the meaning of existence, which lies beyond the concrete and literal dimension, seeking the essence of things and the value of personal experience. Looking at the fundamentals of Gestalt epistemology, phenomenology, aesthetics, and the relationship, we can reach the heart of a therapeutic work that takes on spiritual contours, based on an embodied (incarnate) dimension, through relational aesthetic knowledge (Spagnuolo Lobb), deep contact with the other, and the role of compassion and responsibility as criteria for recognizing the patient (Orange, 2013), rooted in the body. The aesthetic dimension, like the spiritual dimension with which it is often associated, is a subtle dimension: it is the dimension of the essence of things, of their "soul." In clinical practice, it implies that the relationship between therapist and patient is "in the absence of judgment," also called the "zero point of creative indifference," expressed by a 'therapeutic mentality'. It consists in following with interest and authentic curiosity where the patient wants to go and supporting him in his intentionality of contact. It is a condition of pure and simple awareness, of full acceptance of "what is," a moment of detachment from one's own life in which one does not take oneself too seriously, a starting point for finding a center of balance and integration that leads to the creative act, to growth, and, as Perls would say, to the excitement and adventure of living.

Keywords: spirituality, bodily, embodied aesthetics, phenomenology, relationship

Procedia PDF Downloads 129
1182 Healthcare-SignNet: Advanced Video Classification for Medical Sign Language Recognition Using CNN and RNN Models

Authors: Chithra A. V., Somoshree Datta, Sandeep Nithyanandan

Abstract:

Sign Language Recognition (SLR) is the process of interpreting and translating sign language into spoken or written language using technological systems. It involves recognizing the hand gestures, facial expressions, and body movements that make up sign language communication. The primary goal of SLR is to facilitate communication between hearing- and speech-impaired communities and those who do not understand sign language. Due to increased awareness and greater recognition of the rights and needs of the hearing- and speech-impaired community, sign language recognition has gained significant importance over the past 10 years. Technological advancements in Artificial Intelligence and Machine Learning have made it more practical and feasible to create accurate SLR systems. This paper presents a distinct approach to SLR by framing it as a video classification problem using Deep Learning (DL), combining Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs). This research targets the integration of sign language recognition into healthcare settings, aiming to improve communication between medical professionals and patients with hearing impairments. The spatial features of each video frame are extracted using a CNN, which captures essential elements such as hand shapes, movements, and facial expressions. These features are then fed into an RNN that learns the temporal dependencies and patterns inherent in sign language sequences. The INCLUDE dataset has been enhanced with additional videos from the healthcare domain, and the model is evaluated on it. Our model achieves 91% accuracy, representing state-of-the-art performance in this domain. The results highlight the effectiveness of treating SLR as a video classification task with the CNN-RNN architecture. This approach not only improves recognition accuracy but also offers a scalable solution for real-time SLR applications, significantly advancing the field of accessible communication technologies.
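The two-stage pipeline (per-frame spatial features from a CNN, then an RNN over the frame sequence) can be sketched in numpy. The frame size, kernel bank, hidden width, class count, and random weights below are illustrative assumptions, not the paper's trained model:

```python
import numpy as np

rng = np.random.default_rng(1)

def frame_features(frame, kernels):
    """Toy 'CNN' stage: correlate each 3x3 kernel with the frame, apply
    ReLU, and global-average-pool to one feature per kernel."""
    H, W = frame.shape
    feats = []
    for k in kernels:
        resp = np.zeros((H - 2, W - 2))
        for i in range(H - 2):
            for j in range(W - 2):
                resp[i, j] = np.sum(frame[i:i+3, j:j+3] * k)
        feats.append(np.maximum(resp, 0).mean())
    return np.array(feats)

def rnn_classify(video, kernels, Wx, Wh, Wo):
    """Simple RNN over per-frame CNN features; the last hidden state is
    projected to class scores and normalized with softmax."""
    h = np.zeros(Wh.shape[0])
    for frame in video:
        x = frame_features(frame, kernels)
        h = np.tanh(Wx @ x + Wh @ h)
    scores = Wo @ h
    e = np.exp(scores - scores.max())
    return e / e.sum()

# Hypothetical sizes: 8 frames of 16x16 grayscale, 4 kernels,
# 6 hidden units, 5 sign classes; all weights random for illustration.
video = rng.standard_normal((8, 16, 16))
kernels = rng.standard_normal((4, 3, 3)) * 0.2
Wx, Wh, Wo = (rng.standard_normal(s) * 0.3 for s in [(6, 4), (6, 6), (5, 6)])
probs = rnn_classify(video, kernels, Wx, Wh, Wo)
print(probs)
```

A production system would use a deep pretrained CNN and an LSTM/GRU trained end to end; the sketch only shows how spatial and temporal stages compose.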

Keywords: sign language recognition, deep learning, convolution neural network, recurrent neural network

Procedia PDF Downloads 7
1181 Energy Trading for Cooperative Microgrids with Renewable Energy Resources

Authors: Ziaullah, Shah Wahab Ali

Abstract:

Microgrids equipped with heterogeneous energy resources embody the idea of small-scale distributed energy management (DEM). DEM helps in minimizing transmission and operating costs, managing power, and meeting peak load demands. Microgrids are collections of small, independently controllable power-generating units and renewable energy resources. Microgrids also enable active customer participation by giving customers access to real-time information and control. The capability of fast restoration after faults, the integration of renewable energy resources, and Information and Communication Technologies (ICT) make the microgrid an ideal system for distributed power systems. Microgrids can include a bank of energy storage devices. The energy management system of a microgrid can perform real-time energy forecasting of renewable resources, energy storage elements, and controllable loads to produce proper short-term scheduling that minimizes total operating costs. We present a review of existing microgrid optimization objectives, constraints, solution approaches, and tools used for energy management. Cost-benefit analysis of microgrids reveals that cooperation among different microgrids can play a vital role in reducing imported-energy cost and improving system stability. Cooperative microgrid energy trading is an approach to distributed energy resources that gives local energy demands more control over the optimization of power resources and their use. Cooperation among different microgrids also raises interconnectivity and power-trading issues, and the literature shows that cooperative microgrid energy trading remains an open area of research. In this paper, we propose and formulate an efficient energy management and trading module for interconnected microgrids. It is believed that this research will open new directions for future energy trading in cooperative, interconnected microgrids.

Keywords: distributed energy management, information and communication technologies, microgrid, energy management

Procedia PDF Downloads 362
1180 Global Navigation Satellite System and Precise Point Positioning as Remote Sensing Tools for Monitoring Tropospheric Water Vapor

Authors: Panupong Makvichian

Abstract:

The Global Navigation Satellite System (GNSS) is nowadays a common technology that improves navigation in daily life. Additionally, GNSS is increasingly employed as an accurate atmospheric sensor. Meteorology is a practical application of GNSS that goes unnoticed in the background of people's lives. GNSS Precise Point Positioning (PPP) is a positioning method that requires data from a single dual-frequency receiver together with precise information about satellite positions and satellite clocks; in addition, careful attention to mitigating various error sources is required. All of these data are combined in a sophisticated mathematical algorithm. This research demonstrates how GNSS and the PPP method can provide high-precision estimates, such as 3D positions or zenith tropospheric delays (ZTDs). ZTDs combined with pressure and temperature information allow us to estimate the water vapor in the atmosphere as precipitable water vapor (PWV). If the process is replicated for a network of GNSS sensors, we can create thematic maps that allow water content information to be extracted at any location within the network area. All of the above is possible thanks to advances in GNSS data processing. Therefore, we are able to use GNSS data for climatic trend analysis and to acquire further knowledge about the atmospheric water content.
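The ZTD-to-PWV step can be sketched with the widely used Saastamoinen hydrostatic model and the Bevis mean-temperature approximation. The input values are typical mid-latitude numbers chosen for illustration, not results from this study:

```python
import math

def zhd_saastamoinen(pressure_hpa, lat_rad, height_km):
    """Zenith hydrostatic delay (m) from surface pressure, Saastamoinen."""
    return 0.0022768 * pressure_hpa / (
        1 - 0.00266 * math.cos(2 * lat_rad) - 0.00028 * height_km)

def pwv_from_ztd(ztd_m, pressure_hpa, temp_k, lat_rad, height_km):
    """Convert a PPP-estimated ZTD to precipitable water vapor (mm)."""
    zwd = ztd_m - zhd_saastamoinen(pressure_hpa, lat_rad, height_km)
    tm = 70.2 + 0.72 * temp_k               # Bevis mean temperature (K)
    k2p, k3 = 0.221, 3739.0                 # refractivity constants, K/Pa and K^2/Pa
    pi = 1.0e6 / (1000.0 * 461.5 * (k3 / tm + k2p))  # dimensionless factor (~0.15)
    return pi * zwd * 1000.0                # m -> mm

# Typical mid-latitude case: ZTD ~2.4 m, 1013 hPa, 288 K, 45 deg, 100 m.
pwv = pwv_from_ztd(2.40, 1013.0, 288.0, math.radians(45.0), 0.1)
print(round(pwv, 1))
```

Removing the hydrostatic part leaves the wet delay, and the conversion factor of roughly 0.15 turns each millimeter of wet delay into about 0.15 mm of precipitable water.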

Keywords: GNSS, precise point positioning, Zenith tropospheric delays, precipitable water vapor

Procedia PDF Downloads 184
1179 Teaching English in Low Resource-Environments: Problems and Prospects

Authors: Gift Chidi-Onwuta, Iwe Nkem Nkechinyere, Chikamadu Christabelle Chinyere

Abstract:

The teaching of English is a resource-driven activity that requires resource-rich classroom settings for the delivery of effective lessons and the acquisition of the interpersonal skills needed for integration in a target-language environment. However, throughout the world, English is often taught in low-resource classrooms. This paper aims to reveal the common problems associated with teaching English in low-resource environments and the prospects for teachers who find themselves in such undefined teaching settings. A self-structured, validated questionnaire in closed-ended, open-question, and scaling formats was administered to teachers across five countries: Nigeria, Cameroon, Iraq, Turkey, and Sudan. The study adopts situational language teaching theory (SLTT), which emphasizes a performance-improvement imperative. We incline to this model because it maintains that learning must be fun and enjoyable, like playing a favorite sport, just as in real life; since teaching resources make learning engaging, we found this model apt for the current study. Teachers' perceptions of the accessibility and functionality of teaching material resources, the nature of teaching outcomes in resource-poor environments, their levels of involvement in improvisation, and the prospects associated with resource limitations were sourced. Data were analysed using percentages and presented in frequency tables. Results showed that a great number of teachers across these nations do not have access to sufficient productive resource materials that can aid effective English language teaching. Teaching outcomes, from the findings, are affected by low material resources; however, results show certain advantages to teaching English with limited resources: flexibility and autonomy with students, and creativity and innovation among teachers. Results further revealed group work, storytelling, critical thinking strategies, flex, cardboards and flashcards, dictation, and dramatization as common teaching strategies and materials adopted by teachers to overcome low-resource-related challenges in classrooms.

Keywords: teaching materials, low-resource environments, English language teaching, situational language theory

Procedia PDF Downloads 115
1178 Do Interventions for Increasing Minorities' Access to Higher Education Work? The Case of Ethiopians in Israel

Authors: F. Nasser-Abu Alhija

Abstract:

In many countries, much effort and many resources are devoted to empowering minorities and integrating them within the mainstream population. Major ventures on this route are crafted in higher education institutions, where different outreach programs and methods, such as lenient entry requirements, monetary incentives, learning-skills workshops, tutoring, and mentoring, are utilized. Although there is some information regarding these programs, their effectiveness still needs to be thoroughly examined. The Ethiopian community in Israel is one of the minority groups that has been targeted by sponsoring foundations and higher education institutions with the aim of easing the access, persistence, and success of its young people in higher education and later in the job market. The evaluation study we propose to present focuses on the implementation of a program designed for this purpose. This program offers relevant candidates for study at a prestigious university a variety of generous incentives that include tuition, a living allowance, tutoring, mentoring, skills and empowerment workshops, and cultural meetings. Ten students were selected for the program and started their studies in different subject areas three and a half years ago. A longitudinal evaluation has been conducted since the implementation of the program. Data were collected from different sources: participating students, the program coordinator, mentors, tutors, program documents, and university records. Questionnaires and interviews were used to collect data on the different components of the program and on participants' perceptions of their effectiveness. Participants indicate that the lenient entry requirements and the monetary incentives were critical for starting their studies. During the first year, skills and empowerment workshops, tutoring, and mentoring were evaluated as very important for persistence and success in studies. Tutoring was also perceived as very important in the second year, but less importance was attributed to mentoring. Mixed results regarding integration into Israeli culture emerged. The results are discussed with reference to findings from different settings around the world.

Keywords: access to higher education, minority groups, monetary incentives, tutoring, mentoring

Procedia PDF Downloads 363
1177 Finite Volume Method for Flow Prediction Using Unstructured Meshes

Authors: Juhee Lee, Yongjun Lee

Abstract:

In designing low-energy buildings, the heat transfer through large glazing or walls becomes critical. Multiple layers of window glass and wall materials are employed for high insulation. The gravity-driven air flow between window panes or wall layers is a natural heat convection phenomenon that is key to the heat transfer. As a first step toward the natural heat transfer analysis, this study presents the development and application of a finite volume method for the numerical computation of viscous incompressible flows. It will become part of a natural convection analysis with a high-order scheme, multigrid method, and dual time stepping in the future. A finite volume method based on a fully implicit second-order scheme is used to discretize and solve the fluid flow on unstructured grids composed of arbitrary-shaped cells. The governing equations are integrated in the finite volume manner using a collocated arrangement of variables. The convergence of the SIMPLE segregated algorithm for the solution of the coupled nonlinear algebraic equations is accelerated by using a sparse matrix solver such as BiCGSTAB. The method used in the present study is verified by applying it to flows for which either the numerical solution is known or the solution can be obtained using another numerical technique available in other studies. The accuracy of the method is assessed through grid refinement.
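A minimal finite-volume example in the same spirit, steady 1-D conduction across a gap discretized on uniform cells and solved with SciPy's BiCGSTAB, can be sketched as follows. The paper's solver handles unstructured grids and the full SIMPLE algorithm; this toy only shows the FVM discretization plus the Krylov solve:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import bicgstab

# Steady 1-D heat conduction across a unit gap (e.g. between window
# panes), finite-volume discretization on N uniform cells with Dirichlet
# wall temperatures handled through half-cell distances.
N, T_left, T_right = 50, 300.0, 280.0
main = np.full(N, -2.0)
main[0] = main[-1] = -3.0          # half-cell distance to the walls
A = diags([np.ones(N - 1), main, np.ones(N - 1)], [-1, 0, 1], format="csr")
b = np.zeros(N)
b[0], b[-1] = -2.0 * T_left, -2.0 * T_right

try:
    T, info = bicgstab(A, b, rtol=1e-12, atol=1e-12)  # SciPy >= 1.12
except TypeError:
    T, info = bicgstab(A, b, tol=1e-12, atol=1e-12)   # older SciPy

x = (np.arange(N) + 0.5) / N                   # cell-centre coordinates
exact = T_left + (T_right - T_left) * x        # analytic linear profile
print(info, np.abs(T - exact).max())
```

The discrete solution matches the linear analytic profile at the cell centres, which is a standard sanity check before moving to convection and unstructured cells.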

Keywords: finite volume method, fluid flow, laminar flow, unstructured grid

Procedia PDF Downloads 271
1176 Automatic Registration of Rail Profile Based on Local Maximum Curvature Entropy

Authors: Hao Wang, Shengchun Wang, Weidong Wang

Abstract:

To address the influence of train vibration and environmental noise on the measurement of track wear, we propose a method for the automatic extraction of the circular arcs on the inner and outer sides of the rail waist and achieve high-precision registration of the rail profile. First, a polynomial fitting method based on a truncated residual histogram is proposed to find the optimal fitting curve of the profile and reduce the influence of noise on profile curve fitting. Then, based on the curvature distribution characteristics of the fitted curve, an interval search algorithm based on a dynamic window's maximum curvature entropy is proposed to realize automatic segmentation of the small circular arcs. Finally, we fit the two circle centers as matching reference points based on the small circular arcs on both sides and align the measured profile to the standard designed profile. Static experimental results show that the mean and standard deviation of the method are controlled within 0.01 mm, with small measurement errors and high repeatability. A dynamic test also verified the repeatability of the method in the train-running environment; the dynamic measurement deviation of rail wear is within 0.2 mm with high repeatability.
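The windowed maximum-curvature-entropy search can be sketched as follows. The window length, polynomial degree, histogram binning, and synthetic profile are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def curvature(coeffs, x):
    """Curvature of a fitted polynomial y(x): |y''| / (1 + y'^2)^1.5."""
    d1 = np.polyval(np.polyder(coeffs, 1), x)
    d2 = np.polyval(np.polyder(coeffs, 2), x)
    return np.abs(d2) / (1.0 + d1**2) ** 1.5

def max_entropy_window(x, y, win=15, degree=5, bins=8):
    """Slide a window along the fitted profile and return the window whose
    curvature distribution has maximum Shannon entropy, used here as a
    proxy for the information-rich arc region."""
    coeffs = np.polyfit(x, y, degree)
    kappa = curvature(coeffs, x)
    best_i, best_h = 0, -1.0
    for i in range(len(x) - win + 1):
        hist, _ = np.histogram(kappa[i:i + win], bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        h = -(p * np.log2(p)).sum()      # Shannon entropy in bits
        if h > best_h:
            best_i, best_h = i, h
    return best_i, best_h

# Synthetic 'profile': a circular-arc bump joined to flat runs.
x = np.linspace(-1.0, 1.0, 120)
y = np.where(np.abs(x) < 0.4, np.sqrt(np.maximum(0.25 - x**2, 0.0)), 0.3)
i, h = max_entropy_window(x, y, win=15)
print(i, round(h, 3))
```

In the real pipeline, the polynomial fit would first be cleaned with the truncated residual histogram, and the selected windows on both sides would feed the circle-centre fit used for registration.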

Keywords: curvature entropy, profile registration, rail wear, structured light, train-running

Procedia PDF Downloads 247
1175 To Identify the Importance of Telemedicine in Diabetes and Its Impact on HbA1c

Authors: Sania Bashir

Abstract:

A promising approach to healthcare delivery, telemedicine makes use of communication technology to reach remote regions of the world, allowing for beneficial interactions between diabetic patients and healthcare professionals as well as the provision of affordable and easily accessible medical care. The emergence of contemporary care models, fueled by the pervasiveness of mobile devices and known as digital health, provides better information and low cost with the best possible outcomes; it involves the integration of collected data using software and apps. The goal of this study is to assess how well telemedicine works for diabetic patients and how it impacts their HbA1c levels. A questionnaire-based survey of 300 diabetics included 150 patients in each of two groups, one receiving usual care and one managed via telemedicine. A descriptive, observational study was conducted from September 2021 to May 2022. HbA1c was collected for both groups every three months. A remote monitoring tool was used to assess the efficacy of telemedicine and continuing therapy in place of the customary three-monthly in-person consultations. The patients were 42.3 ± 18.3 years old on average. The 172 women (57.3% of the total) outnumbered the 128 men. 200 patients (66.6%) had type 2 diabetes, compared with 100 (33.3%) with type 1. Although the average baseline BMI was within normal ranges at 23.4 kg/m², the mean baseline HbA1c (9.45 ± 1.20) indicates that glycemic control was poor at the time of registration. Patients who used telemedicine experienced a mean percentage change in HbA1c of 10.5, while those who visited the clinic experienced a mean percentage change of 3.9. Changes in HbA1c depend on several factors, including improvements in BMI (61%) after 9 months of the study and compliance with healthy lifestyle recommendations for diet and activity; greater compliance was achieved by the telemedicine group. It is an undeniable reality that patient-physician communication is crucial for enhancing health outcomes and avoiding long-term complications. Telemedicine has shown its value in the management of diabetes and holds promise as a novel technique for improved clinician-patient communication in the twenty-first century.

Keywords: diabetes, digital health, mobile app, telemedicine

Procedia PDF Downloads 73
1174 ANOVA-Based Feature Selection and Machine Learning System for IoT Anomaly Detection

Authors: Muhammad Ali

Abstract:

Cyber-attacks and anomaly detection on Internet of Things (IoT) infrastructure are an emerging concern in the domain of data-driven intrusion detection. Rapidly increasing IoT risk is now making headlines around the world. Denial of service, malicious control, data-type probing, malicious operation, DDoS, scan, spying, and wrong setup are attacks and anomalies that can cause an IoT system to fail. Everyone talks about cyber security, connectivity, smart devices, and real-time data extraction. IoT devices expose a wide variety of new cyber-security attack vectors in network traffic. For IoT development, and especially for smart IoT applications, intelligent processing and analysis of data are necessary; our approach therefore aims to secure such systems. We train and compare several machine learning models to accurately predict attacks and anomalies on IoT systems, applying ANOVA-based feature selection to build leaner prediction models that evaluate network traffic and help protect IoT devices. The machine learning (ML) algorithms used here are KNN, SVM, Naïve Bayes (NB), decision tree (DT), and random forest (RF), selected for satisfactory test accuracy with fast detection. The ML evaluation metrics include precision, recall, F1 score, FPR, NPV, geometric mean (GM), MCC, and AUC-ROC. The random forest algorithm achieved the best results with the least prediction time, with an accuracy of 99.98%.
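The ANOVA-based selection step described above can be sketched as follows; the feature names and toy traffic records are illustrative assumptions, not data from the study.

```python
# Sketch of ANOVA-based feature ranking for IoT traffic features.
# One-way ANOVA F-score = between-class variance / within-class variance;
# features with a higher F discriminate attack classes better.

def anova_f_score(values, labels):
    """Compute the one-way ANOVA F-statistic for a single feature."""
    groups = {}
    for v, y in zip(values, labels):
        groups.setdefault(y, []).append(v)
    n, k = len(values), len(groups)
    grand_mean = sum(values) / n
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups.values())
    ss_within = sum((v - sum(g) / len(g)) ** 2
                    for g in groups.values() for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def rank_features(rows, labels, names):
    """Rank features by F-score, most discriminative first."""
    scores = {name: anova_f_score([r[i] for r in rows], labels)
              for i, name in enumerate(names)}
    return sorted(scores, key=scores.get, reverse=True)
```

For instance, on a toy dataset with hypothetical features `packet_rate` and `ttl`, the ranking keeps only the features whose F-score indicates class separation before training KNN, SVM, NB, DT, and RF on the reduced set.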

Keywords: machine learning, analysis of variance, Internet of Things, network security, intrusion detection

Procedia PDF Downloads 102
1173 The Studies of the Impact of Biomimicry and Sustainability on Urban Design

Authors: Nourhane Mohamed El Haridi, Mostafa El Arabi, Zeyad El Sayad

Abstract:

Biomimicry is defined, by the natural sciences writer Benyus, as imitating or taking inspiration from nature’s forms and processes to solve human problems: the conscious emulation of life’s genius. As the design community realizes the tremendous impact human constructions have on the world, environmental designers look to new approaches like biomimicry to advance sustainable design. Building on the declaration made by biomimicry scientists that a full imitation of nature engages form, ecosystem, and process, this paper uses a logical approach to interpret human and environmental wholeness. Designers would benefit both from integrating social theory with environmental thinking and from combining their substantive skills with techniques for achieving sustainable biomimetic urban design. Integrating biomimicry's "Life's Principles" into a built environment process model will make biomimicry more accessible and thus more widely accepted throughout the industry, and the sustainability of all species will benefit. The Biomimicry Guild hypothesizes that incorporating these principles, called Life's Principles, increases the likelihood of sustainability for a respective design and makes it more likely that the design will have a greater impact on sustainability for future generations of all species, as mentioned by Benyus in her book. This work utilizes Life's Principles as a foundation for a design process model intended for application to built environment projects at various scales. This paper examines the importance of integrating biomimicry into urban design to achieve more sustainable cities and better life, by analyzing the principles of both sustainability and biomimicry and applying these ideas to futuristic or existing cities to make a biomimetic sustainable city healthier and more conducive to life, and so arrive at a better biomimetic urban design. 
A group of experts, architects, biologists, scientists, economists and ecologists should work together to face the financial and design difficulties and to produce better solutions and innovative ideas for biomimetic sustainable urban design. It is not the only solution, but it is one of the most promising approaches for a better future.

Keywords: biomimicry, built environment, sustainability, urban design

Procedia PDF Downloads 508
1172 Identification of Hepatocellular Carcinoma Using Supervised Learning Algorithms

Authors: Sagri Sharma

Abstract:

Analysis of diseases integrating multiple factors increases the complexity of the problem; the development of frameworks for the analysis of diseases is therefore a topic of intense current research. Due to the inter-dependence of the various parameters, the use of traditional methodologies has not been very effective, so newer methodologies are being sought. Supervised learning algorithms are commonly used for performing prediction on previously unseen data, with applications in fields ranging from image analysis to protein structure and function prediction. They are trained on a known dataset to produce a predictor model that generates reasonable predictions for the response to new data. Gene expression profiles generated by DNA analysis experiments can be quite complex, since these experiments can involve hypotheses spanning entire genomes. A well-known machine learning algorithm, the Support Vector Machine (SVM), is therefore applied to analyze the expression levels of thousands of genes simultaneously in a timely, automated and cost-effective way. The objectives of the presented work are the development of a methodology to identify genes relevant to Hepatocellular Carcinoma (HCC) from a gene expression dataset utilizing supervised learning algorithms and statistical evaluations, along with the development of a predictive framework that can perform classification tasks on new, unseen data.
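A minimal sketch of the SVM-based workflow described above, assuming scikit-learn is available; the toy expression matrix and sample labels are illustrative stand-ins for a real HCC dataset, not data from the study.

```python
# Sketch: statistically select informative genes, then train an SVM to
# separate tumour from normal samples. Toy data: rows are samples,
# columns are (hypothetical) gene expression levels.
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC

# 8 samples x 3 "genes"; only the first gene separates the classes.
X = [[5.1, 0.2, 1.0],
     [4.9, 0.1, 1.2],
     [5.3, 0.3, 0.9],
     [5.0, 0.2, 1.1],
     [1.2, 0.2, 1.0],
     [1.0, 0.1, 1.1],
     [1.3, 0.3, 0.9],
     [1.1, 0.2, 1.2]]
y = ["HCC"] * 4 + ["normal"] * 4

# Statistical evaluation (ANOVA F-test) keeps the k most relevant genes.
selector = SelectKBest(f_classif, k=1).fit(X, y)
X_sel = selector.transform(X)

# Train the SVM on the reduced expression profiles.
model = SVC(kernel="linear").fit(X_sel, y)
print(model.predict(selector.transform([[5.2, 0.2, 1.0]])))  # -> ['HCC']
```

On a real dataset the same two steps scale to thousands of genes, with cross-validation used to choose `k` and the SVM kernel.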

Keywords: artificial intelligence, biomarker, gene expression datasets, hepatocellular carcinoma, machine learning, supervised learning algorithms, support vector machine

Procedia PDF Downloads 417
1171 Information Management Approach in the Prediction of Acute Appendicitis

Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki

Abstract:

This research aims at presenting a predictive data mining model to handle an accurate diagnosis of acute appendicitis for the purpose of maximizing health service quality, minimizing morbidity/mortality, and reducing cost. Acute appendicitis is a very common disease that requires timely, accurate diagnosis and surgical intervention. Although the treatment of acute appendicitis is simple and straightforward, its diagnosis is still difficult because no single sign, symptom, laboratory test or imaging examination accurately confirms the diagnosis in all cases. This contributes to increasing morbidity and negative appendectomy. In this study, the authors propose to generate an accurate model for the prediction of patients with acute appendicitis which is based, firstly, on a segmentation technique associated with the ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell count, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among the data in a comprehensive manner; and thirdly, on applying a dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated against a set of benchmark techniques and on a set of benchmark classification problems for osteoporosis, diabetes and heart disease obtained from the UCI repository and other data sources.
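The fuzzy-logic step described above can be illustrated with a small sketch; the membership breakpoints below are illustrative assumptions for the example, not clinical thresholds from the study.

```python
# Sketch: fuzzify a noisy clinical attribute (white blood cell count,
# in 10^3 cells/uL) into linguistic grades that a fuzzy predictive
# model can reason over.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, rises to 1 on [b, c], falls to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def fuzzify_wbc(wbc):
    """Map a WBC count to degrees of membership in three fuzzy sets."""
    return {
        "normal":   trapezoid(wbc, 3.0, 4.5, 10.0, 12.0),
        "elevated": trapezoid(wbc, 10.0, 12.0, 15.0, 17.0),
        "high":     trapezoid(wbc, 15.0, 17.0, 40.0, 45.0),
    }
```

A count of 11.0 then belongs partly to "normal" and partly to "elevated", so a borderline measurement influences the diagnosis gradually instead of flipping it at a hard cut-off.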

Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree

Procedia PDF Downloads 335
1170 The Integration of Geographical Information Systems and Capacitated Vehicle Routing Problem with Simulated Demand for Humanitarian Logistics in Tsunami-Prone Area: A Case Study of Phuket, Thailand

Authors: Kiatkulchai Jitt-Aer, Graham Wall, Dylan Jones

Abstract:

As a result of the Indian Ocean tsunami in 2004, logistics applied to disaster relief operations has received great attention in the humanitarian sector. As learned from that disaster, preparing for and responding to the delivery of essential items from distribution centres to affected locations is of great importance for relief operations, as the nature of disasters is uncertain, especially in casualty figures, to which the required quantity of supplies is normally proportional. Thus, this study proposes a spatial decision support system (SDSS) for humanitarian logistics by integrating Geographical Information Systems (GIS) and the capacitated vehicle routing problem (CVRP). The GIS is utilised for acquiring demands simulated from the tsunami flooding model of the affected area in the first stage, and for visualising the simulation solutions in the last stage. The CVRP in this study encompasses designing the relief routes of a set of homogeneous vehicles from a relief centre to a set of geographically distributed evacuation points whose demands are estimated using both simulation and randomisation techniques. The CVRP is modelled as a multi-objective optimisation problem in which both total travelling distance and total transport resources used are minimised, while the demand-cost efficiency of each route is maximised in order to determine route priority. As the model is an NP-hard combinatorial optimisation problem, the Clarke and Wright savings heuristic is proposed to obtain near-optimal solutions. Real-case instances in the coastal area of Phuket, Thailand are studied using the SDSS, which allows a decision maker to visually analyse the simulation scenarios through different decision factors.
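The Clarke and Wright savings heuristic named above can be sketched as follows; the distance matrix and demands in the usage note are toy values, not figures from the Phuket case study.

```python
# Sketch of the Clarke-Wright savings heuristic for the CVRP:
# merging two routes that both touch the depot (node 0) saves
# s(i, j) = d(0, i) + d(0, j) - d(i, j) in travel distance.

def clarke_wright(dist, demand, capacity):
    """dist: (n+1)x(n+1) symmetric matrix, node 0 is the relief centre.
    Returns a list of routes (ordered lists of evacuation-point indices)."""
    n = len(dist) - 1
    routes = {i: [i] for i in range(1, n + 1)}   # one route per point
    load = {i: demand[i] for i in range(1, n + 1)}
    where = {i: i for i in range(1, n + 1)}      # point -> route key
    savings = sorted(((dist[0][i] + dist[0][j] - dist[i][j], i, j)
                      for i in range(1, n + 1)
                      for j in range(i + 1, n + 1)),
                     reverse=True)
    for s, i, j in savings:
        ri, rj = where[i], where[j]
        if ri == rj or load[ri] + load[rj] > capacity:
            continue  # same route already, or vehicle would be overloaded
        # Merge only at route ends so each route stays a simple path.
        if routes[ri][-1] == i and routes[rj][0] == j:
            merged = routes[ri] + routes[rj]
        elif routes[rj][-1] == j and routes[ri][0] == i:
            merged = routes[rj] + routes[ri]
        else:
            continue
        routes[ri] = merged
        load[ri] += load.pop(rj)
        for p in routes.pop(rj):
            where[p] = ri
    return list(routes.values())
```

With a depot at node 0, points 1 and 2 close to each other, point 3 far away, demands (3, 3, 4) and capacity 6, the heuristic merges 1 and 2 into one route (the largest saving) while point 3, which would overload the vehicle, stays on its own route.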

Keywords: demand simulation, humanitarian logistics, geographical information systems, relief operations, capacitated vehicle routing problem

Procedia PDF Downloads 235
1169 A Gendered Perspective on the Influences of Transport Infrastructure on User Access

Authors: Ajeni Ari

Abstract:

In addressing gender and transport, considerations of mobility disparities amongst users are important. Public transport (PT) policy and design do not efficiently account for the varied mobility practices of men and women, with the literature only recently showing a movement towards gender inclusion in transport. Evidently, transport policy and design remain gender-blind to this variation in mobility needs. The global movement towards sustainability highlights the need for expeditious strategies that could mitigate biases within the existing system. At the forefront of such a plan of action may be, in part, mandated inclusive infrastructural designs that stimulate user engagement with the transport system. Fundamentally, access requires a means or an opportunity of entry, which for PT is established by its physical environment and/or infrastructural design. Its practicality can be assessed through knowledge of shortcomings in the tangible or intangible aspects of the service offerings that allow access to opportunities. To inform on existing biases in PT planning and design, this study analyses qualitative data to examine the opinions and lived experiences of transport users in Ireland. Findings show that infrastructural design plays a significant role in users’ engagement with the service. Paramount to accessibility are service provisions that cater to both users’ own interactions and those of their dependents. Apprehension about using the service is more evident among women than men, particularly while carrying out household duties and caring responsibilities at peak times or dark hours. Furthermore, limitations are apparent in infrastructural service offerings that do not accommodate the physical (dis)ability of users, especially where universal design is lacking. There are intersecting factors that impinge on accessibility, e.g., safety and security, yet essentially, infrastructural design is an important parameter influencing users’ perceptual conditioning. 
Additionally, the data disclose the need for user intricacies to be factored into transport planning geared towards gender inclusivity, including mobility practices, travel purpose, transit time and location, and system integration.

Keywords: public transport, accessibility, women, transport infrastructure

Procedia PDF Downloads 65
1168 Bayesian Inference for High Dimensional Dynamic Spatio-Temporal Models

Authors: Sofia M. Karadimitriou, Kostas Triantafyllopoulos, Timothy Heaton

Abstract:

Reduced-dimension Dynamic Spatio-Temporal Models (DSTMs) jointly describe the spatial and temporal evolution of a function observed subject to noise. A basic state space model is adopted for the discrete temporal variation, while a continuous autoregressive structure describes the continuous spatial evolution. Application of such a DSTM relies upon the pre-selection of a suitable reduced set of basis functions, and this can present a challenge in practice. In this talk, we propose an online estimation method for high dimensional spatio-temporal data based upon the DSTM, and we attempt to resolve this issue by allowing the basis to adapt to the observed data. Specifically, we present a wavelet decomposition in order to obtain a parsimonious approximation of the continuous spatial process. This parsimony can be achieved by placing a Laplace prior distribution on the wavelet coefficients. The aim of using the Laplace prior is to filter out wavelet coefficients with low contribution, and thus achieve the dimension reduction with significant computational savings. We then propose a Hierarchical Bayesian State Space model, for the estimation of which we offer an appropriate particle filter algorithm. The proposed methodology is illustrated using real environmental data.
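The dimension-reduction idea can be illustrated with a small sketch: the MAP estimate under a Laplace (double-exponential) prior corresponds to L1-type shrinkage, so low-contribution wavelet coefficients are soft-thresholded to zero. The coefficient values and threshold below are illustrative assumptions, not figures from the study.

```python
# Sketch: Laplace-prior shrinkage of wavelet coefficients.
# Soft-thresholding shrinks each coefficient towards zero by lam
# and zeroes out those whose contribution is below the threshold.

def soft_threshold(coeffs, lam):
    """Return the soft-thresholded coefficients."""
    out = []
    for c in coeffs:
        if c > lam:
            out.append(c - lam)
        elif c < -lam:
            out.append(c + lam)
        else:
            out.append(0.0)
    return out

# Hypothetical wavelet coefficients of a spatial field: a few large
# ones carry the signal, many small ones mostly carry noise.
coeffs = [4.0, -3.1, 0.2, 0.05, -0.1, 2.5, 0.01]
sparse = soft_threshold(coeffs, lam=0.3)
kept = sum(1 for c in sparse if c != 0.0)
print(kept)  # -> 3: only the informative coefficients survive
```

The surviving coefficients define the reduced basis on which the hierarchical state space model and particle filter then operate, which is the source of the computational savings.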

Keywords: multidimensional Laplace prior, particle filtering, spatio-temporal modelling, wavelets

Procedia PDF Downloads 414
1167 Optimisation of B2C Supply Chain Resource Allocation

Authors: Firdaous Zair, Zoubir Elfelsoufi, Mohammed Fourka

Abstract:

The allocation of resources is an issue that arises at both the tactical and operational levels of strategic planning. This work considers the allocation of resources in the case of pure players, manufacturers and click-and-mortar companies that have launched online sales. The aim is to improve the level of customer satisfaction while maintaining the benefits of the e-retailer and its cooperators and reducing costs and risks. Our contribution is a decision support system and tool for improving the allocation of resources in B2C e-commerce logistics chains. We first modelled the B2C chain with all the operations it integrates and the possible scenarios, since online retailers offer a wide selection of personalised services. The personalised services that online shopping companies offer to clients can be embodied in many aspects, such as customisation of payment, distribution methods, and after-sales service choices; moreover, every aspect of customised service has several modes. We then analysed the optimisation problems of supply chain resource allocation in the customised online shopping service mode, which differ from supply chain resource allocation under traditional manufacturing or service circumstances. Finally, we developed an optimisation model and algorithm based on this analysis of the allocation of B2C supply chain resources. It is a multi-objective optimisation that considers the collaboration of resources in operations, time and costs, but also the risks and the quality of service, as well as the dynamic and uncertain characteristics of demand.

Keywords: e-commerce, supply chain, B2C, optimisation, resource allocation

Procedia PDF Downloads 257
1166 Improving Public Sectors’ Policy Direction on Large Infrastructure Investment Projects: A Developmental Approach

Authors: Ncedo Cameron Xhala

Abstract:

Several public sector institutions lack policy direction on how to successfully implement their large infrastructure investment projects. It is important to improve strategic policy direction in public sector institutions in order to improve the planning, management and implementation of large infrastructure investment projects, and to improve the understanding of the internal and external pressures exerted on such projects. The significance lies in fulfilling the public sector’s mandate, aligning the sector’s scarce resources and stakeholders, and improving project management processes. The study used a case study approach underpinned by a constructionist approach. It used a theoretical sampling technique when selecting study participants, followed by a snowball sampling technique to purposefully select an identified case study project. The study was qualitative in nature: it collected and analysed qualitative empirical data from five purposefully selected subject matter experts and analysed the case study documents. Interviews were semi-structured and conducted face-to-face, guided by an interview guide with focused questions. The study used a three-step coding process when analysing the qualitative empirical data. Findings reveal that improving strategic policy direction in public sector institutions improves integration in the planning, management and implementation of large infrastructure investment projects. Findings also show the importance of understanding the external and internal pressures when implementing the public sector’s large infrastructure investment projects. 
The study concludes that strategic policy direction in public sector institutions results in improved planning, financing, delivery, monitoring and evaluation, and successful implementation of the public sector’s large infrastructure investment projects.

Keywords: implementation, infrastructure, investment, management

Procedia PDF Downloads 139
1165 Integration of Agile Philosophy and Scrum Framework to Missile System Design Processes

Authors: Misra Ayse Adsiz, Selim Selvi

Abstract:

In today's world, technology is competing with time. In order to catch up with the world's leading companies and adapt quickly to change, it is necessary to speed up processes and keep pace with the rate of technological change. Missile system design processes handled with classical methods fall behind in this race, because customer requirements are not clear and demands change again and again during the design process. Therefore, a methodology suitable for missile system design dynamics has been investigated, and the processes used to keep up with the era have been examined. When commonly used design processes are analyzed, it is seen that none of them is dynamic enough for today’s conditions, so a hybrid design process is established. After a detailed review of the existing processes, it was decided to focus on the Scrum framework and agile philosophy. Scrum is a process framework focused on developing software and handling change management with rapid methods; agile philosophy, likewise, is intended to respond quickly to changes. In this study, the aim is to integrate the Scrum framework and agile philosophy, which are the most appropriate ways of achieving rapid production and adapting to change, into the missile system design process. With this approach, the design team involved in the system design process stays in communication with the customer and follows an iterative approach to change management. These methods, currently used in the software industry, have been integrated with the product design process. A team is created for the system design process, and the Scrum roles are realized with the customer included: a Scrum team consists of the product owner, the development team and the Scrum master. Scrum events, which are short, purposeful and time-limited, are organized to serve coordination rather than long meetings. 
Instead of the classic system design methods used in product development studies, a missile design is carried out with this blended method. With the help of this design approach, it becomes easier to anticipate changing customer demands, produce quick solutions to those demands and combat uncertainties in the product development process. With feedback from the customer included in the process, the team works towards marketing, design and financial optimization.

Keywords: agile, design, missile, scrum

Procedia PDF Downloads 153
1164 Multi-Level Air Quality Classification in China Using Information Gain and Support Vector Machine

Authors: Bingchun Liu, Pei-Chann Chang, Natasha Huang, Dun Li

Abstract:

Machine Learning and Data Mining are two important tools for extracting useful information and knowledge from large datasets. In machine learning, classification is a widely used technique to predict qualitative variables and is generally preferred over regression from an operational point of view. Due to the enormous increase in air pollution in various countries, especially China, air quality classification has become one of the most important topics in air quality research and modelling. This study aims at introducing a hybrid classification model based on information theory and the Support Vector Machine (SVM), using the air quality data of four cities in China, namely Beijing, Guangzhou, Shanghai and Tianjin, from Jan 1, 2014 to April 30, 2016. China's Ministry of Environmental Protection has classified daily air quality into 6 levels, namely Serious Pollution, Severe Pollution, Moderate Pollution, Light Pollution, Good and Excellent, based on their respective Air Quality Index (AQI) values. Using information theory, information gain (IG) is calculated and feature selection is done for both categorical features and continuous numeric features. Then the SVM machine learning algorithm is implemented on the selected features with cross-validation. The final evaluation reveals that the IG and SVM hybrid model performs better than SVM (alone), Artificial Neural Network (ANN) and K-Nearest Neighbours (KNN) models in terms of accuracy as well as complexity.
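The information-gain step described above can be sketched as follows; the toy discretised pollutant readings are illustrative, not data from the study.

```python
# Sketch: information gain IG(Y; X) = H(Y) - H(Y | X) for a discretised
# feature X against the air-quality level Y. A higher gain marks a more
# useful feature to keep before training the SVM.
from math import log2
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """H(Y) minus the weighted entropy of Y within each value of X."""
    n = len(labels)
    cond = 0.0
    for value in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == value]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

# Toy data: discretised PM2.5 predicts the level; wind direction does not.
levels = ["Good", "Good", "Light", "Light"]
pm25   = ["low", "low", "high", "high"]
wind   = ["N", "S", "N", "S"]
print(information_gain(pm25, levels))   # 1.0 bit: fully informative
print(information_gain(wind, levels))   # 0.0 bits: uninformative
```

Features whose gain falls below a chosen cut-off are dropped, and the SVM is then trained and cross-validated on the retained features.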

Keywords: machine learning, air quality classification, air quality index, information gain, support vector machine, cross-validation

Procedia PDF Downloads 220
1163 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups. It then tries to find the most important features among the four groups that maximize the weighted average F-score of a given classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER through the feature selection module. On top of them, another classifier gives the final decision based on the output decisions and confidence scores of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all published systems to date when evaluated against the exact same SEER data (period of 1973-2002). It gives an 87.39% weighted average F-score, compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
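A minimal sketch of the stacked-ensemble idea, assuming scikit-learn is available. The toy features stand in for the SEER attribute groups (here all base learners see the same features, whereas the paper feeds each one a different subset), and the exact base/meta pairing is only approximated: scikit-learn has no Bayesian-network classifier, so a logistic regression is used as a stand-in.

```python
# Sketch: three base classifiers whose out-of-fold predictions are
# combined by a Naive Bayes meta-classifier (stacking).
from sklearn.ensemble import StackingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB, BernoulliNB
from sklearn.linear_model import LogisticRegression  # stand-in for a Bayesian network

# Toy, clearly separable data: [age, tumour_size] per patient.
X = [[age, size] for age in (40, 45, 50, 55, 60) for size in (1, 2)] \
  + [[age, size] for age in (65, 70, 75, 80, 85) for size in (5, 6)]
y = ["survived"] * 10 + ["not survived"] * 10

ensemble = StackingClassifier(
    estimators=[("dt", DecisionTreeClassifier(random_state=0)),
                ("lr", LogisticRegression()),
                ("bnb", BernoulliNB())],
    final_estimator=GaussianNB(),  # meta-classifier, as in the paper
    cv=2,
)
ensemble.fit(X, y)
print(ensemble.score(X, y))  # separable toy data -> near-perfect accuracy
```

In the paper's setup, each base classifier would additionally be wrapped with its own feature-selection step over one of the four SEER feature groups before stacking.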

Keywords: classifier ensemble, breast cancer survivability, data mining, SEER

Procedia PDF Downloads 310
1162 Organic Thin-Film Transistors with High Thermal Stability

Authors: Sibani Bisoyi, Ute Zschieschang, Alexander Hoyer, Hagen Klauk

Abstract:

Organic thin-film transistors (TFTs) have great potential to be used for various applications such as flexible displays or sensors. For some of these applications, the TFTs must be able to withstand temperatures in excess of 100 °C, for example to permit the integration with devices or components that require high process temperatures, or to make it possible that the devices can be subjected to the standard sterilization protocols required for biomedical applications. In this work, we have investigated how the thermal stability of low-voltage small-molecule semiconductor dinaphtho[2,3-b:2’,3’-f]thieno[3,2-b]thiophene (DNTT) TFTs is affected by the encapsulation of the TFTs and by the ambient in which the thermal stress is performed. We also studied to which extent the thermal stability of the TFTs depends on the channel length. Some of the TFTs were encapsulated with a layer of vacuum-deposited Teflon, while others were left without encapsulation, and the thermal stress was performed either in nitrogen or in air. We found that the encapsulation with Teflon has virtually no effect on the thermal stability of our TFTs. In contrast, the ambient in which the thermal stress is conducted was found to have a measurable effect, but in a surprising way: When the thermal stress is carried out in nitrogen, the mobility drops to 70% of its initial value at a temperature of 160 °C and to close to zero at 170 °C, whereas when the stress is performed in air, the mobility remains at 75% of its initial value up to a temperature of 160 °C and at 60% up to 180 °C. To understand this behavior, we studied the effect of the thermal stress on the semiconductor thin-film morphology by scanning electron microscopy. 
While the DNTT films remain continuous and conducting when the heating is carried out in air, the semiconductor morphology undergoes a dramatic change, including the formation of large, thick crystals of DNTT and a complete loss of percolation, when the heating is conducted in nitrogen. We also found that when the TFTs are heated to a temperature of 200 °C in air, all TFTs with a channel length greater than 50 µm are destroyed, while TFTs with a channel length of less than 50 µm survive, whereas when the TFTs are heated to the same temperature (200 °C) in nitrogen, only the TFTs with a channel length smaller than 8 µm survive. This result is also linked to the thermally induced changes in the semiconductor morphology.

Keywords: organic thin-film transistors, encapsulation, thermal stability, thin-film morphology

Procedia PDF Downloads 332
1161 Alternative Ways of Knowing and the Construction of a Department Around a Common Critical Lens

Authors: Natalie Delia

Abstract:

This academic paper investigates the transformative potential of incorporating alternative ways of knowing within the framework of Critical Studies departments. Traditional academic paradigms often prioritize empirical evidence and established methodologies, potentially limiting the scope of critical inquiry. In response to this, our research seeks to illuminate the benefits and challenges associated with integrating alternative epistemologies, such as indigenous knowledge systems, artistic expressions, and experiential narratives. Drawing upon a comprehensive review of literature and case studies, we examine how alternative ways of knowing can enrich and diversify the intellectual landscape of Critical Studies departments. By embracing perspectives that extend beyond conventional boundaries, departments may foster a more inclusive and holistic understanding of critical issues. Additionally, we explore the potential impact on pedagogical approaches, suggesting that alternative ways of knowing can stimulate alternative teaching methods and enhance student engagement. Our investigation also delves into the institutional and cultural shifts necessary to support the integration of alternative epistemologies within academic settings. We address concerns related to validation, legitimacy, and the potential clash with established norms, offering insights into fostering an environment that encourages intellectual pluralism. Furthermore, the paper considers the implications for interdisciplinary collaboration and the potential for cultivating a more responsive and socially engaged scholarship. By encouraging a synthesis of diverse perspectives, Critical Studies departments may be better equipped to address the complexities of contemporary issues, encouraging a dynamic and evolving field of study. In conclusion, this paper advocates for a paradigm shift within Critical Studies departments towards a more inclusive and expansive approach to knowledge production. 
By embracing alternative ways of knowing, departments have the opportunity to not only diversify their intellectual landscape but also to contribute meaningfully to broader societal dialogues, addressing pressing issues with renewed depth and insight.

Keywords: critical studies, alternative ways of knowing, academic department, Wallerstein

Procedia PDF Downloads 49