Search results for: inquiry-based instruction
22903 Predicting Provider Service Time in Outpatient Clinics Using Artificial Intelligence-Based Models
Authors: Haya Salah, Srinivas Sharan
Abstract:
Healthcare facilities use appointment systems to schedule patient visits and to manage access to their medical services. With the growing demand for outpatient care, it is now imperative to manage physicians' time effectively. However, high variation in consultation duration affects the clinical scheduler's ability to estimate the appointment duration and allocate provider time appropriately. Underestimating consultation times can lead to physician burnout, misdiagnosis, and patient dissatisfaction. On the other hand, appointment durations that are longer than required lead to doctor idle time and fewer patient visits. Therefore, a good estimation of consultation duration has the potential to improve timely access to care, resource utilization, quality of care, and patient satisfaction. Although the literature on factors influencing consultation length abounds, little work has been done to predict it using data-driven approaches. Therefore, this study aims to predict consultation duration using supervised machine learning (ML) algorithms, which predict an outcome variable (e.g., consultation duration) based on potential features that influence the outcome. In particular, ML algorithms learn from a historical dataset without being explicitly programmed and uncover the relationship between the features and the outcome variable. A subset of the data used in this study has been obtained from the electronic medical records (EMR) of four different outpatient clinics located in central Pennsylvania, USA. In addition, publicly available information on doctors' characteristics, such as gender and experience, has been extracted from online sources. This research develops three popular ML algorithms (deep learning, random forest, gradient boosting machine) to predict the treatment time required for a patient and conducts a comparative analysis of these algorithms with respect to predictive performance. 
The findings of this study indicate that ML algorithms have the potential to predict provider service time with superior accuracy. While the current approach of experience-based appointment duration estimation adopted by the clinics resulted in a mean absolute percentage error (MAPE) of 25.8%, the deep learning algorithm developed in this study yielded the best performance with a MAPE of 12.24%, followed by the gradient boosting machine (13.26%) and random forest (14.71%). In addition, this research identified the critical variables affecting consultation duration to be patient type (new vs. established), doctor's experience, zip code, appointment day, and doctor's specialty. Moreover, several practical insights are obtained from the comparative analysis of the ML algorithms. The machine learning approach presented in this study can serve as a decision support tool and could be integrated into the appointment system for effectively managing patient scheduling.
Keywords: clinical decision support system, machine learning algorithms, patient scheduling, prediction models, provider service time
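The evaluation metric used above can be illustrated with a short sketch. Everything below is illustrative: the consultation durations, the fixed-slot estimate, and the mock model predictions are invented and are not the study's data; only the metric itself (mean absolute percentage error) follows the abstract.

```python
# Sketch: comparing appointment-duration predictors by mean absolute
# percentage error (MAPE), the metric reported in the study.

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical consultation durations in minutes.
actual = [20, 35, 15, 40, 25]

# "Experience-based" fixed-slot estimate vs. a mock model prediction.
fixed_slot = [30, 30, 30, 30, 30]
model_pred = [22, 33, 17, 38, 26]

print(f"fixed slot MAPE: {mape(actual, fixed_slot):.1f}%")
print(f"model MAPE:      {mape(actual, model_pred):.1f}%")
```

A lower MAPE means tighter slot estimates; the study's comparison of 25.8% (experience-based) against 12.24% (deep learning) is exactly this kind of side-by-side.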
Procedia PDF Downloads 121
22902 Automated Building Internal Layout Design Incorporating Post-Earthquake Evacuation Considerations
Authors: Sajjad Hassanpour, Vicente A. González, Yang Zou, Jiamou Liu
Abstract:
Earthquakes pose a significant threat to both structural and non-structural elements in buildings, putting human lives at risk. Effective post-earthquake evacuation is critical for ensuring the safety of building occupants. However, current design practices often neglect the integration of post-earthquake evacuation considerations into the early-stage architectural design process. To address this gap, this paper presents a novel automated internal architectural layout generation tool that optimizes post-earthquake evacuation performance. The tool takes an initial plain floor plan as input, along with specific requirements from the user/architect, such as minimum room dimensions, corridor width, and exit lengths. Based on these inputs, the tool first randomly generates different architectural layouts. Second, the human post-earthquake evacuation behaviour is thoroughly assessed for each generated layout using the advanced Agent-Based Building Earthquake Evacuation Simulation (AB2E2S) model. The AB2E2S prototype is a post-earthquake evacuation simulation tool that incorporates variables related to earthquake intensity, architectural layout, and human factors. It leverages a hierarchical agent-based simulation approach, incorporating reinforcement learning to mimic human behaviour during evacuation. The model evaluates different layout options and provides feedback on evacuation flow, time, and possible casualties due to earthquake non-structural damage. By integrating the AB2E2S model into the automated layout generation tool, architects and designers can obtain optimized architectural layouts that prioritize post-earthquake evacuation performance. Through the use of the tool, architects and designers can explore various design alternatives, considering different minimum room requirements, corridor widths, and exit lengths. This approach ensures that evacuation considerations are embedded in the early stages of the design process. 
In conclusion, this research presents an innovative automated internal architectural layout generation tool that integrates post-earthquake evacuation simulation. By incorporating evacuation considerations into the early-stage design process, architects and designers can optimize building layouts for improved post-earthquake evacuation performance. This tool empowers professionals to create resilient designs that prioritize the safety of building occupants in the face of seismic events.
Keywords: agent-based simulation, automation in design, architectural layout, post-earthquake evacuation behavior
Procedia PDF Downloads 104
22901 Probabilistic Modeling of Post-Liquefaction Ground Deformation
Authors: Javad Sadoghi Yazdi, Robb Eric S. Moss
Abstract:
This paper utilizes a probabilistic liquefaction triggering method for modeling post-liquefaction ground deformation. This cone penetration test (CPT)-based liquefaction triggering method is employed to estimate the factor of safety against liquefaction (FSL) and compute the maximum cyclic shear strain (γmax). The study identifies a maximum PL value of 90% across various relative densities, which challenges the decrease from 90% to 70% as relative density decreases. It reveals that PL ranges from 5% to 50% for volumetric strain (εvol) less than 1%, while for εvol values between 1% and 3.2%, PL spans from 50% to 90%. CPT-based simplified liquefaction triggering procedures have been employed in previous research to estimate liquefaction ground-failure indices, such as the Liquefaction Potential Index (LPI) and Liquefaction Severity Number (LSN). However, several studies have highlighted the variability in liquefaction probability calculations, suggesting the need for a more accurate depiction of liquefaction likelihood. Consequently, the use of these simplified methods may not offer practical efficiency. This paper further investigates the efficacy of various established liquefaction vulnerability parameters, including LPI and LSN, in explaining the observed liquefaction-induced damage within residential zones of Christchurch, New Zealand, using results from a CPT database.
Keywords: cone penetration test (CPT), liquefaction, post-liquefaction, ground failure
Procedia PDF Downloads 71
22900 The Assessment of the Comparative Efficiency of Reforms through the Integral Index of Transformation
Authors: Samson Davoyan, Ashot Davoyan, Ani Khachatryan
Abstract:
The indexes (Global Competitiveness Index, Economic Freedom Index, Human Development Index, etc.) developed over time by different international and non-governmental organizations express, quantitatively and qualitatively, the features of the various reforms implemented in different countries. The main objective of our research is to develop a new methodology for creating an integral index that builds on many existing indexes and covers many areas of reform. To achieve our aim, we have used econometric methods (regression models for panel data). The basis of our methodology is the construction of a new integral index from the quantitative assessment of the change in two main parameters: the countries' scores on the different indexes, and the change in the countries' ranks between two consecutive periods. Through this analysis, we have defined the indexes used to build the new integral index and the scales for each of them. Analyzing more than 100 countries quantitatively and qualitatively through the integral index for 2009-2014, we have defined comparative efficiency, which helps to conclude in which directions countries have implemented reforms more effectively compared to others, and in which directions reforms have been implemented less efficiently.
Keywords: development, rank, reforms, comparative, index, economic, corruption, social, program
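The two-parameter aggregation described above can be sketched as follows. This is a loose illustration only: the weighting scheme, the equal-weight default, and the country data are invented assumptions, not the authors' calibration.

```python
# Sketch of the two-parameter integral index: for each country, combine
# (a) the change in its score on the component indexes and (b) the change
# in its rank into one value. Weights and data are illustrative.

def integral_index(score_change, rank_change, w_score=0.5, w_rank=0.5):
    """Combine score improvement and rank improvement (positive = better)."""
    return w_score * score_change + w_rank * rank_change

countries = {
    # (score change between the two periods, rank improvement)
    "A": (0.30, 5),
    "B": (0.10, -2),
    "C": (0.25, 1),
}

# Order countries by comparative reform efficiency, best first.
ranked = sorted(countries, key=lambda c: integral_index(*countries[c]), reverse=True)
print(ranked)
```

In a real application the two inputs would first be normalized to a common scale; the point here is only the combination of score change and rank change into a single comparable number.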
Procedia PDF Downloads 326
22899 Embedded Acoustic Signal Processing System Using OpenMP Architecture
Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad
Abstract:
In this paper, Altera DE1-SoC FPGA board technology is utilized as a distinguished tool for the nondestructive characterization of an aluminum circular cylindrical shell of radius ratio b/a (a: outer radius; b: inner radius). The acoustic backscattered signal processing system has been developed using the OpenMP architecture. The design is built in three blocks; it is implemented per functional block, in a heterogeneous Intel-Altera system running under Linux. The reference data used to determine the performance of the SoC FPGA is computed by the analytical method. The exploitation of the SoC FPGA has led to obtaining the backscattering form function and resonance spectra. The A0 and S0 modes of propagation in the tube are shown. The findings are then compared to those obtained from the Matlab simulation of the analytical method, and a good agreement has been noted. Moreover, the detailed SoC FPGA-based system has shown that the acoustic spectra are computed up to 5 times faster than with the Matlab implementation using almost the same data. This FPGA-based implementation of the processing algorithms achieves a correlation coefficient R of about 0.962 and an absolute error of about 5×10⁻⁵.
Keywords: OpenMP, signal processing system, acoustic backscattering, nondestructive characterization, thin tubes
Procedia PDF Downloads 92
22898 A Distributed Mobile Agent-Based Intrusion Detection System for MANET
Authors: Maad Kamal Al-Anni
Abstract:
This study concerns an algorithmic approach based on an Artificial Neural Network, specifically a Multilayer Perceptron (MLP), for the classification and clustering of Mobile Ad hoc Network vulnerabilities. A mobile ad hoc network (MANET) is a ubiquitous internetworking of intelligent devices: an autonomous system of mobile nodes, connected via wireless links, that can sense their environment. Security is the most important concern in MANETs because of the easy penetration scenarios that occur in such an auto-configuring network. One of the powerful techniques used for inspecting network packets is the Intrusion Detection System (IDS); in this article, we show the effectiveness of artificial neural networks used for machine learning, along with a stochastic approach (information gain), to classify malicious behaviours in a simulated network with respect to different IDS techniques. The monitoring agent is responsible for the detection inference engine; the audit data is collected from the collecting agent by simulating a node attack, and the outputs are contrasted with the normal behaviours of the framework. Whenever there is any deviation from the ordinary behaviours, the monitoring agent treats the event as an attack. We demonstrate a signature-based IDS approach in a MANET by implementing the back propagation algorithm over an ensemble-based Traffic Table (TT), so that the signatures of malicious behaviours or undesirable activities can be significantly prognosticated and efficiently figured out; by tuning the parametric set-up of the back propagation algorithm, the experimental results empirically show the approach's effectiveness, with a detection rate of up to 98.6%. 
Performance metrics are also included in this article, presented with Xgraph plots of different measures such as Packet Delivery Ratio (PDR), Throughput (TP), and Average Delay (AD).
Keywords: Intrusion Detection System (IDS), Mobile Adhoc Networks (MANET), Back Propagation Algorithm (BPA), Neural Networks (NN)
Procedia PDF Downloads 194
22897 Ionic Liquid Membranes for CO2 Separation
Authors: Zuzana Sedláková, Magda Kárászová, Jiří Vejražka, Lenka Morávková, Pavel Izák
Abstract:
Membrane separations are frequently mentioned as a possibility for CO2 capture. The selectivity of ionic liquid membranes is strongly determined by the different solubilities of the separated gases in the ionic liquid. The solubility of the separated gases usually varies over an order of magnitude, unlike their diffusivity, which is usually of the same order of magnitude for different gases. The present work evaluates the selection of an appropriate ionic liquid for selective membrane preparation based on gas solubility in the ionic liquid. The current state of the art of CO2 capture patents and technologies based on membrane separations was considered, and an overview is given of the discussed transport mechanisms. Ionic liquids seem to be promising candidates thanks to their tunable properties, wide liquid range, reasonable thermal stability, and negligible vapor pressure. However, from the industrial point of view, the use of supported liquid membranes is limited by their relatively short lifetime. Ionic liquids could overcome these problems due to their negligible vapor pressure and their properties, tunable by adequate selection of the cation and anion.
Keywords: biogas upgrading, carbon dioxide separation, ionic liquid membrane, transport properties
Procedia PDF Downloads 431
22896 Monitoring and Evaluation in Community-Based Tourism: An Analysis and Model
Authors: Ivan Gunass Govender, Andrea Giampiccoli
Abstract:
A developmental state should use community engagement to facilitate socio-economic development for disadvantaged groups and individual members of society through empowerment, social justice, sustainability, and self-reliance. In this regard, community-based tourism (CBT), as a growing market, should be an indigenous effort aided by external facilitation. Since this form of tourism presents its own preconditions, characteristics, and challenges, it could be guided by the engagement of higher education institutions. In particular, the facilitation should not only serve to assist the community members to reach their own goals, but also focus on learning through knowledge creation and sharing with the engagement of higher education institutions. While the increased relevance of CBT has produced various CBT manuals (or handbooks/guidelines) aimed at 'teaching' and assisting various entities in CBT development, this research aims to analyse the current monitoring and evaluation (M&E) manuals and thereafter propose an M&E model for CBT. It is important to mention that, all too often, effective monitoring is not carried out, risking the long-term sustainability and improvement of CBT ventures. Therefore, the proposed model will also consider some inputs external to the tourism field, related to local economic development (LED) matters, from a previously proposed development monitoring and evaluation system framework. M&E should be seen as a fundamental component of any CBT initiative, and the whole CBT intervention should be evaluated. In this context, M&E in CBT should go beyond strictly 'numerical' economic matters and should be understood in terms of holistic development. In addition, M&E in CBT should not consider issues in various 'compartments' such as tourists, tourism attractions, CBT owners/participants, and stakeholder engagement, but as interdependent components of a macro-ecosystem. 
Finally, the external facilitation process should be structured in a way that promotes community self-reliance in both the intervention and the M&E process. The research will attempt to propose an M&E model for CBT so as to enhance the CBT possibilities of long-term growth and success through effective collaborations with key stakeholders.
Keywords: community-based tourism, community-engagement, monitoring and evaluation, stakeholders
Procedia PDF Downloads 304
22895 Fuzzy Multi-Criteria Decision-Making Based on Ignatian Discernment Process
Authors: Pathinathan Theresanathan, Ajay Minj
Abstract:
The Ignatian Discernment Process (IDP) is an intense decision-making tool for deciding on life issues. Decisions are influenced by various factors outside the decision maker as well as inclinations within. This paper develops IDP in the context of the Fuzzy Multi-Criteria Decision Making (FMCDM) process. The Extended VIKOR method is a decision-making method which encompasses even conflict situations and accommodates weighting of various issues. Various aspects of IDP, namely the three ways of decision making and the tactics of inner desires, are observed, analyzed, and articulated within the framework of fuzzy rules. Decision-making situations are broadly categorized into two types: issues outside the decision maker that influence the person, and inner feelings, which also play a vital role in coming to a conclusion. IDP integrates both categories using the Extended VIKOR method. Case studies are carried out and analyzed with the FMCDM process. Finally, IDP is verified with an illustrative case study, and the results are interpreted. A confused person who could not come to a conclusion is able to reach a decision on a concrete way of life through IDP. The proposed IDP model recommends an integrated and committed approach to value-based decision making.
Keywords: AHP, FMCDM, IDP, ignatian discernment, MCDM, VIKOR
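The compromise-ranking step that Extended VIKOR builds on can be sketched in its classical (crisp, non-fuzzy) form. The decision matrix, weights, and the strategy parameter v below are illustrative, not taken from the paper.

```python
# Minimal classical VIKOR sketch: rank alternatives by the compromise
# index Q, built from the weighted group utility S and individual regret R.

def vikor(matrix, weights, v=0.5):
    """Return Q values (lower = better compromise) for each alternative.
    matrix[i][j] is the score of alternative i on benefit criterion j;
    assumes each criterion has at least two distinct values."""
    m = len(matrix[0])
    best = [max(row[j] for row in matrix) for j in range(m)]
    worst = [min(row[j] for row in matrix) for j in range(m)]
    S, R = [], []
    for row in matrix:
        d = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j]) for j in range(m)]
        S.append(sum(d))   # group utility (weighted sum of distances to best)
        R.append(max(d))   # individual regret (worst single criterion)
    s_star, s_minus = min(S), max(S)
    r_star, r_minus = min(R), max(R)
    return [v * (S[i] - s_star) / (s_minus - s_star)
            + (1 - v) * (R[i] - r_star) / (r_minus - r_star)
            for i in range(len(matrix))]

# Three alternatives scored on three benefit criteria.
q = vikor([[7, 5, 8], [8, 7, 5], [6, 8, 7]], [0.4, 0.3, 0.3])
print(q)  # the alternative with the smallest Q is the compromise choice
```

The fuzzy extension in the paper replaces the crisp scores with fuzzy numbers before this ranking step; the structure of S, R, and Q stays the same.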
Procedia PDF Downloads 260
22894 An Improved Two-dimensional Ordered Statistical Constant False Alarm Detection
Authors: Weihao Wang, Zhulin Zong
Abstract:
Two-dimensional ordered statistical constant false alarm detection is a widely used method for detecting weak target signals in radar signal processing applications. The method is based on analyzing the statistical characteristics of the noise and clutter present in the radar signal and then using this information to set an appropriate detection threshold. In this approach, the reference cell of the unit to be detected is divided into several reference subunits. These subunits are used to estimate the noise level and adjust the detection threshold, with the aim of minimizing the false alarm rate. By using an ordered statistical approach, the method is able to effectively suppress the influence of clutter and noise, resulting in a low false alarm rate. The detection process involves a number of steps, including filtering the input radar signal to remove any noise or clutter, estimating the noise level based on the statistical characteristics of the reference subunits, and finally, setting the detection threshold based on the estimated noise level. One of the main advantages of two-dimensional ordered statistical constant false alarm detection is its ability to detect weak target signals in the presence of strong clutter and noise. This is achieved by carefully analyzing the statistical properties of the signal and using an ordered statistical approach to estimate the noise level and adjust the detection threshold. In conclusion, two-dimensional ordered statistical constant false alarm detection is a powerful technique for detecting weak target signals in radar signal processing applications. By dividing the reference cell into several subunits and using an ordered statistical approach to estimate the noise level and adjust the detection threshold, this method is able to effectively suppress the influence of clutter and noise and maintain a low false alarm rate.
Keywords: two-dimensional, ordered statistical, constant false alarm, detection, weak target signals
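The core ordered-statistic thresholding step can be sketched in one dimension (the two-dimensional version in the abstract applies the same idea over a range-Doppler window). The window size, the rank k, and the scale factor below are illustrative choices, not the paper's parameters.

```python
# Sketch of one-dimensional ordered-statistic CFAR (OS-CFAR): for each
# cell under test, the noise level is estimated as the k-th smallest
# training cell (guard cells excluded), and the threshold is that
# estimate times a scale factor.

def os_cfar(signal, guard=2, train=8, k=6, scale=3.0):
    """Return indices of cells declared detections."""
    detections = []
    n = len(signal)
    for i in range(n):
        train_cells = []
        for j in range(i - guard - train, i + guard + train + 1):
            if 0 <= j < n and abs(j - i) > guard:
                train_cells.append(signal[j])
        if len(train_cells) < k:
            continue  # not enough training cells near the edge
        noise = sorted(train_cells)[k - 1]  # the ordered statistic
        if signal[i] > scale * noise:
            detections.append(i)
    return detections

# Flat noise floor of 1.0 with a strong target at index 12.
sig = [1.0] * 25
sig[12] = 10.0
print(os_cfar(sig))
```

Because the estimate is an order statistic rather than a mean, a strong interfering target inside the training window barely perturbs the threshold, which is the clutter-suppression property the abstract emphasizes.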
Procedia PDF Downloads 78
22893 Lexical Semantic Analysis to Support Ontology Modeling of Maintenance Activities – Case Study of Offshore Riser Integrity
Authors: Vahid Ebrahimipour
Abstract:
Word representation and the contextual meaning of text-based documents play an essential role in knowledge modeling. Business procedures written in natural language are meant to store technical and engineering information, management decisions, and operational experience during the production system life cycle. Context meaning representation is highly dependent upon word sense, lexical relativity, and the semantic features of the argument. This paper proposes a method for lexical semantic analysis and context meaning representation of maintenance activity in a mass production system. Our approach constructs a straightforward lexical semantic analysis of the semantic and syntactic features of the context structure of maintenance reports, to facilitate the translation, interpretation, and conversion of a human-readable interpretation into a computer-readable representation with less heterogeneity and ambiguity. The methodology will enable users to obtain a representation format that maximizes shareability and accessibility for multi-purpose usage. It provides a contextualized structure to obtain a generic context model that can be utilized during the system life cycle. First, it employs a co-occurrence-based clustering framework to recognize a group of highly frequent contextual features that correspond to a maintenance report text. Then the keywords are identified for syntactic and semantic extraction analysis. The analysis exercises causality-driven logic over the keywords' senses to divulge the structural and meaning dependency relationships between the words in a context. The output is a word-contextualized representation of maintenance activity accommodating computer-based representation and inference using OWL/RDF.
Keywords: lexical semantic analysis, metadata modeling, contextual meaning extraction, ontology modeling, knowledge representation
Procedia PDF Downloads 105
22892 Predicting Medical Check-Up Patient Re-Coming Using Sequential Pattern Mining and Association Rules
Authors: Rizka Aisha Rahmi Hariadi, Chao Ou-Yang, Han-Cheng Wang, Rajesri Govindaraju
Abstract:
As medical check-ups increase in popularity, a huge number of medical check-up records are stored in databases without being put to use. These data can actually be very useful for future strategic planning if mined correctly. At the same time, many patients arrive unpredictably, and limited available facilities mean that the medical check-up service offered by hospitals is not used to its full capacity. To address this problem, this study used medical check-up data to predict patient re-coming. Sequential pattern mining (SPM) and the association rules method were chosen because these methods are suitable for predicting patient re-coming from sequential data. First, based on patients' personal information, the data was grouped into … groups, and discriminant analysis was performed to check the significance of the grouping. Second, for each group, frequent patterns were generated using the SPM method. Third, based on the frequent patterns of each group, pairs of variables were extracted using association rules to obtain a general pattern of re-coming patients. Finally, a discussion and conclusion are given on the implications of the results.
Keywords: patient re-coming, medical check-up, health examination, data mining, sequential pattern mining, association rules, discriminant analysis
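The association-rule step described above can be sketched with pairwise rules mined by support and confidence. The thresholds and the toy visit records below are invented for illustration and are not the study's dataset or its SPM implementation.

```python
# Sketch: mine pairwise association rules X -> Y from check-up records,
# keeping rules above minimum support and minimum confidence.

from itertools import combinations

def pair_rules(transactions, min_support=0.4, min_confidence=0.6):
    n = len(transactions)
    counts = {}
    for t in transactions:
        for item in set(t):
            counts[frozenset([item])] = counts.get(frozenset([item]), 0) + 1
        for pair in combinations(sorted(set(t)), 2):
            counts[frozenset(pair)] = counts.get(frozenset(pair), 0) + 1
    rules = []
    for key, c in counts.items():
        if len(key) != 2 or c / n < min_support:
            continue
        a, b = sorted(key)
        for x, y in ((a, b), (b, a)):
            conf = c / counts[frozenset([x])]  # P(Y | X)
            if conf >= min_confidence:
                rules.append((x, y, round(c / n, 2), round(conf, 2)))
    return rules

# Each transaction: findings recorded at one hypothetical check-up visit.
visits = [
    ["high_bp", "high_chol", "recheck"],
    ["high_bp", "recheck"],
    ["high_chol"],
    ["high_bp", "high_chol", "recheck"],
    ["normal"],
]
for x, y, sup, conf in pair_rules(visits):
    print(f"{x} -> {y}  support={sup} confidence={conf}")
```

A rule such as high_bp -> recheck with high confidence is the kind of general re-coming pattern the abstract aims to extract per patient group.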
Procedia PDF Downloads 640
22891 Vulnerability Risk Assessment of Non-Engineered Houses Based on Damage Data of the 2009 Padang Earthquake in Padang City, Indonesia
Authors: Rusnardi Rahmat Putra, Junji Kiyono, Aiko Furukawa
Abstract:
Several powerful earthquakes have struck Padang in recent years, one of the largest of which was an M 7.6 event that occurred on September 30, 2009 and caused more than 1,000 casualties. Following the event, we conducted a 12-site microtremor array investigation to obtain a representative determination of the soil conditions of subsurface structures in Padang. From the dispersion curves of the array observations, the central business district of Padang corresponds to relatively soft soil conditions with Vs30 less than 400 m/s. Because only one accelerometer existed, we simulated the 2009 Padang earthquake to obtain the peak ground acceleration for all sites in Padang city. By considering the damage data of the 2009 Padang earthquake, we produced seismic risk vulnerability estimations of non-engineered houses for rock, medium, and soft soil conditions. We estimated the loss ratio based on the ground response, the seismic hazard of Padang, and the damage data for non-engineered houses from the 2009 Padang earthquake, for several return periods of earthquake events.
Keywords: profile, Padang earthquake, microtremor array, seismic vulnerability
Procedia PDF Downloads 410
22890 Impact of Exogenous Risk Factors into Actual Construction Price in PPP Projects
Authors: Saleh Alzahrani, Halim Boussabaine
Abstract:
Many Public Private Partnership (PPP) projects are developed on the basis that a public project is to be awarded to a private party within a single contractual framework. PPP project risks typically include the development and construction of a new asset as well as its operation. Certainly the most severe consequences of risks during the construction period are price and time overruns. These events are among the situations most commonly considered in value-for-money risk analysis. The sources of risk change over time in a PPP project. In traditional procurement, the public sector usually has to cover all costs arising from these risks; indeed, there is plenty to suggest that cost overrun is the norm in some of the projects delivered under traditional procurement. This paper examines the impact of exogenous risk factors on the actual construction price in PPP projects. It presents a brief literature review of PPP risk pricing strategies and then uses system dynamics (SD) to analyse the risks associated with the estimated project price. Based on the findings from these analyses, a risk pricing association model is presented and discussed. The paper concludes with thoughts for future research.
Keywords: public private partnership (PPP), risk, risk pricing, system dynamics (SD)
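The system-dynamics style of analysis mentioned above can be sketched with a minimal stock-and-flow simulation. The escalation rate, horizon, and base price below are invented assumptions for illustration and bear no relation to the paper's model.

```python
# Illustrative system-dynamics-style sketch: a time-stepped simulation of
# how an exogenous risk factor escalates the construction price over the
# build period (price is the stock; risk-driven escalation is the flow).

def simulate_price(base_price, risk_rate, months):
    """Return the month-by-month price trajectory."""
    price = base_price
    history = [price]
    for _ in range(months):
        price += price * risk_rate  # flow: monthly risk-driven escalation
        history.append(price)
    return history

trajectory = simulate_price(base_price=100.0, risk_rate=0.01, months=24)
print(f"estimated final price: {trajectory[-1]:.1f}")
```

A full SD model would add feedback loops (e.g., delays feeding back into cost), which is where the approach earns its keep over a simple compounding formula.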
Procedia PDF Downloads 557
22889 Carbonation and Mechanical Performance of Reactive Magnesia Based Formulations
Authors: Cise Unluer
Abstract:
Reactive MgO hydrates to form brucite (Mg(OH)2, magnesium hydroxide), which can then react with CO2 and additional water to form a range of strength-providing hydrated magnesium carbonates (HMCs) within cement-based formulations. The presented work focuses on the use of reactive MgO in a range of concrete mixes, where it carbonates by absorbing CO2 and gains strength accordingly. The main goal involves maximizing the amount of CO2 absorbed within construction products, thereby reducing the overall environmental impact of the designed formulations. Microstructural analyses including scanning electron microscopy (SEM), X-ray diffraction (XRD) and thermogravimetry/differential thermal analysis (TG/DTA) are used in addition to porosity, permeability and unconfined compressive strength (UCS) testing to understand the performance mechanisms. XRD Reference Intensity Ratio (RIR), acid digestion and TG/DTA are utilized to quantify the amount of CO2 sequestered, with the goal of achieving 100% carbonation through careful mix design, leading to a range of carbon-neutral products with high strengths. As a result, samples stronger than those containing Portland cement (PC) were produced, revealing the link between the mechanical performance and microstructural development of the developed formulations and the amount of CO2 sequestered.
Keywords: carbonation, compressive strength, reactive MgO cement, sustainability
Procedia PDF Downloads 180
22888 Feasibility of Implementing Zero Energy Buildings in Iran and Examining Its Economic and Technical Aspects
Authors: Maryam Siyami
Abstract:
Zero energy buildings refer to buildings that have zero net annual energy consumption and do not produce carbon emissions. In today's world, considering the limited resources of fossil fuels, buildings, industries, and other organizations have moved towards using other available sources of energy. The idea and principle of net zero energy consumption have attracted a lot of attention because the use of renewable energy is a means of eliminating pollutants and greenhouse gases. Due to the increase in the cost of fossil fuels and their destructive effects on the environment and the ecological balance, plans based on zero energy principles have today become very practical and have gained particular popularity. In this research, the building was modeled in the DesignBuilder software environment. Based on the changes in the required energy throughout the year under different roof thickness conditions, it was observed that with increasing roof thickness, the amount of heating energy required trends downward, from 6730 kilowatt-hours at a roof thickness of 10 cm to 6408 kilowatt-hours at a roof thickness of 20 cm, which represents a reduction of about 4.7% in energy when the roof thickness is doubled. Also, with increasing roof thickness, the amount of cooling energy required throughout the year shows a gentle downward trend, falling from 4964 kilowatt-hours at a roof thickness of 10 cm to 4859 kilowatt-hours at a roof thickness of 20 cm, a decrease of about 2%. It can be seen that the energy required for cooling and heating is not much affected by the thickness of the roof (the demand remains at about 98% of its original value), and therefore there is no technical or economic recommendation to increase the roof thickness in this sector. 
Finally, based on the changes in carbon dioxide production under different roof thickness conditions, it was observed that with increasing roof thickness, energy consumption, and consequently carbon dioxide production, decreased. By increasing the thickness of the roof from 10 cm to 20 cm, the amount of carbon dioxide produced by heating the building decreased by 27%; the corresponding reduction for the cooling system over the same range of roof thicknesses was 19%.
Keywords: energy consumption, green building, design builder, AHP
Procedia PDF Downloads 25
22887 Implementing a Neural Network on a Low-Power and Mobile Cluster to Aid Drivers with Predictive AI for Traffic Behavior
Authors: Christopher Lama, Alix Rieser, Aleksandra Molchanova, Charles Thangaraj
Abstract:
New technologies like Tesla's Dojo have made high-performance embedded computing more available. Although automobile computing has developed and benefited enormously from these recent technologies, the costs are still high, prohibitively so in some cases for broader adoption, particularly in the after-market and enthusiast markets. This project aims to implement a Raspberry Pi-based low-power (under one hundred Watts), highly mobile computing cluster for a neural network. The computing cluster, built from off-the-shelf components, is more affordable and therefore makes wider adoption possible. The paper describes the design of the neural network, the Raspberry Pi-based cluster, and the applications the cluster will run. The neural network will use input data from sensors and cameras to project a live view of the road state as the user drives. The neural network will be trained to predict traffic behavior and generate warnings when potentially dangerous situations are predicted. The significant outcomes of this study will be twofold: first, to implement and test the low-cost cluster, and second, to ascertain the effectiveness of the predictive AI implemented on the cluster.
Keywords: CS pedagogy, student research, cluster computing, machine learning
Procedia PDF Downloads 102
22886 Modeling Usage Patterns of Mobile App Service in App Market Using Hidden Markov Model
Authors: Yangrae Cho, Jinseok Kim, Yongtae Park
Abstract:
The mobile app service ecosystem has emerged abruptly, grown explosively, and transformed dynamically. In contrast with product markets, in which product sales directly increase a firm’s income, customer usage in service markets is less visible but more valuable. In particular, the cutthroat competition in mobile app stores makes securing and retaining users vital. Although a few service firms try to manage their apps’ usage patterns by fitting S-curves or applying other forecasting techniques, time series approaches based on past sequential data face a fundamental limitation in a market where customer attention moves unpredictably and dynamically. We therefore propose a new conceptual approach for detecting usage patterns of mobile app services with the Hidden Markov Model (HMM), which is based on a dual stochastic structure and is mainly used to clarify unpredictable and dynamic sequential patterns in voice recognition or stock forecasting. Our approach could be practically utilized by app service firms to manage their services’ lifecycles and academically extended to other markets.
Keywords: mobile app service, usage pattern, Hidden Markov Model, pattern detection
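The core HMM computation behind such pattern detection, the forward algorithm for the likelihood of an observation sequence, can be sketched in plain Python; the two hidden usage states and all probabilities below are illustrative assumptions, not estimates from app-market data:

```python
def forward_likelihood(obs, pi, A, B):
    """P(observation sequence) for a discrete HMM.
    pi: initial state probabilities, A: state transition matrix,
    B: per-state emission probabilities."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]  # alpha_1(i)
    for o in obs[1:]:
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][o]
                 for i in range(n)]
    return sum(alpha)

# Hypothetical "engaged"/"churning" hidden states;
# observations: 0 = active day, 1 = idle day.
p = forward_likelihood([0, 1, 0],
                       pi=[0.6, 0.4],
                       A=[[0.7, 0.3], [0.4, 0.6]],
                       B=[[0.9, 0.1], [0.2, 0.8]])
```

Fitting such a model to real usage sequences (e.g. via Baum-Welch) would let a firm compare likelihoods across lifecycle-stage models rather than extrapolate an S-curve.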
Procedia PDF Downloads 337
22885 Economic Analysis of Interaction Freedom, Institutions and Development in the Countries of North Africa: Amartya Sen Approach of Capability
Authors: Essardi Omar, Razzouk Redouane
Abstract:
The concept of freedom has obliged countries all over the world to take notice of welfare and quality of life. Despite many efforts in the development literature, economists have often failed to incorporate the ideas of freedom and rights into their theoretical and empirical work. Amartya Sen’s capability approach and related research, however, provide a basis for moving forward in the theory and measurement of development. Using an approach based on correlation and data analysis, in particular the tool of principal component analysis, we study assessments by the World Bank, Freedom House, the Fraser Institute, and MINEFE experts. Our empirical objective is to reveal the institutional and freedom characteristics related to the development of emerging countries, in order to help explain the recent performance of Central and Eastern Europe and Latin America compared with the countries of North Africa. To do this, we first build indicators based on the liberties/institutions dilemma. Second, we introduce institutional and freedom variables to compare freedom, quality of institutions, and development across the countries observed.
Keywords: freedoms, institutions, development, approach of capability, principal component analysis
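For two indicators, the share of variance captured by the first principal component has a closed form, which makes the logic of principal component analysis easy to illustrate; the data below are invented, not the World Bank or Freedom House series used in the paper:

```python
import math

def first_pc_share(xs, ys):
    """Fraction of total variance explained by the first principal
    component of two variables (closed form for a 2x2 covariance matrix)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    var_x = sum((x - mx) ** 2 for x in xs) / n
    var_y = sum((y - my) ** 2 for y in ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    half_gap = math.sqrt(((var_x - var_y) / 2) ** 2 + cov ** 2)
    lam1 = (var_x + var_y) / 2 + half_gap  # largest eigenvalue
    return lam1 / (var_x + var_y)
```

Two perfectly correlated indicators collapse onto a single component (share 1.0), while two uncorrelated, equal-variance indicators split evenly (share 0.5), which is what lets PCA compress many freedom and institutional indices into a few composite dimensions.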
Procedia PDF Downloads 429
22884 Blockchain-Based Decentralized Architecture for Secure Medical Records Management
Authors: Saeed M. Alshahrani
Abstract:
This research integrated blockchain technology to reform medical records management in healthcare informatics. It aimed to resolve the limitations of centralized systems by establishing a secure, decentralized, and user-centric platform. The system was architected with a three-tiered structure, integrating advanced cryptographic methodologies, consensus algorithms, and the Fast Healthcare Interoperability Resources (HL7 FHIR) standard to ensure data security, transaction validity, and semantic interoperability. The research has profound implications for healthcare delivery, patient care, legal compliance, operational efficiency, and academic advancement in the blockchain and healthcare IT sectors. The methodology adopted in this research comprises a preliminary feasibility study, a literature review, design and development, cryptographic algorithm integration, data modeling, and system testing. The research employed a permissioned blockchain with a Practical Byzantine Fault Tolerance (PBFT) consensus algorithm and Ethereum-based smart contracts. It integrated advanced cryptographic algorithms, role-based access control, multi-factor authentication, and RESTful APIs to ensure security, regulate access, authenticate user identities, and facilitate seamless data exchange between the blockchain and legacy healthcare systems. The research contributed a secure, interoperable, and decentralized system for managing medical records, addressing the limitations of the centralized systems previously in place. Future work will optimize the system further, explore additional blockchain use cases in healthcare, and expand the system’s adoption globally, contributing to the evolution of global healthcare practices and policies.
Keywords: healthcare informatics, blockchain, medical records management, decentralized architecture, data security, cryptographic algorithms
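The tamper-evidence property that motivates a blockchain-backed record store can be illustrated with a minimal hash-chained ledger in plain Python. This is only a sketch of the underlying idea, not the paper’s permissioned PBFT/Ethereum implementation, and the record field names are hypothetical:

```python
import hashlib
import json

def block_hash(record, prev_hash):
    # Deterministic serialization so the same record always hashes the same.
    body = json.dumps({"record": record, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def append_block(chain, record):
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev_hash": prev,
                  "hash": block_hash(record, prev)})

def verify_chain(chain):
    """Recompute every hash and link; any edited record breaks the chain."""
    for i, blk in enumerate(chain):
        if blk["hash"] != block_hash(blk["record"], blk["prev_hash"]):
            return False
        if i > 0 and blk["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True
```

Appending two records and then editing the first makes `verify_chain` return False; a consensus algorithm such as PBFT extends this local tamper-evidence into network-wide agreement on the valid chain.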
Procedia PDF Downloads 55
22883 Metamorphic Computer Virus Classification Using Hidden Markov Model
Authors: Babak Bashari Rad
Abstract:
A metamorphic computer virus uses different code transformation techniques to mutate its body across duplicated instances. The characteristics and function of new instances are mostly similar to their parents’, but they cannot easily be detected by the majority of antivirus products on the market, as these depend on string signature-based detection techniques. The purpose of this research is to propose a Hidden Markov Model for the classification of metamorphic viruses in executable files. In the proposed solution, portable executable files are inspected to extract the instruction opcodes needed for the examination of the code. A Hidden Markov Model trained on portable executable files is employed to classify metamorphic viruses of the same family. The proposed model is able to generate and recognize common statistical features of mutated code. The model has been evaluated on a test data set, and its performance has been assessed in terms of false positive rate, detection rate, and overall accuracy. The results showed acceptable performance, with a high average detection rate of 99.7%.
Keywords: malware classification, computer virus classification, metamorphic virus, metamorphic malware, Hidden Markov Model
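The final classification step, scoring a file’s opcode sequence against a trained family model and thresholding, can be sketched as follows. For brevity, a unigram opcode model stands in for the trained HMM, and the opcode frequencies and threshold are invented:

```python
import math

def family_score(opcodes, opcode_probs, floor=1e-6):
    """Average log-probability per opcode under a (stand-in) family model;
    unseen opcodes get a small floor probability."""
    return sum(math.log(opcode_probs.get(op, floor)) for op in opcodes) / len(opcodes)

def is_family_member(opcodes, opcode_probs, threshold):
    # Files scoring above the threshold are classified into the family.
    return family_score(opcodes, opcode_probs) >= threshold

# Hypothetical trained frequencies for one metamorphic family.
probs = {"mov": 0.5, "push": 0.3, "call": 0.2}
```

With a real HMM, `family_score` would be the forward-algorithm log-likelihood normalized by sequence length, and the threshold would be calibrated on held-out benign and family samples to balance detection rate against false positives.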
Procedia PDF Downloads 315
22882 Bronchoscopy and Genexpert in the Diagnosis of Pulmonary Tuberculosis in the Indian Private Health Sector: A Short Case Series
Authors: J. J. Mathew
Abstract:
Pulmonary tuberculosis is highly prevalent in the Indian subcontinent. Most cases of pulmonary tuberculosis are diagnosed by sputum examination, and the vast majority of these are undertaken by government-run establishments. However, mycobacterial cultures are not routinely done unless drug resistance is suspected based on clinical response. Modern diagnostic tests like bronchoscopy and Genexpert are not routinely employed in government institutions for the diagnosis of pulmonary tuberculosis but have been widely accepted by good private institutions. The utility of these investigations in the private sector is not yet well recognized. This retrospective study aims to assess the usefulness of bronchoscopy and Genexpert in the diagnosis of pulmonary tuberculosis in a quaternary-care private hospital in India. 30 patients with respiratory symptoms raising the possibility of tuberculosis on clinical and radiological grounds, but without significant sputum production, were subjected to bronchoscopy, and BAL samples were taken for microbiological studies, including Genexpert. 6 of the 30 patients were found to be Genexpert positive, and none showed rifampicin resistance. All 6 cases had upper-zone-predominant disease. One of the 6 tuberculosis cases had a co-existent bacterial infection according to routine culture studies. 6 other cases were proven to be due to other bacterial infections alone, 2 had a malignant diagnosis, and the remaining cases were thought to be non-infective pathologies. The Genexpert results were made available within 48 hours in the 6 positive cases. All of them were commenced on a standard anti-tuberculous regimen with excellent clinical response. The other infective cases were also managed successfully based on drug susceptibilities. The study has shown the usefulness of these investigations, as early diagnosis facilitated treatment and prevented clinical deterioration.
The study lends support to early bronchoscopy and Genexpert testing in suspected cases of pulmonary tuberculosis without significant sputum production, in a high-prevalence country that normally relies on sputum examination for the diagnosis of pulmonary tuberculosis.
Keywords: pulmonary, tuberculosis, bronchoscopy, genexpert
Procedia PDF Downloads 245
22881 Criterion-Referenced Test Reliability through Threshold Loss Agreement: Fuzzy Logic Analysis Approach
Authors: Mohammad Ali Alavidoost, Hossein Bozorgian
Abstract:
Criterion-referenced tests (CRTs) are designed to measure student performance against a fixed set of predetermined criteria or learning standards. The reliability of such tests cannot be estimated through internal-consistency measures. Threshold loss agreement is one way to calculate the reliability of CRTs; however, the classification of masters and non-masters in such agreement is determined by the threshold point. The problem is that if the threshold point shifts even slightly, the classification of masters and non-masters may change drastically, altering the reliability results. Therefore, in this study, a fuzzy logic approach is employed as a remedial procedure for data analysis to obviate the threshold point problem. Forty-one Iranian students were selected; the participants were all between 20 and 30 years old. A quantitative approach was used to address the research questions, within a quasi-experimental design, since the selection of participants was not randomized. With the fuzzy logic approach, the threshold point is more stable during the analysis, resulting in more consistent reliability results and more precise assessment.
Keywords: criterion-referenced tests, threshold loss agreement, threshold point, fuzzy logic approach
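The remedial idea can be illustrated with a simple fuzzy membership function: instead of a crisp master/non-master cut, scores inside a band around the threshold receive graded degrees of mastery. The trapezoidal shape, cut score, and band width below are illustrative assumptions, not the authors’ exact formulation:

```python
def mastery_degree(score, cut, band):
    """Degree of mastery in [0, 1]: crisp outside cut +/- band,
    linearly graded inside it."""
    if score <= cut - band:
        return 0.0
    if score >= cut + band:
        return 1.0
    return (score - (cut - band)) / (2.0 * band)
```

A one-point shift of the cut score now moves membership degrees slightly instead of flipping whole classifications, which is what stabilizes the agreement-based reliability estimate.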
Procedia PDF Downloads 369
22880 Performance of Derna Steam Power Plant at Varying Super-Heater Operating Conditions Based on Exergy
Authors: Idris Elfeituri
Abstract:
In the current study, an energy and exergy analysis of a 65 MW steam power plant was carried out. The study investigated the effect of variations in the overall conductance of the super-heater on the performance of an existing steam power plant located in Derna, Libya. The performance of the power plant was estimated by a mathematical model that considers the off-design operating conditions of each component. A fully interactive computer program based on the mass, energy, and exergy balance equations has been developed. The maximum exergy destruction was found in the steam generation unit. A 50% reduction in the design value of the overall conductance of the super-heater decreases the net electrical power generated by at least 13 MW and the overall plant exergy efficiency by at least 6.4%, while increasing the total exergy destruction by at least 14 MW. These results show that the super-heater design and operating conditions play an important role in the thermodynamic performance and fuel utilization of the power plant. Moreover, these considerations are very useful when deciding whether to replace or renovate the super-heater of the power plant.
Keywords: exergy, super-heater, fouling, steam power plant, off-design
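The exergy bookkeeping behind such an analysis rests on the specific flow exergy and a component-level exergy balance; a minimal sketch follows, with illustrative steam-table values rather than the plant’s actual data:

```python
def specific_flow_exergy(h, s, h0, s0, t0):
    """psi = (h - h0) - T0 * (s - s0), in kJ/kg, neglecting kinetic and
    potential terms (dead state h0, s0 at environment temperature T0 in K)."""
    return (h - h0) - t0 * (s - s0)

def exergy_destruction(m_dot, psi_in, psi_out, w_dot_out=0.0):
    """Steady-state exergy balance for an adiabatic component, kW:
    destruction = exergy in - exergy out - work delivered."""
    return m_dot * (psi_in - psi_out) - w_dot_out

# Illustrative superheated-steam state against a 298 K dead state.
psi = specific_flow_exergy(h=3000.0, s=6.5, h0=100.0, s0=0.4, t0=298.0)
```

Summing such destruction terms over boiler, super-heater, turbine, and condenser is what locates the largest loss (here, the steam generation unit) and quantifies how a degraded super-heater conductance propagates into lost net power.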
Procedia PDF Downloads 333
22879 Prospects for the Development of e-Commerce in Georgia
Authors: Nino Damenia
Abstract:
E-commerce opens a new horizon for business development, which is why its presence is a necessary condition for the formation, growth, and development of a country's economy. Worldwide, e-commerce turnover grows at a high rate every year, as the electronic environment provides great opportunities for product promotion. E-commerce in Georgia is developing at a fast pace but is still a relatively young direction in the country's economy. Movement restrictions and other public health measures caused by the COVID-19 pandemic reduced economic activity in most economic sectors and countries, significantly affecting production, distribution, and consumption. The pandemic accelerated digital transformation: digital solutions enable people and businesses to continue part of their economic and social activities remotely, which has also driven the growth of e-commerce. According to the National Statistics Service of Georgia, the share of online trade is higher in cities (27.4%) than in rural areas (9.1%). The COVID-19 pandemic forced local businesses to expand their digital offerings: the local market grew 3.2 times in 2020, to 138 million GEL, and between 2018 and 2020 the share of local e-commerce increased from 11% to 23%. In Georgia, the state is actively engaged in promoting activities based on information technologies. Many measures have been taken for this purpose, but compared to other countries, the process is slow. The purpose of the study is to determine development prospects for the economy of Georgia based on an analysis of electronic commerce. The research draws on articles and works by Georgian and foreign scholars, reports of international organizations, proceedings of scientific conferences, and scientific electronic databases.
The empirical base of the research comprises the data and annual reports of the National Statistical Service of Georgia, internet resources of world statistical materials, and others. While working on the article, a questionnaire was developed, on the basis of which an electronic survey of selected respondents was conducted. The survey examined how intensively Georgian citizens use online shopping, including which age categories use electronic commerce, for what purposes, and how satisfied they are. Various theoretical and methodological research tools, as well as analysis, synthesis, comparison, and other methods, are used to achieve the stated goal. The research results and recommendations will contribute to the development of e-commerce in Georgia and to economic growth based on it.
Keywords: e-commerce, information technology, pandemic, digital transformation
Procedia PDF Downloads 75
22878 Yield and Sward Composition Responses of Natural Grasslands to Treatments Meeting Sustainability
Authors: D. Díaz Fernández, I. Csízi, K. Pető, G. Nagy
Abstract:
An outstanding share of animal products is based on grasslands, since grassland ecosystems can be found all over the globe. In places where economical and successful crop production cannot be managed, grassland-based animal husbandry can be an efficient way of producing food. In addition, these ecosystems play an important role in carbon sequestration and, through their rich flora and the fauna connected to it, in the conservation of biodiversity. Nature protection and sustainable agriculture are receiving more and more attention in the European Union, but, given consumers' needs, the production of healthy food cannot be neglected either. For these reasons, the effects on grass yields and sward composition of two specific composts, officially authorized in organic farming, Agri-environment Schemes, and Natura 2000 programs, were investigated in a field trial. The investigation took place in Hungary, on a natural grassland on solonetz soil. Three rates of compost (10 t/ha, 20 t/ha, 30 t/ha) were tested on 3 m x 10 m experimental plots. Every treatment had four replications, and each type of compost also had four control plots, so that 32 experimental plots were included in the investigation. The pasture was harvested twice (in May and in September), and before cutting the plots, measurements of botanical composition were made. Samples for laboratory analysis were also taken. The dry matter yield of the pasture responded positively to the compost rates. The increase in dry matter yield was partly due to positive changes in sward composition: the proportion of grass species with higher yield potential increased in the ground cover of the sward without displacing valuable native species of diverse natural grasslands.
The research results indicate that the use of organic compost can be an efficient way to increase grass yields sustainably.
Keywords: compost application, dry matter yield, native grassland, sward composition
Procedia PDF Downloads 249
22877 Development of a Highly Flexible, Sensitive and Stretchable Polymer Nanocomposite for Strain Sensing
Authors: Shaghayegh Shajari, Mehdi Mahmoodi, Mahmood Rajabian, Uttandaraman Sundararaj, Les J. Sudak
Abstract:
Although several strain sensors based on carbon nanotubes (CNTs) have been reported, the stretchability and sensitivity of these sensors remain a challenge. Highly stretchable and sensitive strain sensors are in great demand for human motion monitoring and human-machine interfaces. This paper reports the fabrication and characterization of a new type of strain sensor based on a stretchable fluoropolymer/CNT nanocomposite system made via a melt-mixing technique. Electrical and mechanical characterizations were performed. The results showed that this nanocomposite sensor has high stretchability, up to 280% strain, at an optimum filler concentration. The piezoresistive properties and strain sensing mechanism of the sensor were investigated using Electrochemical Impedance Spectroscopy (EIS). High sensitivity was obtained (a gauge factor as large as 12000 under 120% applied strain), in particular at concentrations above the percolation threshold. Due to the tunneling effect, non-linear piezoresistivity was observed at high CNT loadings. Nanocomposites with good conductivity and light weight could be a promising candidate for strain sensing applications.
Keywords: carbon nanotubes, fluoropolymer, piezoresistive, strain sensor
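The reported sensitivity figure follows directly from the definition of the gauge factor, GF = (ΔR/R0)/ε; a one-line helper makes the arithmetic explicit (the resistance values below are invented for illustration, not measurements from this sensor):

```python
def gauge_factor(r0, r, strain):
    """GF = (delta R / R0) / strain, with strain as a fraction (1.2 = 120%)."""
    return ((r - r0) / r0) / strain
```

For instance, a hypothetical sensor whose resistance rises from 100 Ω to 1540 Ω at 120% strain has GF = 14.4 / 1.2 = 12; the paper's reported GF of 12000 corresponds to a relative resistance change three orders of magnitude larger, consistent with tunneling-dominated conduction near the percolation threshold.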
Procedia PDF Downloads 296
22876 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events
Authors: Jaqueline Maria Ribeiro Vieira
Abstract:
Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of captured borehole images. While this strategy may be feasible and convenient with small images and few data, it becomes difficult and error-prone when large databases of images must be processed, and the patterns may differ across image areas, depending on many characteristics (drilling strategy, rock components, rock strength, etc.). In previous work, we developed and proposed a novel strategy, based on segmented images, capable of detecting patterns in borehole images that may point to regions with tension and breakout characteristics. In this work, we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. With these classifiers, after a period of use in which parts of borehole images corresponding to tension regions and breakout areas are manually labeled, the system will automatically indicate and suggest new candidate regions with higher accuracy. We suggest the use of different classification methods in order to achieve different knowledge data set configurations.
Keywords: image segmentation, oil well visualization, classifiers, data-mining, visual computing
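The abstract does not name a specific classifier; as a hedged sketch of how manually labeled borehole regions could seed automatic suggestions, a k-nearest-neighbors vote over hand-labeled feature vectors works as follows (the feature values and labels are invented):

```python
def knn_predict(labeled, query, k=3):
    """labeled: list of (feature_vector, label) pairs from manual labeling;
    returns the majority label among the k nearest by squared distance."""
    by_dist = sorted(labeled,
                     key=lambda item: sum((a - b) ** 2
                                          for a, b in zip(item[0], query)))
    votes = [label for _, label in by_dist[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical 2-D region features (e.g. curvature, darkness) with analyst labels.
train = [((0.0, 0.0), "breakout"), ((0.0, 1.0), "breakout"),
         ((1.0, 0.0), "breakout"), ((5.0, 5.0), "tension"),
         ((5.0, 6.0), "tension")]
```

Each newly accepted or corrected suggestion can simply be appended to `labeled`, which is the incremental knowledge-database behavior the abstract describes; other classifier families would trade this simplicity for faster prediction on large image databases.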
Procedia PDF Downloads 303
22875 A Computational Analysis of Gas Jet Flow Effects on Liquid Aspiration in the Collison Nebulizer
Authors: James Q. Feng
Abstract:
Pneumatic nebulizers (variations on the Collison nebulizer) have been widely used to produce fine aerosol droplets from a liquid material. As qualitatively described by many authors, the basic working principle of these nebulizers involves utilizing the negative pressure associated with an expanding gas jet to siphon liquid into the jet stream, which then blows and shears it into liquid sheets, filaments, and eventually droplets. However, detailed quantitative analysis based on fluid mechanics theory has been lacking in the literature. The purpose of the present work is to investigate the nature of the negative pressure distribution associated with compressible gas jet flow in the Collison nebulizer through a computational fluid dynamics (CFD) analysis, using an OpenFOAM® compressible flow solver. The value of the negative pressure associated with the gas jet flow is examined by varying geometric parameters of the jet expansion channel adjacent to the jet orifice outlet. Such an analysis can provide valuable insights into fundamental mechanisms of the liquid aspiration process, helpful for the effective design of the pneumatic atomizer in the Aerosol Jet® direct-write system for micro-feature, high-aspect-ratio material deposition in additive manufacturing.
Keywords: collison nebulizer, compressible gas jet flow, liquid aspiration, pneumatic atomization
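Although the paper’s analysis is fully compressible, the order of magnitude of the jet’s negative pressure can be anticipated from the incompressible Bernoulli relation Δp ≈ ½ρv²; the sketch below uses illustrative air-jet values and is only a first approximation to what the CFD computes:

```python
def bernoulli_pressure_drop(rho, v):
    """Incompressible estimate of the static-pressure drop (Pa) in a gas
    stream of density rho (kg/m^3) moving at speed v (m/s)."""
    return 0.5 * rho * v ** 2

# Illustrative: air (1.2 kg/m^3) at 100 m/s gives ~6 kPa of suction,
# the negative pressure available to siphon liquid into the jet.
dp = bernoulli_pressure_drop(1.2, 100.0)
```

For the near-sonic, compressible jets of a real Collison nebulizer this estimate breaks down, which is precisely why the paper resorts to an OpenFOAM® compressible solver to resolve the actual pressure distribution in the expansion channel.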
Procedia PDF Downloads 180
22874 Testing Causal Model of Depression Based on the Components of Subscales Lifestyle with Mediation of Social Health
Authors: Abdolamir Gatezadeh, Jamal Daghaleh
Abstract:
The lifestyle of individuals is an important determinant of psychological and social health. Recently, especially in developed countries, the relationship between lifestyle and mental illnesses, including depression, has attracted widespread attention. This study tests a causal model of depression based on lifestyle, with social health as a mediator. Methods: This study is basic research within the framework of correlational designs, with data collected through a descriptive field survey. The population includes all adults in Ahwaz city, from which a multistage random sample of 384 subjects was selected. The data were collected and analyzed using structural equation modeling. Results: In the data analysis, path analysis confirmed the fit of the assumed research model: lifestyle subscales have a direct effect on depression and, through the mediation of social health, an indirect effect as well. Discussion and conclusion: According to the results, depression can be explained by the components of lifestyle and social health.
Keywords: depression, subscales lifestyle, social health, causal model
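In such a path-analytic model, the mediated (indirect) effect is the product of the two constituent paths, and the total effect adds the direct path; a tiny helper makes this explicit, with hypothetical standardized coefficients (the abstract does not report these values):

```python
def mediation_effects(a, b, c_prime):
    """a: lifestyle -> social health path, b: social health -> depression path,
    c_prime: direct lifestyle -> depression path.
    Returns (indirect effect a*b, total effect c_prime + a*b)."""
    indirect = a * b
    total = c_prime + indirect
    return indirect, total

# Hypothetical standardized coefficients: a healthier lifestyle raises social
# health (a > 0), which in turn lowers depression (b < 0).
indirect, total = mediation_effects(a=0.5, b=-0.4, c_prime=-0.3)
```

Structural equation modeling estimates these paths simultaneously and then tests whether the indirect product term differs from zero, which is the formal content of the mediation claim in the model.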
Procedia PDF Downloads 163