Search results for: decentralized data management
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30748

26548 Information Communication Technology Based Road Traffic Accidents’ Identification, and Related Smart Solution Utilizing Big Data

Authors: Ghulam Haider Haidaree, Nsenda Lukumwena

Abstract:

Today the world of research enjoys abundant data, available in virtually any field: technology, science, business, politics, etc. This is commonly referred to as big data. It offers a great deal of precision and accuracy, supporting an in-depth look at any decision-making process, and when well used, big data affords its users the opportunity to produce substantially well-supported results. This paper leans extensively on big data to investigate possible smart solutions to urban mobility and related issues, namely road traffic accidents, their casualties, and fatalities, based on multiple factors including age, gender, and the location of accident occurrences. Multiple technologies were used in combination to produce an Information Communication Technology (ICT) based solution with embedded technology. These technologies principally include Geographic Information System (GIS), the Orange data mining software, and Bayesian statistics, to name a few. The study uses the Leeds 2016 accident dataset to illustrate the thinking process and extracts from it a model that can be tested, evaluated, and replicated. The authors optimistically believe that the proposed model will significantly and smartly help to flatten the curve of road traffic accidents in fast-growing population densities, which considerably increase motor-based mobility.
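
As a rough illustration of the kind of factor analysis described, the sketch below groups a Leeds-style accident table by factor and attaches a simple Bayesian (Beta-Binomial) estimate of the serious-accident rate per group. The file name and column names (age_band, gender, severity) are hypothetical placeholders, not fields from the actual Leeds 2016 export.

```python
# Minimal sketch of a Bayesian look at accident severity by factor.
# Column names (age_band, gender, severity) are hypothetical placeholders
# for whatever the Leeds 2016 accident export actually provides.
import pandas as pd
from scipy import stats

accidents = pd.read_csv("leeds_accidents_2016.csv")  # hypothetical file

for factor in ["age_band", "gender"]:
    counts = accidents.groupby(factor)["severity"].agg(
        serious=lambda s: (s == "serious").sum(), total="count"
    )
    # Beta(1, 1) prior on the proportion of serious accidents per group
    counts["posterior_mean"] = (counts["serious"] + 1) / (counts["total"] + 2)
    counts["ci_low"], counts["ci_high"] = stats.beta.interval(
        0.95, counts["serious"] + 1, counts["total"] - counts["serious"] + 1
    )
    print(counts)
```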

Keywords: accident factors, geographic information system, information communication technology, mobility

Procedia PDF Downloads 204
26547 Mindfulness and Mental Resilience Training for Pilots: Enhancing Cognitive Performance and Stress Management

Authors: Nargiza Nuralieva

Abstract:

The study assesses the influence of mindfulness and mental resilience training on the cognitive performance and stress management of pilots. Employing a meticulous literature search across databases such as Medline and Google Scholar, the study used specific keywords to target a wide array of studies. Inclusion criteria were stringent, focusing on peer-reviewed studies in English that utilized designs such as randomized controlled trials, with a specific interest in interventions related to mindfulness or mental resilience training for pilots and measured outcomes pertaining to cognitive performance and stress management. The initial literature search identified a pool of 123 articles, with subsequent screening resulting in the exclusion of 77 based on title and abstract. The remaining 54 articles underwent a more rigorous full-text screening, leading to the exclusion of 41. Additionally, five studies were selected from the World Health Organization's clinical trials database. A total of 11 articles from meta-analyses were retained for examination, underscoring the study's dedication to a meticulous and robust inclusion process. The interventions varied widely, incorporating mixed approaches, Cognitive Behavioral Therapy (CBT)-based techniques, and mindfulness-based techniques. The analysis uncovered positive effects across these interventions. Specifically, mixed interventions demonstrated a Standardized Mean Difference (SMD) of 0.54, CBT-based interventions showed an SMD of 0.29, and mindfulness-based interventions exhibited an SMD of 0.43. Long-term effects at a 6-month follow-up suggested sustained impacts for both mindfulness-based (SMD: 0.63) and CBT-based interventions (SMD: 0.73), albeit with notable heterogeneity.
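
For reference, the Standardized Mean Difference values quoted above follow the usual pooled-standard-deviation definition; the minimal sketch below computes it from summary statistics. The group means, SDs, and sample sizes are invented for illustration and are not taken from the reviewed trials.

```python
# Illustrative SMD (Cohen's d with pooled SD) from summary statistics.
# The group means/SDs/sizes below are invented for illustration only.
import math

def smd(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# e.g., a mindfulness group vs. a control group on a stress scale
print(round(smd(mean_t=22.1, sd_t=5.0, n_t=40, mean_c=24.8, sd_c=5.4, n_c=38), 2))
```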

Keywords: mindfulness, mental resilience, pilots, cognitive performance, stress management

Procedia PDF Downloads 50
26546 Competition between Regression Technique and Statistical Learning Models for Predicting Credit Risk Management

Authors: Chokri Slim

Abstract:

The objective of this research is to answer the following question: Is there a significant difference between regression models and statistical learning models in predicting credit risk? A Multiple Linear Regression (MLR) model was compared with statistical learning models, namely a Multi-Layer Perceptron (MLP) neural network and Support Vector Regression (SVR). The population of this study includes 50 banks listed on the Tunis Stock Exchange (TSE) from 2000 to 2016. Firstly, we show the factors that have a significant effect on the quality of the loan portfolios of banks in Tunisia. Secondly, we attempt to establish that the systematic use of objective techniques and methods designed to apprehend and assess risk when considering applications for granting credit has a positive effect on the quality of banks' loan portfolios and their future collectability. Finally, we try to show that bank governance has an impact on the choice of methods and techniques for analyzing and measuring the risks inherent in the banking business, including the risk of non-repayment. The results of the empirical tests confirm our claims.
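
A minimal sketch of how such a model comparison can be set up with scikit-learn is shown below. The file name, predictor columns, and the loan-quality target (npl_ratio) are hypothetical stand-ins; the study's actual variable set and tuning are not reproduced here.

```python
# Sketch of the MLR vs. MLP vs. SVR comparison on bank-level data.
# 'npl_ratio' (loan portfolio quality) and the predictors are hypothetical.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

data = pd.read_csv("tse_banks_2000_2016.csv")           # hypothetical file
X = data[["capital_ratio", "bank_size", "gdp_growth"]]  # hypothetical predictors
y = data["npl_ratio"]

models = {
    "MLR": LinearRegression(),
    "MLP": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)),
    "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
    print(name, "cross-validated MSE:", -scores.mean())
```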

Keywords: credit risk management, multiple linear regression, principal components analysis, artificial neural networks, support vector machines

Procedia PDF Downloads 146
26545 Analysis of ECGs Survey Data by Applying Clustering Algorithm

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and this issue requires special attention. A framework is proposed for performing a detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data are first evaluated and filtered using automated Minnesota codes, and only those ECGs that fulfill the standardized conditions specified in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on a discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm combined with fuzzy c-means. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.
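
A simplified sketch of the clustering step is shown below, using scikit-learn's spectral clustering on already-selected ECG features; the paper's combination with fuzzy c-means and its discernibility-matrix feature selection are not reproduced. The file and feature names are hypothetical placeholders.

```python
# Simplified sketch: spectral clustering on features already selected from
# Minnesota-coded ECG records. Feature names are hypothetical placeholders.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import SpectralClustering

ecg = pd.read_csv("ecg_survey_selected_features.csv")   # hypothetical file
X = StandardScaler().fit_transform(ecg[["qrs_duration", "pr_interval", "heart_rate"]])

labels = SpectralClustering(
    n_clusters=3, affinity="nearest_neighbors", n_neighbors=10, random_state=0
).fit_predict(X)
ecg["cluster"] = labels
print(ecg.groupby("cluster").size())
```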

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 347
26544 An Automated Business Process Management for Smart Medical Records

Authors: K. Malak, A. Nourah, S. Liyakathunisa

Abstract:

Nowadays, healthcare services face many challenges as they become more complex and more in demand. Every detail of a patient’s interactions with health care providers is maintained in Electronic Health Records (EHR) and Healthcare Information Systems (HIS). However, most existing systems are focused on documenting what happens in manual health care processes rather than providing the highest quality patient care. Healthcare business processes and stakeholders can no longer rely on manual processes; to provide better patient care and efficient utilization of resources, healthcare processes must be automated wherever possible. In this research, a detailed survey and analysis of the existing health care systems in Saudi Arabia is performed, and an automated smart medical healthcare business process model is proposed. Business process management methods and rules are followed in discovery, information collection, analysis, redesign, implementation, and performance improvement analysis in terms of time and cost. The simulation results show that the proposed smart medical records system can improve the quality of the service by reducing time and cost and increasing efficiency.
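
The kind of time comparison produced by such a simulation can be sketched with a small discrete-event model; the example below (using SimPy) compares average turnaround for a manual versus an automated records process. The service and arrival times are invented for illustration and do not come from the Saudi survey data.

```python
# Minimal discrete-event sketch comparing manual vs. automated record handling.
# Service times (in minutes) are invented for illustration, not survey values.
import random
import simpy

def run(mean_service, n_patients=200, n_clerks=2, seed=1):
    random.seed(seed)
    env = simpy.Environment()
    clerks = simpy.Resource(env, capacity=n_clerks)
    waits = []

    def patient(env):
        arrival = env.now
        with clerks.request() as req:
            yield req
            yield env.timeout(random.expovariate(1.0 / mean_service))
        waits.append(env.now - arrival)

    def arrivals(env):
        for _ in range(n_patients):
            env.process(patient(env))
            yield env.timeout(random.expovariate(1.0 / 3.0))  # ~1 arrival / 3 min

    env.process(arrivals(env))
    env.run()
    return sum(waits) / len(waits)

print("manual   :", round(run(mean_service=12.0), 1), "min avg turnaround")
print("automated:", round(run(mean_service=5.0), 1), "min avg turnaround")
```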

Keywords: business process management, electronic health records, efficiency, cost, time

Procedia PDF Downloads 335
26543 Improved Classification Procedure for Imbalanced and Overlapped Situations

Authors: Hankyu Lee, Seoung Bum Kim

Abstract:

The issue of imbalance and overlap in the class distribution is important in various applications of data mining. An imbalanced dataset is a special case in classification problems in which the number of observations of one class (i.e., the major class) heavily exceeds the number of observations of the other class (i.e., the minor class). An overlapped dataset is the case where many observations are shared between the two classes. Imbalanced and overlapped data can frequently be found in many real examples, including fraud and abuse detection in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. The class imbalance and overlap problem is a challenging issue because it degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts: non-overlapping, lightly overlapping, and severely overlapping, and applying a classification algorithm in each part. These three parts were determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
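
A simplified sketch of the region-wise idea is shown below: a plain SVM's decision-function magnitude is used as a stand-in for the paper's Hausdorff-distance and modified-SVM-margin criterion to split the space into severe-overlap, light-overlap, and non-overlap parts, each classified separately. The thresholds and data are synthetic.

```python
# Simplified sketch of region-wise classification for imbalanced, overlapped data.
# A plain SVM's decision-function magnitude stands in for the paper's
# Hausdorff-distance / modified-SVM-margin criterion.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=2000, weights=[0.9, 0.1],
                           class_sep=0.6, random_state=0)

probe = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
margin = np.abs(probe.decision_function(X))

regions = {
    "severe overlap": margin < 0.5,
    "light overlap": (margin >= 0.5) & (margin < 1.5),
    "non-overlap": margin >= 1.5,
}
for name, mask in regions.items():
    if mask.sum() == 0:
        continue
    clf = RandomForestClassifier(class_weight="balanced", random_state=0)
    clf.fit(X[mask], y[mask])
    print(name, int(mask.sum()), "samples, train accuracy:",
          round(clf.score(X[mask], y[mask]), 3))
```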

Keywords: classification, imbalanced data with class overlap, split data space, support vector machine

Procedia PDF Downloads 305
26542 Water Monitoring Sentinel Cloud Platform: Water Monitoring Platform Based on Satellite Imagery and Modeling Data

Authors: Alberto Azevedo, Ricardo Martins, André B. Fortunato, Anabela Oliveira

Abstract:

Water is under severe threat today because of the rising population, increased agricultural and industrial needs, and the intensifying effects of climate change. Due to sea-level rise, erosion, and demographic pressure, coastal regions are of significant concern to the scientific community. The Water Monitoring Sentinel Cloud platform (WORSICA) service is focused on providing new tools for monitoring water in coastal and inland areas, taking advantage of remote sensing, in situ and tidal modeling data. WORSICA is a service that can be used to determine the coastline, coastal inundation areas, and the limits of inland water bodies using remote sensing (satellite and Unmanned Aerial Vehicles - UAVs) and in situ data (from field surveys). It applies to various purposes, from determining flooded areas (from rainfall, storms, hurricanes, or tsunamis) to detecting large water leaks in major water distribution networks. This service was built on components developed in national and European projects, integrated to provide a one-stop-shop service for remote sensing information, integrating data from the Copernicus satellites and drones/unmanned aerial vehicles, validated by existing online in situ data. Since WORSICA operates on the European Open Science Cloud (EOSC) computational infrastructures, the service can be accessed via a web browser and is freely available to all European public research groups without additional costs. In addition, the private sector will be able to use the service, but some usage costs may apply, depending on the type of computational resources needed by each application/user. Although the service has three main sub-services, i) coastline detection, ii) inland water detection, and iii) water leak detection in irrigation networks, the present study shows an application of the service to the Óbidos lagoon in Portugal, where the user can monitor the evolution of the lagoon inlet and estimate the topography of the intertidal areas without any additional costs. The service has several distinct methodologies implemented based on the computation of water indexes (e.g., NDWI, MNDWI, AWEI, and AWEIsh) retrieved from satellite image processing. In conjunction with the tidal data obtained from the FES model, the system can estimate a coastline at the corresponding level, or even the topography of the intertidal areas, based on the Flood2Topo methodology. The outcomes of the WORSICA service can be helpful in several intervention areas, such as i) emergency response, by providing fast access to inundated areas to support rescue operations; ii) support of management decisions on hydraulic infrastructure operation to minimize damage downstream; iii) climate change mitigation, by minimizing water losses and reducing water mains operation costs; and iv) early detection of water leakages in difficult-to-access water irrigation networks, promoting their fast repair.
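
The water indexes named above follow standard band-ratio definitions; a minimal sketch for NDWI and MNDWI is given below, assuming green, NIR, and SWIR reflectance arrays have already been read from a Sentinel-2-style scene. The 0.0 threshold is only a common starting point, not the service's calibrated value.

```python
# Standard water-index computation on Sentinel-2-style band arrays.
# `green`, `nir`, `swir` are assumed reflectance arrays (e.g., bands B3, B8, B11).
import numpy as np

def ndwi(green, nir):
    return (green - nir) / (green + nir + 1e-10)

def mndwi(green, swir):
    return (green - swir) / (green + swir + 1e-10)

def water_mask(index, threshold=0.0):
    # Pixels above the threshold are flagged as water
    return index > threshold

# Example with tiny synthetic reflectance arrays
green = np.array([[0.10, 0.30], [0.05, 0.25]])
nir = np.array([[0.30, 0.05], [0.40, 0.06]])
print(water_mask(ndwi(green, nir)))
```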

Keywords: remote sensing, coastline detection, water detection, satellite data, sentinel, Copernicus, EOSC

Procedia PDF Downloads 120
26541 Leveraging Mobile Apps for Citizen-Centric Urban Planning: Insights from Tajawob Implementation

Authors: Alae El Fahsi

Abstract:

This study explores the ‘Tajawob’ app's role in urban development, demonstrating how mobile applications can empower citizens and facilitate urban planning. Tajawob serves as a digital platform for community feedback, engagement, and participatory governance, addressing urban challenges through innovative tech solutions. This research synthesizes data from a variety of sources, including user feedback, engagement metrics, and interviews with city officials, to assess the app’s impact on citizen participation in urban development in Morocco. By integrating advanced data analytics and user experience design, Tajawob has bridged the communication gap between citizens and government officials, fostering a more collaborative and transparent urban planning process. The findings reveal a significant increase in civic engagement, with users actively contributing to urban management decisions, thereby enhancing the responsiveness and inclusivity of urban governance. Challenges such as digital literacy, infrastructure limitations, and privacy concerns are also discussed, providing a comprehensive overview of the obstacles and opportunities presented by mobile app-based citizen engagement platforms. The study concludes with strategic recommendations for scaling the Tajawob model to other contexts, emphasizing the importance of adaptive technology solutions in meeting the evolving needs of urban populations. This research contributes to the burgeoning field of smart city innovations, offering key insights into the role of digital tools in facilitating more democratic and participatory urban environments.

Keywords: smart cities, digital governance, urban planning, strategic design

Procedia PDF Downloads 55
26540 Mapping of Geological Structures Using Aerial Photography

Authors: Ankit Sharma, Mudit Sachan, Anurag Prakash

Abstract:

Rapid growth in data acquisition technologies through drones has led to advances and interest in collecting high-resolution images of geological fields. While drones are advantageous in capturing a high volume of data in short flights, a number of challenges have to be overcome for efficient analysis of this data, especially during data acquisition, image interpretation, and processing. We introduce a method that allows effective mapping of geological fields using photogrammetric data of surfaces, drainage areas, water bodies, etc., captured by airborne vehicles such as UAVs. Satellite images are not used because of limitations in resolution, timeliness (the imagery may be a year old), availability, the difficulty of capturing the exact scene of interest, and the lack of night vision. The method combines advanced automated image interpretation technology with human data interaction to model structures. First, geological structures are detected from the primary photographic dataset, and the equivalent three-dimensional structures are then identified from a digital elevation model (DEM); dip and dip direction can be calculated from this information. The structural map is generated by following a specified methodology: choosing the appropriate camera and camera mounting system, designing the UAV (based on the area and application), addressing challenges in airborne systems such as errors in image orientation and payload constraints, mosaicking, georeferencing and registering the different images, and finally applying the DEM. The paper shows the potential of using our method for accurate and efficient modeling of geological structures, captured particularly from remote, inaccessible, and hazardous sites.
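
Dip and dip direction can be derived from DEM gradients as mentioned above; a sketch is given below. The elevation grid is a toy example, and the azimuth sign convention must be adjusted to how the DEM's rows and columns map to north and east.

```python
# Dip (slope) and dip direction (aspect) from a DEM grid via finite differences.
# `dem` is a toy elevation array; `cell` is the grid spacing in metres.
import numpy as np

def dip_and_direction(dem, cell=1.0):
    dz_dy, dz_dx = np.gradient(dem, cell)              # axis 0 = rows, axis 1 = columns
    dip = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    # Azimuth of steepest slope; the sign convention depends on how the DEM's
    # rows/columns map to north/east, so adjust signs for the actual grid.
    direction = np.degrees(np.arctan2(dz_dx, dz_dy)) % 360
    return dip, direction

dem = np.array([[100.0, 101.0, 102.0],
                [101.0, 102.0, 103.0],
                [102.0, 103.0, 104.0]])
dip, direction = dip_and_direction(dem, cell=10.0)
print(dip.round(1), direction.round(0), sep="\n")
```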

Keywords: digital elevation model, mapping, photogrammetric data analysis, geological structures

Procedia PDF Downloads 683
26539 The Role of Demographics and Service Quality in the Adoption and Diffusion of E-Government Services: A Study in India

Authors: Sayantan Khanra, Rojers P. Joseph

Abstract:

Background and Significance: This study is aimed at analyzing the role of demographic and service quality variables in the adoption and diffusion of e-government services among users in India. The study proposes to examine users' perceptions of e-government services and investigate the key variables that are most salient to the Indian populace. Description of the Basic Methodologies: The methodology to be adopted in this study is Hierarchical Regression Analysis, which helps explore the impact of the demographic variables and the quality dimensions on the willingness to use e-government services in two steps. First, the impact of demographic variables on the willingness to use e-government services is examined. In the second step, quality dimensions are used as inputs to the model to explain variance in excess of the prior contribution of the demographic variables. Present Status: Our study is in the data collection stage in collaboration with a highly reliable, authentic, and adequate source of user data. Assuming that the population of the study comprises all Internet users in India, a massive sample of more than 10,000 random respondents is being approached. Data are being collected using an online survey questionnaire. A pilot survey has already been carried out to refine the questionnaire with inputs from an expert in management information systems and a small group of users of e-government services in India. The first three questions in the survey pertain to the Internet usage pattern of a respondent and probe whether the person has used e-government services. If the respondent confirms that he/she has used e-government services, then an aggregate of 15 indicators is used to measure the quality dimensions under consideration and the willingness of the respondent to use e-government services, on a five-point Likert scale. If the respondent reports that he/she has not used e-government services, then a few optional questions are asked to understand the reason(s) behind this. The last four questions in the survey are dedicated to collecting data related to the demographic variables. An Indication of the Major Findings: Based on the extensive literature review carried out to develop several propositions, an initial research model is proposed. A major outcome expected at the completion of the study is the development of a research model that helps to understand the relationship between the demographic variables, the service quality dimensions, and the willingness to adopt e-government services, particularly in an emerging economy like India. Concluding Statement: Governments of emerging economies and other relevant agencies can use the findings from the study in designing, updating, and promoting e-government services to enhance public participation, which in turn would help to improve efficiency, convenience, engagement, and transparency in implementing these services.
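
A minimal sketch of the two-step hierarchical regression described is shown below using statsmodels; the variable names are hypothetical stand-ins for the demographic items and quality indicators, and the incremental contribution of the quality dimensions is tested with a partial F-test.

```python
# Two-step hierarchical regression sketch: demographics first, then quality
# dimensions, comparing the gain in explained variance. Variable names are
# hypothetical stand-ins for the survey items.
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("egov_survey.csv")   # hypothetical file

step1 = smf.ols("willingness ~ age + gender + education + income", data=survey).fit()
step2 = smf.ols(
    "willingness ~ age + gender + education + income"
    " + efficiency + reliability + citizen_support", data=survey
).fit()

print("Step 1 R^2:", round(step1.rsquared, 3))
print("Step 2 R^2:", round(step2.rsquared, 3))
print("Delta R^2 :", round(step2.rsquared - step1.rsquared, 3))
# F-test for the incremental contribution of the quality dimensions
print(step2.compare_f_test(step1))
```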

Keywords: adoption and diffusion of e-government services, demographic variables, hierarchical regression analysis, service quality dimensions

Procedia PDF Downloads 261
26538 Pain Management in Burn Wounds with Dual Drug Loaded Double Layered Nano-Fiber Based Dressing

Authors: Sharjeel Abid, Tanveer Hussain, Ahsan Nazir, Abdul Zahir, Nabyl Khenoussi

Abstract:

Localized application of a drug has various advantages and fewer side effects compared with other methods. Burn patients suffer from severe pain, and the major aspects considered for burn victims include pain and infection management. Nano-fibers (NFs) loaded with drug and applied to the local wound area can address these problems. Therefore, this study dealt with the fabrication of drug-loaded NFs for better pain management. Two layers of NFs were fabricated with different drugs: the contact layer was loaded with gabapentin (a nerve painkiller) and the second layer with acetaminophen. The fabricated dressing was characterized using scanning electron microscopy, Fourier Transform Infrared Spectroscopy, X-Ray Diffraction, and UV-Vis Spectroscopy. The double-layered NF dressing was designed to provide an initial burst release followed by a slow release to cope with pain for two days. The fabricated nanofibers showed diameters below 300 nm. The liquid absorption capacity of the NFs was also checked to deal with exudate. The fabricated double-layered dressing with dual drug loading and release showed promising results for dealing with pain in burn victims. It was observed that the addition of the drug reduced the size of the nanofibers, while the crystallinity percentage increased and liquid absorption decreased. The combination of a fast nerve painkiller release followed by a slow release of a non-steroidal anti-inflammatory drug could be a good tool to reduce pain more securely and with fewer side effects.

Keywords: pain management, burn wounds, nano-fibers, controlled drug release

Procedia PDF Downloads 247
26537 Predicting Seoul Bus Ridership Using Artificial Neural Network Algorithm with Smartcard Data

Authors: Hosuk Shin, Young-Hyun Seo, Eunhak Lee, Seung-Young Kho

Abstract:

Currently, in Seoul, with the installation of the Bus Information System (BIS), users have the privilege of avoiding crowded buses. The BIS provides three levels of on-board ridership information (spacious, normal, and crowded). However, because the system reports only real-time conditions, it can provide incomplete information to the user. For example, a bus approaches the station and the BIS shows it as crowded, but many people get off at the stop where the user is waiting, which means the information for this station should instead show normal or spacious. To fix this problem, this study predicts the bus ridership level using smart card data in order to provide more accurate information about the passenger ridership level on the bus. An Artificial Neural Network (ANN) is an interconnected group of nodes inspired by the human brain. Forecasting has been one of the major applications of ANNs due to the data-driven, self-adaptive nature of the algorithm. According to the results, the ANN algorithm was stable and robust with a fairly small error ratio, so the results were rational and reasonable.
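
A minimal sketch of such an ANN classifier is shown below, using scikit-learn's MLPClassifier to predict the three BIS levels from smart-card-derived features. The file and feature names are hypothetical placeholders for the engineered inputs; the paper's actual network design may differ.

```python
# Sketch of an ANN predicting the three BIS ridership levels
# (spacious / normal / crowded) from smart-card derived features.
# Feature names are hypothetical stand-ins for the engineered inputs.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPClassifier

trips = pd.read_csv("smartcard_features.csv")   # hypothetical file
X = trips[["boardings_prev_stop", "alightings_prev_stop", "hour", "route_load_avg"]]
y = trips["ridership_level"]                     # spacious / normal / crowded

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0),
)
model.fit(X_train, y_train)
print("test accuracy:", round(model.score(X_test, y_test), 3))
```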

Keywords: smartcard data, ANN, bus, ridership

Procedia PDF Downloads 162
26536 Analysis and Design Modeling for Next Generation Network Intrusion Detection and Prevention System

Authors: Nareshkumar Harale, B. B. Meshram

Abstract:

The continued exponential growth of successful cyber intrusions against today’s businesses has made it abundantly clear that traditional perimeter security measures are no longer adequate and effective. The network trust architecture has evolved from trust-untrust to Zero Trust. With Zero Trust, essential security capabilities are deployed in a way that provides policy enforcement and protection for all users, devices, applications, data resources, and the communications traffic between them, regardless of their location. Information exchange over the Internet, in spite of the inclusion of advanced security controls, is always exposed to innovative and inventive cyberattacks. The TCP/IP protocol stack, the adopted standard for communication over networks, suffers from inherent design vulnerabilities; its communication and session management protocols, routing protocols, and security protocols are the cause of many major attacks. With the explosion of cyber security threats such as viruses, worms, rootkits, malware, and Denial of Service attacks, accomplishing efficient and effective intrusion detection and prevention has become crucial and challenging. In this paper, we propose a design and analysis model for a next generation network intrusion detection and protection system as part of a layered security strategy. The proposed system design provides intrusion detection for a wide range of attacks with a layered architecture and framework. The proposed network intrusion classification framework deals with cyberattacks on the standard TCP/IP protocol, routing protocols, and security protocols. It thereby forms the basis for the detection of attack classes and applies signature-based matching for known cyberattacks and data mining based machine learning approaches for unknown cyberattacks. Our implemented software can effectively detect attacks even when malicious connections are hidden within normal events. The unsupervised learning algorithm applied to network audit data trails results in the detection of unknown intrusions. Association rule mining algorithms generate new rules from collected audit trail data, resulting in increased intrusion prevention through integrated firewall systems. Intrusion response mechanisms can be initiated in real time, thereby minimizing the impact of network intrusions. Finally, we show how our approach can be validated and how the analysis results can be used for detecting and protecting against new network anomalies.
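
The association-rule step can be sketched as below on one-hot encoded audit attributes; mlxtend is used here as one convenient implementation of Apriori, not necessarily the library used by the authors, and the boolean attributes are hypothetical examples.

```python
# Sketch of the association-rule-mining step on one-hot encoded audit events.
# The boolean attribute columns below are hypothetical examples.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

audit = pd.DataFrame({
    "syn_flood_flag": [1, 1, 0, 1, 0, 1],
    "port_scan_flag": [1, 1, 0, 1, 1, 1],
    "unusual_hour":   [0, 1, 0, 1, 0, 1],
    "intrusion":      [1, 1, 0, 1, 0, 1],
}).astype(bool)

frequent = apriori(audit, min_support=0.5, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```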

Keywords: network intrusion detection, network intrusion prevention, association rule mining, system analysis and design

Procedia PDF Downloads 224
26535 Improving Temporal Correlations in Empirical Orthogonal Function Expansions for Data Interpolating Empirical Orthogonal Function Algorithm

Authors: Ping Bo, Meng Yunshan

Abstract:

Satellite-derived sea surface temperature (SST) is a key parameter for many operational and scientific applications. However, a disadvantage of SST data is the high percentage of missing values, mainly caused by cloud coverage. The Data Interpolating Empirical Orthogonal Function (DINEOF) algorithm is an EOF-based technique for reconstructing missing data and has been widely used in the oceanographic field. The reconstruction of SST images within a long time series using DINEOF can introduce large discontinuities, and one solution to this problem is to filter the temporal covariance matrix to reduce the spurious variability. Building on previous research, an algorithm is presented in this paper to improve the temporal correlations in the EOF expansion. As in previous work, a filter, such as a Laplacian filter, is applied to the temporal covariance matrix, but the temporal relationship between two consecutive images used in the filter is taken into account in the presented algorithm; for example, two images in the same season are more likely to be correlated than images in different seasons, so the latter are weighted less in the filter. The presented approach is tested on the monthly nighttime 4-km Advanced Very High Resolution Radiometer (AVHRR) Pathfinder SST for the long-term period spanning from 1989 to 2006. The results obtained from the presented algorithm are compared with those from the original DINEOF algorithm without filtering and from the DINEOF algorithm with filtering but without taking the temporal relationship into account.
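
A rough sketch of the seasonally aware weighting idea is given below: a Laplacian-style smoothing of the temporal covariance matrix in which neighbouring time steps from different seasons receive a smaller weight. The exact weighting scheme and filter used in the paper may differ; this only illustrates the principle.

```python
# Sketch of a seasonally weighted temporal filter on a covariance matrix.
# The exact weighting used in the paper may differ; this only illustrates the
# idea that image pairs from different seasons contribute less to the filter.
import numpy as np

def seasonal_weighted_filter(cov, months, alpha=0.1, cross_season_weight=0.3):
    n = cov.shape[0]
    W = np.zeros_like(cov)
    for i in range(n):
        for j in (i - 1, i + 1):                      # temporal neighbours only
            if 0 <= j < n:
                same_season = (months[i] - 1) // 3 == (months[j] - 1) // 3
                W[i, j] = 1.0 if same_season else cross_season_weight
    # Laplacian-style smoothing: mix each row with its weighted neighbours
    L = np.diag(W.sum(axis=1)) - W
    return cov - alpha * (L @ cov + cov @ L.T) / 2

rng = np.random.default_rng(0)
cov = np.cov(rng.normal(size=(24, 50)))               # 24 monthly time steps
months = (np.arange(24) % 12) + 1                     # month index 1..12
filtered = seasonal_weighted_filter(cov, months)
print(filtered.shape)
```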

Keywords: data interpolating empirical orthogonal function, image reconstruction, sea surface temperature, temporal filter

Procedia PDF Downloads 321
26534 Quality of Care of Medical Male Circumcisions: A Non-Negotiable for Right to Care

Authors: Nelson Igaba, C. Onaga, S. Hlongwane

Abstract:

Background: Medical Male Circumcision (MMC) is part of a comprehensive HIV prevention strategy. The quality of MMC done at Right to Care (RtC) sites is maintained by Continuous Quality Improvement (CQI), based on the findings of assessments by internal and independent external assessors who evaluate such parameters as the quality of the surgical procedure, infection control, etc. There are 12 RtC MMC teams in Mpumalanga, two of which are headed by Medical Officers and 10 by Clinical Associates (Clin A). Objectives: To compare (i) the quality of care rendered at doctor-headed sites (DHS) versus Clin A-headed sites (CHS); (ii) the quality of CQI assessments (external versus internal). Methodology: A retrospective review of data from RightMax™ (a novel RtC data management system) and CQI reports (external and internal) was done. CQI assessment scores of October 2015 and October 2016 were taken as the baseline and the latest, respectively. Four sites with 745-810 circumcisions per annum were purposively selected: the two DHS (group A) and two CHS (group B). Statistical analyses were conducted using R (2017 version). Results: There was no significant difference in the latest CQI scores between the two groups (DHS and CHS) (ANOVA, F = 1.97, df = 1, P = 0.165), between internal and external CQI assessment scores (ANOVA, F = 2.251, df = 1, P = 0.139), or among the individual sites (ANOVA, F = 1.095, df = 2, P = 0.341). Of the total of 16 adverse events reported by the four sites in the 12 months reviewed (all were infections), there was no statistical evidence that the documented severity of the infection differed between DHS and CHS (Fisher’s exact test, p-value = 0.269). Conclusion: At RtC VMMC sites in Mpumalanga, internal and external/independent CQI assessments are comparable, and the quality of care of VMMC is standardized, with the performance of well-supervised clinical associates comparing well with that of medical officers.
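
The tests reported above (one-way ANOVA on CQI scores, Fisher's exact test on adverse-event severity) were run in R; the sketch below shows the equivalent calls with SciPy, using placeholder values rather than the study's data.

```python
# The statistical tests reported above, sketched with SciPy equivalents.
# Scores and counts below are placeholders, not the study's data.
from scipy import stats

dhs_scores = [88, 91, 85, 90]          # doctor-headed sites (placeholder CQI scores)
chs_scores = [84, 89, 87, 86]          # Clin A-headed sites (placeholder CQI scores)
f_stat, p_anova = stats.f_oneway(dhs_scores, chs_scores)

# 2x2 table: adverse-event severity (mild / severe) by site type (placeholder counts)
table = [[6, 2],
         [7, 1]]
odds_ratio, p_fisher = stats.fisher_exact(table)

print(f"ANOVA: F={f_stat:.2f}, p={p_anova:.3f}")
print(f"Fisher's exact: OR={odds_ratio:.2f}, p={p_fisher:.3f}")
```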

Keywords: adverse events, Right to Care, male medical circumcision, continuous quality improvement

Procedia PDF Downloads 172
26533 An Exploratory Study on the Level of Awareness and Common Barriers of Physicians on Overweight and Obesity Management in Bangladesh

Authors: Kamrun Nahar Koly, Saimul Islam

Abstract:

Overweight and obesity are increasing at an alarming rate and are a leading risk factor for morbidity throughout the world. In a country like Bangladesh, where undernutrition and overweight co-exist at the same time, this issue has remained underexplored. The aim of the present study was to assess the knowledge and attitudes of physicians and to identify their barriers regarding overweight and obesity management at urban hospitals of Dhaka city in Bangladesh. A simple cross-sectional study was conducted at two selected government and two private hospitals to assess the knowledge, attitudes and common barriers regarding overweight and obesity management among healthcare professionals. One hundred and fifty-five physicians were surveyed. A standard questionnaire was constructed in the local language, and interviews were administered. Among the 155 physicians, the largest group, 53 (34.2%), were working at SMC, 36 (23.2%) at DMC, 33 (21.3%) at SSMC, and the remaining 33 (21.3%) at HFRCMH. The mean age of the study physicians was 31.88 ± 5.92 years. A majority of the physicians, 80 (51.6%), were not able to state the correct prevalence of obesity, although a substantial number, 75 (48.4%), marked the right answer. Among the physicians, 150 (96.77%) reported BMI as a diagnostic index for overweight and obesity, whereas 43 (27.74%) reported waist circumference, 30 (19.35%) waist-hip ratio, and 26 (16.77%) mid-arm circumference. A substantial proportion of the physicians, 71 (46.7%), thought that they could not do much to control the weight problem in the Bangladesh context, though 42 (27.6%) opposed this view and 39 (25.7%) were neutral. The majority of them, 147 (96.1%), thought that a family-based education program would be beneficial, and 145 (94.8%) mentioned raising awareness among mothers, as the mother is the primary caregiver. A school-based education program that would help with early intervention was endorsed by 142 (92.8%) of the physicians. A community-based education program was also appreciated by 136 (89.5%) of the physicians. About 74 (47.7%) of them think that patients still lack the motivation to maintain their weight properly, while having too many patients to deal with can be a barrier as well, as assumed by 73 (47.1%) of them. A lack of a national policy or management guideline was cited as an obstacle by 60 (38.7%) of the physicians. The relationship between practicing weight management as part of the general examination and chronic disease management and physician occupational status was statistically significant (p<0.05). In addition, perceived barriers such as lack of parental support and lack of a national policy were statistically significantly associated (p<0.05) with physician occupational status. For young physicians, more training programmes will be needed to translate their knowledge and attitudes into practice. However, several important barriers interfere with physicians' treatment efforts and need to be addressed.
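
Associations of this kind between survey responses and physician occupational status are typically tested with a chi-square test on a cross-tabulation; a sketch with placeholder counts (not the survey's actual figures) is shown below.

```python
# Sketch of testing an attitude item against physician occupational status.
# Counts are placeholders, not the survey's actual cross-tabulation.
from scipy.stats import chi2_contingency

# rows: occupational status groups (hypothetical categories)
# cols: agree / neutral / disagree with "lack of parental support is a barrier"
table = [[30, 10, 12],
         [25, 15, 13],
         [18, 14, 18]]
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p_value:.3f}")
```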

Keywords: obesity management, physician, awareness, barriers, Bangladesh

Procedia PDF Downloads 163
26532 Sparse Unmixing of Hyperspectral Data by Exploiting Joint-Sparsity and Rank-Deficiency

Authors: Fanqiang Kong, Chending Bian

Abstract:

In this work, we exploit two assumed properties of the abundances of the observed signatures (endmembers) in order to reconstruct the abundances from hyperspectral data. Joint-sparsity is the first property of the abundances, which assumes that adjacent pixels can be expressed as different linear combinations of the same materials. The second property is rank-deficiency, where the number of endmembers participating in the hyperspectral data is very small compared with the dimensionality of the spectral library, which means that the abundance matrix of the endmembers is a low-rank matrix. These assumptions lead to an optimization problem for the sparse unmixing model that requires minimizing a combined l2,p-norm and nuclear norm. We propose a variable splitting and augmented Lagrangian algorithm to solve the optimization problem. Experimental evaluation carried out on synthetic and real hyperspectral data shows that the proposed method outperforms state-of-the-art algorithms with better spectral unmixing accuracy.
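
The two regularizers in the model have closed-form proximal operators that a variable-splitting / augmented Lagrangian solver applies at each iteration; a sketch of those two building blocks (taking p = 1 for the l2,p term) is given below. This is not the paper's full solver, only the standard proximal steps.

```python
# Proximal operators used inside a variable-splitting solver for the model:
# nuclear norm -> singular-value soft-thresholding, and (taking p = 1)
# l2,1 norm -> row-wise soft-thresholding of the abundance matrix.
import numpy as np

def prox_nuclear(X, tau):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def prox_l21(X, tau):
    row_norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(row_norms, 1e-12), 0.0)
    return X * scale

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 30))       # toy abundance matrix: library size x pixels
print("rank after nuclear prox:", np.linalg.matrix_rank(prox_nuclear(A, 3.0)))
print("rows zeroed by l2,1 prox:", int((np.linalg.norm(prox_l21(A, 4.0), axis=1) == 0).sum()))
```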

Keywords: hyperspectral unmixing, joint-sparse, low-rank representation, abundance estimation

Procedia PDF Downloads 251
26531 Electronic Physical Activity Record (EPAR): Key for Data Driven Physical Activity Healthcare Services

Authors: Rishi Kanth Saripalle

Abstract:

Medical experts highly recommend including physical activity in everyone’s daily routine, irrespective of gender or age, as it helps to improve various medical issues or curb potential issues. Simultaneously, experts are diligently trying to provide various healthcare services (interventions, plans, exercise routines, etc.) for promoting healthy living and increasing physical activity within ever more hectic schedules. With the introduction of wearables, individuals are able to keep track of, analyze, and visualize their daily physical activities. However, there seems to be no commonly agreed standard for representing, gathering, aggregating and analyzing an individual’s physical activity data from disparate multiple sources (exercise plans, multiple wearables, etc.). This issue makes it highly impractical to develop any data-driven physical activity applications and healthcare programs. Further, the inability to integrate physical activity data into an individual’s Electronic Health Record to provide a holistic picture of that individual’s health still eludes the experts. This article identifies three primary reasons for this issue. First, there is no agreed standard, in either structure or semantics, for representing and sharing physical activity data across disparate systems. Second, various organizations (e.g., LA Fitness, Gold’s Gym, etc.) and research-backed interventions and programs still primarily rely on paper or unstructured formats (such as text or notes) to keep track of the data generated from physical activities. Finally, most wearable devices operate in silos. This article identifies the underlying problem, explores the idea of reusing existing standards, and identifies the essential modules required to move forward.
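
One way to make the idea of a common structure concrete is a minimal record schema; the fields below are purely illustrative and do not represent a proposed or existing standard.

```python
# Illustrative (not standardized) structure for a single physical-activity record,
# showing the kind of fields an EPAR would need to aggregate across sources.
from dataclasses import dataclass, asdict
from datetime import datetime
from typing import Optional
import json

@dataclass
class PhysicalActivityRecord:
    person_id: str
    source: str                      # e.g., wearable vendor, gym system, intervention plan
    activity_type: str               # e.g., "walking", "cycling"
    start_time: datetime
    duration_min: float
    energy_kcal: Optional[float] = None
    avg_heart_rate: Optional[int] = None

record = PhysicalActivityRecord(
    person_id="p-001", source="wearable-x", activity_type="walking",
    start_time=datetime(2023, 5, 1, 7, 30), duration_min=42.0, energy_kcal=180.0,
)
print(json.dumps(asdict(record), default=str, indent=2))
```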

Keywords: electronic physical activity record, physical activity in EHR EIM, tracking physical activity data, physical activity data standards

Procedia PDF Downloads 279
26530 Research on Coordination Strategies for Coordinating Supply Chain Based on Auction Mechanisms

Authors: Changtong Wang, Lingyun Wei

Abstract:

The combination of auctions and supply chains is of great significance in improving the supply chain management system and enhancing the efficiency of economic and social operations. To address the gap in research on supply chain strategies under auction mechanisms, a model is developed for the 1-N auction in a complete information environment, and it is concluded that the two-part contract auction model for retailers in this setting can achieve supply chain coordination. The model is validated by applying it to the scenario of a flower auction in the fresh-cut flower industry, with numerical examples used to further demonstrate the validity of the conclusions.

Keywords: auction mechanism, supply chain coordination strategy, fresh cut flowers industry, supply chain management

Procedia PDF Downloads 118
26529 The Sustainable Governance of Aquifer Injection Using Treated Coal Seam Gas Water in Queensland, Australia: Lessons for Integrated Water Resource Management

Authors: Jacqui Robertson

Abstract:

The sustainable governance of groundwater is of the utmost importance in an arid country like Australia. Groundwater has been relied on by our agricultural and pastoral communities since the State was settled by European colonialists. Nevertheless, the rapid establishment of a coal seam gas (CSG) industry in Queensland, Australia, has had extensive impacts on the pre-existing groundwater users. Managed aquifer recharge of important aquifers in Queensland, Australia, using treated coal seam gas produced water has been used to reduce the impacts of CSG development in Queensland Australia. However, the process has not been widely adopted. Negative environmental outcomes are now acknowledged as not only engineering, scientific or technical problems to be solved but also the result of governance failures. An analysis of the regulatory context for aquifer injection using treated CSG water in Queensland, Australia, using Ostrom’s Common Pool Resource (CPR) theory and a ‘heat map’ designed by the author, highlights the importance of governance arrangements. The analysis reveals the costs and benefits for relevant stakeholders of artificial recharge of groundwater resources in this context. The research also reveals missed opportunities to further active management of the aquifer and resolve existing conflicts between users. The research illustrates the importance of strategically and holistically evaluating innovations in technology that impact water resources to reveal incentives that impact resource user behaviors. The paper presents a proactive step that can be adapted to support integrated water resource management and sustainable groundwater development.

Keywords: managed aquifer recharge, groundwater regulation, common-pool resources, integrated water resource management, Australia

Procedia PDF Downloads 228
26528 Nilsson Model Performance in Estimating Bed Load Sediment, Case Study: Tale Zang Station

Authors: Nader Parsazadeh

Abstract:

The variety of bed sediment load relationships, insufficient information and data, and the influence of river conditions make the selection of an optimum relationship for a given river extremely difficult. Hence, in order to select the best formula, the bed load equations should be evaluated, the affecting factors need to be scrutinized, and the equations should be verified; re-evaluation may also be needed. In this research, the sediment bed load of the Dez Dam at the Tal-e Zang Station has been studied. After reviewing the available references, the most common formulae were selected, including Meyer-Peter and Müller, and MS Excel was used to compute and evaluate the data. Then, 52 series of data already measured at the station were re-examined, and the sediment bed load was determined. 1. The bed load calculated by the different equations showed a great difference from the measured data. 2. The percentage of results with a discrepancy ratio between 0.5 and 2.00 was 0% for all equations except the Nilsson and Shields equations, for which it was 61.5% and 59.6%, respectively. 3. By reviewing the results and discarding probably erroneous measurements (by human or machine), one may use the Nilsson equation, owing to its r value higher than 1, as an effective equation for estimating the bed load at the Tal-e Zang Station, in order to support activities that depend on bed sediment load estimates. Also, since only a few studies have been conducted so far, these results may be of assistance to operators and consulting companies.
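
For reference, the classical Meyer-Peter and Müller relation named above can be sketched as follows; the Nilsson relation itself is not reproduced here, and the input values are illustrative rather than Tal-e Zang measurements.

```python
# Classical Meyer-Peter & Mueller (1948) bed-load relation, one of the formulae
# evaluated at the station (the Nilsson relation itself is not reproduced here).
# Input values below are illustrative, not Tal-e Zang measurements.
import math

def mpm_bedload(shear_stress, d50, rho_s=2650.0, rho=1000.0, g=9.81, theta_c=0.047):
    """Volumetric bed-load transport rate per unit width (m^2/s)."""
    theta = shear_stress / ((rho_s - rho) * g * d50)     # Shields parameter
    if theta <= theta_c:
        return 0.0
    q_star = 8.0 * (theta - theta_c) ** 1.5              # dimensionless transport rate
    return q_star * math.sqrt((rho_s / rho - 1.0) * g * d50 ** 3)

# Example: bed shear stress of 15 Pa acting on 10 mm gravel
print(f"{mpm_bedload(shear_stress=15.0, d50=0.010):.6f} m^2/s per unit width")
```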

Keywords: bed load, empirical relationship, sediment, Tale Zang Station

Procedia PDF Downloads 358
26527 Leveraging Digital Transformation Initiatives and Artificial Intelligence to Optimize Readiness and Simulate Mission Performance across the Fleet

Authors: Justin Woulfe

Abstract:

Siloed logistics and supply chain management systems throughout the Department of Defense (DOD) have led to disparate approaches to modeling and simulation (M&S), a lack of understanding of how one system impacts the whole, and issues with “optimal” solutions that are good for one organization but have dramatic negative impacts on another. Many different systems have evolved to try to understand and account for uncertainty and try to reduce the consequences of the unknown. As the DoD undertakes expansive digital transformation initiatives, there is an opportunity to fuse and leverage traditionally disparate data into a centrally hosted source of truth. With a streamlined process incorporating machine learning (ML) and artificial intelligence (AI), advanced M&S will enable informed decisions guiding program success via optimized operational readiness and improved mission success. One of the current challenges is to leverage the terabytes of data generated by monitored systems to provide actionable information for all levels of users. The implementation of a cloud-based application analyzing data transactions, learning and predicting future states from current and past states in real-time, and communicating those anticipated states is an appropriate solution for the purposes of reduced latency and improved confidence in decisions. Decisions made from an ML and AI application combined with advanced optimization algorithms will improve the mission success and performance of systems, which will improve the overall cost and effectiveness of any program. The Systecon team constructs and employs model-based simulations, cutting across traditional silos of data, aggregating maintenance and supply data, incorporating sensor information, and applying optimization and simulation methods to an as-maintained digital twin with the ability to aggregate results across a system’s lifecycle and across logical and operational groupings of systems. This coupling of data throughout the enterprise enables tactical, operational, and strategic decision support, detachable and deployable logistics services, and configuration-based automated distribution of digital technical and product data to enhance supply and logistics operations. As a complete solution, this approach significantly reduces program risk by allowing flexible configuration of data, data relationships, business process workflows, and early test and evaluation, especially budget trade-off analyses. A true capability to tie resources (dollars) to weapon system readiness in alignment with the real-world scenarios a warfighter may experience has been an objective yet to be realized to date. By developing and solidifying an organic capability to directly relate dollars to readiness and to inform the digital twin, the decision-maker is now empowered through valuable insight and traceability. This type of educated decision-making provides an advantage over the adversaries who struggle with maintaining system readiness at an affordable cost. The M&S capability developed allows program managers to independently evaluate system design and support decisions by quantifying their impact on operational availability and operations and support cost, resulting in the ability to simultaneously optimize readiness and cost. This will allow the stakeholders to make data-driven decisions when trading cost and readiness throughout the life of the program. Finally, sponsors are available to validate product deliverables with efficiency and much higher accuracy than in previous years.

Keywords: artificial intelligence, digital transformation, machine learning, predictive analytics

Procedia PDF Downloads 155
26526 Participation, Network, Women’s Competency, and Government Policy Affecting on Community Development

Authors: Nopsarun Vannasirikul

Abstract:

The purposes of this research were to study the current situation of community development, women's potential, women's participation, networks, and government policy, as well as to study the factors influencing the effects that women's potential, women's participation, networks, and government policy have on community development. The population included women aged 18 years and over living in communities in the Bangkok area. This study used a mixed-methods design combining quantitative and qualitative methods. A simple random sampling method was utilized to obtain a sample of 400 respondents from the 50 districts of Bangkok, and data were collected using a questionnaire. In addition, a purposive sampling method was utilized to obtain 12 informants for in-depth interviews to gain insights that complement the quantitative findings.

Keywords: community development, participation, network, women’s right, management

Procedia PDF Downloads 168
26525 Developments in Corporate Governance and Economic Growth in Sub-Saharan Africa

Authors: Martha Matashu

Abstract:

This study examined corporate governance and economic growth trends in Sub-Saharan African (SSA) countries. The need for corporate governance arises from the fact that the day-to-day running of the business is done by management, who, in accordance with neoclassical theory and agency theory, have inborn tendencies to use the resources of the company to their advantage. This prevails against a background where endogenous economic growth theory holds the assumption that economic growth is an outcome of the overall performance of all companies within an economy. This suggests that corporate governance at the firm level determines economic growth through its impact on overall performance. Nevertheless, insights from the literature suggest that efforts to promote corporate governance in countries across SSA from the 1980s to date have not yet yielded the desired outcomes. Board responsibilities, shareholder rights, disclosure and transparency, protection of minority shareholders, and liability of directors were thus used as proxies of corporate governance, because these mechanisms are believed to enhance company performance through their effect on accountability and transparency. Using panel data techniques, corporate governance and economic growth data for 29 SSA countries for the period 2008 to 2019 were analysed. The findings revealed a declining economic growth trend despite an increase in corporate governance aspects such as director liability, shareholders' rights, and protection of minority shareholders in SSA countries. These findings contradict the popularly held theoretical principles of economic growth and corporate governance. The study reached the conclusion that a nonlinear relationship exists between corporate governance and economic growth within the selected SSA countries during the period under investigation. This study thus recommends that measures be taken to create conditions for corporate governance that would bolster significant positive contributions to economic growth in the region.
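
A minimal sketch of a two-way fixed-effects panel regression for this relationship is given below using statsmodels; the squared governance term is one simple way to probe the nonlinearity reported. The file and column names are hypothetical stand-ins for the panel variables.

```python
# Two-way fixed-effects sketch for the governance-growth relationship.
# The squared term is one simple way to probe the nonlinearity reported.
# Column names are hypothetical stand-ins for the 2008-2019 panel variables.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("ssa_governance_panel.csv")   # hypothetical country-year panel

model = smf.ols(
    "gdp_growth ~ governance_index + I(governance_index ** 2)"
    " + C(country) + C(year)",
    data=panel,
).fit(cov_type="cluster", cov_kwds={"groups": panel["country"]})

# Focus on the governance terms; a significant squared term would be consistent
# with the nonlinear relationship reported.
print(model.params.filter(like="governance"))
print(model.pvalues.filter(like="governance"))
```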

Keywords: corporate governance, economic growth, Sub-Saharan Africa, agency theory, endogenous theory

Procedia PDF Downloads 141
26524 Hierarchical Filtering Method of Threat Alerts Based on Correlation Analysis

Authors: Xudong He, Jian Wang, Jiqiang Liu, Lei Han, Yang Yu, Shaohua Lv

Abstract:

Nowadays, the threats on the Internet are enormous and increasing; however, the classification of the huge volume of alert messages generated in this environment is relatively monotonous. This affects the accuracy of network situation assessment and also makes it harder for security managers to deal with emergencies. In order to deal with potential network threats effectively and provide more useful data to improve network situation awareness, it is essential to build a hierarchical filtering method. This paper establishes a model for data monitoring that filters the original data systematically to obtain the grade of each threat and stores the results for reuse. Firstly, it filters on vulnerable resources, open ports of host devices, and services. Then, entropy theory is used to calculate the performance changes of the host devices at the time the threat occurs, and the data are filtered again. Finally, the changes in the performance values at the time of the threat are sorted. Alerts and performance data collected in a real network environment are used for evaluation and analysis. The comparative experimental analysis shows that the proposed method can filter threat alerts effectively.
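
The entropy step can be sketched as a comparison of the Shannon entropy of a host's performance-metric distribution before and during an alert window, as below; the metric values are invented for illustration, and the paper's exact entropy formulation may differ.

```python
# Sketch of the entropy step: compare the Shannon entropy of a host's
# performance-metric distribution before and during an alert window.
# The sample values are invented for illustration.
import numpy as np

def shannon_entropy(values, bins=10):
    hist, _ = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
baseline_cpu = rng.normal(30, 3, size=500)                       # % CPU, normal operation
during_alert = np.concatenate([rng.normal(30, 3, 400), rng.normal(85, 5, 100)])

h0, h1 = shannon_entropy(baseline_cpu), shannon_entropy(during_alert)
print(f"baseline entropy={h0:.2f} bits, alert-window entropy={h1:.2f} bits, change={h1 - h0:+.2f}")
```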

Keywords: correlation analysis, hierarchical filtering, multisource data, network security

Procedia PDF Downloads 196
26523 Out of Hospital Cardiac Arrest in Kuala Lumpur: A Mixed Method Study on Incidence, Adherence to Protocol, and Issues

Authors: Mohd Said Nurumal, Sarah Sheikh Abdul Karim

Abstract:

Information regarding the incidence of out-of-hospital cardiac arrest, including outcomes, in Malaysia is limited and fragmented. This study aims to identify the incidence of out-of-hospital cardiac arrest and adherence to protocol, and also to explore the issues faced by pre-hospital personnel in managing cardiac arrest victims in Kuala Lumpur, Malaysia. A mixed-method approach combining qualitative and quantitative study designs was used. The 285 pre-hospital care data sheets of out-of-hospital cardiac arrests during 2011 were examined using checklists to identify the incidence and adherence to protocol. Nine semi-structured interviews and two focus group discussions were performed. Based on the overall out-of-hospital cardiac arrest cases that occurred in 2011 (n=285), the survival rate was 16.8%. For adherence to protocol, only 89 (41.8%) of the cases adhered to the given protocol and 124 did not. The qualitative information provided insight into the issues related to out-of-hospital cardiac arrest in every aspect. All the relevant qualitative data were merged into a few categories of issues that could affect the management of out-of-hospital cardiac arrest performed by the pre-hospital care team. One of the essential elements in the handling of out-of-hospital cardiac arrest by pre-hospital care is to increase survival rates and achieve excellent outcomes by adhering to protocols based on international standard benchmarks. Measures are needed to strengthen the quick activation of the pre-hospital care service, prompt bystander cardiopulmonary resuscitation, early defibrillation, and timely advanced cardiac life support, and also to tackle all the issues highlighted in the qualitative results.

Keywords: pre-hospital care, out of hospital cardiac arrest, incidence, protocol, mixed method research

Procedia PDF Downloads 407
26522 Increasing Employee Productivity and Work Well-Being by Employing Affective Decision Support and a Knowledge-Based System

Authors: Loreta Kaklauskiene, Arturas Kaklauskas

Abstract:

This system for employee productivity and work well-being aims to maximise the work performance of personnel and boost well-being in offices. Affective computing, decision support, and knowledge-based systems were used in our research. The basis of this system is our European Patent application (No: EP 4 020 134 A1) and two Lithuanian patents (LT 6841, LT 6866). Our study examines ways to support efficient employee productivity and well-being by employing a mass-customised, personalised office environment. Efficient employee performance and well-being are managed by changing mass-customised office environment factors such as air pollution levels, humidity, temperature, data, information, knowledge, activities, lighting colours and intensity, scents, media, games, videos, music, and vibrations. These aspects of management generate a customised, adaptive environment for users, taking into account their measured emotional, affective, and physiological (MAP) states, which are fed into the system. This research aims to develop an innovative method and system that analyses, customises and manages a personalised office environment according to a specific user’s MAP states in a cohesive manner. Various values of work spaces (e.g., employee utilitarian, hedonic, perceived values) are also established throughout this process, based on the measurements that describe MAP states and other aspects related to the office environment. The main contribution of our research is the development of a real-time mass-customised office environment to boost employee performance and well-being. Acknowledgment: This work was supported by Project No. 2020-1-LT01-KA203-078100 “Minimizing the influence of coronavirus in a built environment” (MICROBE) from the European Union’s Erasmus+ program.

Keywords: effective decision support and a knowledge-based system, human resource management, employee productivity and work well-being, affective computing

Procedia PDF Downloads 96
26521 A Tool for Facilitating an Institutional Risk Profile Definition

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the easy creation of an institutional risk profile for endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to support the setup of risk factors with just the values that are most important for a particular organisation. Subsequently, the risk profile employs fuzzy models and associated configurations for the file format metadata aggregator to support digital preservation experts with a semi-automatic estimation of the endangerment level of file formats. Our goal is to make use of a domain expert knowledge base aggregated from a digital preservation survey in order to detect preservation risks for a particular institution. Another contribution is support for the visualisation and analysis of risk factors for a required dimension. The proposed methods improve the visibility of risk factor information and the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and automatically aggregated file format metadata from linked open data sources. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector. The goal is to visualise particular dimensions of this vector for analysis by an expert. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.

Keywords: digital information management, file format, endangerment analysis, fuzzy models

Procedia PDF Downloads 397
26520 Carrying Out the Steps of Decision Making Process in Concrete Organization

Authors: Eva Štěpánková

Abstract:

The decision-making process is theoretically clearly defined. Generally, it includes problem identification and analysis, data gathering, goal and criteria setting, alternatives development, choice of the optimal alternative, and its implementation. In practice, however, various modifications of the theoretical decision-making process can occur. Managers may consider some of the phases too complicated or unfeasible and thus not carry them out, and conversely, some of the steps can be overestimated. The aim of the paper is to reveal and characterize how managers perceive the individual phases of the decision-making process. The research is concerned with managers in the military environment, namely commanders. A quantitative survey was conducted cross-sectionally across the individual levels of management of the Ministry of Defence of the Czech Republic. Based on a total of 135 respondents, the analysis focuses on which phases of the decision-making process are problematic or not carried out in practice and which are perceived to be the easiest. The reasons for these findings are then examined.

Keywords: decision making, decision making process, decision problems, concrete organization

Procedia PDF Downloads 467
26519 A Review of Methods for Handling Missing Data in the Form of Dropouts in Longitudinal Clinical Trials

Authors: A. Satty, H. Mwambi

Abstract:

Much clinical trial data-based research is characterized by the unavoidable problem of dropout as a result of missing or erroneous values. This paper aims to review some of the various techniques to address the dropout problem in longitudinal clinical trials. The fundamental concepts of the patterns and mechanisms of dropout are discussed. This study presents five general techniques for handling dropout: (1) deletion methods; (2) imputation-based methods; (3) data augmentation methods; (4) likelihood-based methods; and (5) MNAR-based methods. Under each technique, several methods that are commonly used to deal with dropout are presented, including a review of the existing literature in which we examine the effectiveness of these methods in the analysis of incomplete data. Two application examples are presented to study the potential strengths or weaknesses of some of the methods under certain dropout mechanisms, as well as to assess the sensitivity of the modelling assumptions.
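
Two of the simpler techniques listed, deletion and imputation, can be illustrated on a toy longitudinal table as below (complete-case deletion, last observation carried forward, and visit-wise mean imputation); more principled approaches such as multiple imputation or likelihood-based models are generally preferred under MAR, and this sketch only shows the mechanics.

```python
# Toy illustration of two simple dropout-handling techniques from the review:
# complete-case deletion and imputation (LOCF and visit-wise mean imputation).
import numpy as np
import pandas as pd

trial = pd.DataFrame({
    "subject": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "visit":   [0, 1, 2, 0, 1, 2, 0, 1, 2],
    "outcome": [5.0, 5.4, 5.9, 6.1, 6.0, np.nan, 4.8, np.nan, np.nan],  # dropouts
})

# 1) Complete-case deletion: keep only subjects observed at every visit
complete = trial.groupby("subject").filter(lambda g: g["outcome"].notna().all())

# 2a) Last observation carried forward (LOCF) within each subject
locf = trial.copy()
locf["outcome"] = locf.groupby("subject")["outcome"].ffill()

# 2b) Visit-wise mean imputation
mean_imp = trial.copy()
mean_imp["outcome"] = mean_imp.groupby("visit")["outcome"].transform(
    lambda s: s.fillna(s.mean())
)
print(complete, locf, mean_imp, sep="\n\n")
```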

Keywords: incomplete longitudinal clinical trials, missing at random (MAR), imputation, weighting methods, sensitivity analysis

Procedia PDF Downloads 409