Search results for: Patient record data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27193

24343 Optimizing Electric Vehicle Charging with Charging Data Analytics

Authors: Tayyibah Khanam, Mohammad Saad Alam, Sanchari Deb, Yasser Rafat

Abstract:

Electric vehicles are considered viable replacements for gasoline cars, since they help reduce harmful emissions and stimulate power generation from renewable energy sources, thereby contributing to sustainability. However, one of the significant obstacles to the mass deployment of electric vehicles is charging-time anxiety among users and, consequently, the long waiting times for available chargers at charging stations. Data analytics, on the other hand, has revolutionized the decision-making tasks of management and operating systems since its arrival. In this paper, we attempt to optimize the choice of EV charging stations for users in their vicinity by minimizing the time taken to reach a charging station and the waiting time for an available charger. Travel time to each charging station is obtained from the Google Maps API, and waiting times are predicted by polynomial regression on stored historical data. The proposed framework utilizes real-time and historical data from all operating charging stations in the city, assists the user in finding the charging station best suited to their current situation, and can be implemented in a mobile phone application. The algorithm successfully predicts the optimal choice of charging station and the minimum required time for various sample data sets.
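
A minimal sketch of the station-ranking logic described above, assuming a hypothetical travel_time_fn stub in place of the Google Maps API call and synthetic waiting-time history; the polynomial degree and station fields are illustrative, not the authors' implementation:

```python
import numpy as np

def fit_wait_model(hours, waits, degree=3):
    """Fit a polynomial regression of historical waiting time vs. hour of day."""
    return np.polyfit(hours, waits, degree)

def predict_wait(model, hour):
    return float(np.polyval(model, hour))

def best_station(stations, hour, travel_time_fn):
    """Rank stations by travel time plus predicted waiting time.

    travel_time_fn(station) is a stand-in for a Google Maps API call.
    """
    def total_time(s):
        return travel_time_fn(s) + predict_wait(s["wait_model"], hour)
    return min(stations, key=total_time)

# Illustrative use with synthetic history for a single station
rng = np.random.default_rng(0)
hours = rng.uniform(0, 24, 200)
waits = 10 + 8 * np.sin(hours / 24 * 2 * np.pi) + rng.normal(0, 2, 200)
station = {"name": "A", "wait_model": fit_wait_model(hours, waits)}
print(best_station([station], hour=18.5, travel_time_fn=lambda s: 12.0)["name"])
```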

Keywords: charging data, electric vehicles, machine learning, waiting times

Procedia PDF Downloads 186
24342 Finding Data Envelopment Analysis Targets Using Multi-Objective Programming in DEA-R with Stochastic Data

Authors: R. Shamsi, F. Sharifi

Abstract:

In this paper, we obtain the projection of inefficient units in data envelopment analysis (DEA) with stochastic inputs and outputs, using a multi-objective programming (MOP) structure. In some problems the inputs may be stochastic while the outputs are deterministic, and vice versa. For such cases we propose a multi-objective DEA-R model, because in some instances (e.g., when unnecessary and irrational weights in the BCC model reduce the efficiency score) a decision-making unit (DMU) that is actually efficient is reported as inefficient by the BCC model, whereas the same DMU is considered efficient by the DEA-R model. In other cases, only ratios of the stochastic data may be available (e.g., the ratio of stochastic inputs to stochastic outputs). We therefore provide a multi-objective DEA model without explicit outputs and prove that the input-oriented MOP DEA-R model under constant returns to scale can be replaced by the MOP-DEA model without explicit outputs under variable returns to scale, and vice versa. Using interactive methods to solve the proposed model yields a projection corresponding to the viewpoints of the DM and the analyst, which is closer to reality and more practical. Finally, an application is provided.
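
For orientation, the deterministic input-oriented CCR (constant-returns-to-scale) envelopment model, the classical building block that DEA-R and the stochastic MOP extensions above generalize, can be written as a small linear program. This sketch uses SciPy and synthetic data; it does not reproduce the paper's stochastic DEA-R model:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency score of DMU o.
    X: inputs, shape (m, n DMUs); Y: outputs, shape (s, n DMUs)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # minimize theta
    A_in = np.hstack([-X[:, [o]], X])           # X @ lam <= theta * x_o
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # Y @ lam >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
    bounds = [(0, None)] * (n + 1)              # theta and lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

X = np.array([[2.0, 4.0, 3.0],   # input 1 for DMUs 0..2
              [3.0, 1.0, 3.0]])  # input 2
Y = np.array([[1.0, 1.0, 1.0]])  # single output
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])
# -> [1.0, 1.0, 0.833]: DMU 2 is dominated by a mix of DMUs 0 and 1
```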

Keywords: DEA-R, multi-objective programming, stochastic data, data envelopment analysis

Procedia PDF Downloads 104
24341 Role of Vision Centers in Eliminating Avoidable Blindness Caused Due to Uncorrected Refractive Error in Rural South India

Authors: Ranitha Guna Selvi D, Ramakrishnan R, Mohideen Abdul Kader

Abstract:

Purpose: To study the role of vision centers in managing preventable blindness through refractive error correction in rural South India. Methods: A retrospective analysis of patients attending 15 vision centers in rural South India over the period January 2021 to December 2021 was done. Medical records of 1,08,581 patients, both new and reviewed (79,562 newly registered patients and 29,019 review patients from the 15 vision centers), were included for data analysis. All patients registered at the vision centers underwent a basic eye examination, including visual acuity, IOP measurement, slit-lamp examination, retinoscopy, fundus examination, etc. Results: A total of 1,08,581 patients were included in the study. Of these, 79,562 were newly registered patients at a vision center and 29,019 were review patients. Among them, 52,201 (48.1%) were male and 56,308 (51.9%) were female. The mean age of all examined patients was 41.03 ± 20.9 years (standard deviation), ranging from 1 to 113 years. Presenting mean visual acuity was 0.31 ± 0.5 in the right eye and 0.31 ± 0.4 in the left eye. Of the 1,08,581 patients, 22,770 had refractive error in the right eye and 22,721 had uncorrected refractive error in the left eye. A glass prescription was given to 17,178 (15.8%) patients, and 8,109 (7.5%) patients were referred to the base hospital for specialty clinic expert opinion or for cataract surgery. Conclusion: A vision center utilizing teleconsultation as a comprehensive eye-screening unit is a very effective tool in reducing avoidable visual impairment caused by uncorrected refractive error. The vision centre model is believed to be efficient as it facilitates early detection and management of uncorrected refractive errors.

Keywords: refractive error, uncorrected refractive error, vision center, vision technician, teleconsultation

Procedia PDF Downloads 138
24340 Integrated Model for Enhancing Data Security Processing Time in Cloud Computing

Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali

Abstract:

Cloud computing is an important and promising field of the recent decade. Cloud computing allows the sharing of resources, services, and information among people across the whole world. Although the advantages of using clouds are great, there are many risks, and data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed to ensure a secure communication system, hide information from other users, and save users' time. In the proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For user authentication, a simple username and password scheme is used, with the password protected by SHA-2 as a one-way hash. The proposed system shows an improvement in the processing time of uploading and downloading files on the cloud in secure form.
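
A minimal sketch of the primitives named above, assuming the PyCryptodome library; a production system would add salted key derivation for passwords and authenticated encryption, details the abstract does not cover:

```python
import hashlib
from Crypto.Cipher import Blowfish
from Crypto.Random import get_random_bytes

BS = Blowfish.block_size  # 8 bytes

def pad(data: bytes) -> bytes:
    """PKCS#7-style padding to the Blowfish block size."""
    n = BS - len(data) % BS
    return data + bytes([n]) * n

def encrypt(key: bytes, plaintext: bytes):
    """Blowfish-CBC encryption plus a SHA-256 digest for integrity checking."""
    cipher = Blowfish.new(key, Blowfish.MODE_CBC)
    ciphertext = cipher.encrypt(pad(plaintext))
    digest = hashlib.sha256(plaintext).hexdigest()  # SHA-2 integrity tag
    return cipher.iv, ciphertext, digest

def hash_password(password: str) -> str:
    """One-way SHA-256 hash of the stored credential."""
    return hashlib.sha256(password.encode()).hexdigest()

iv, ct, tag = encrypt(get_random_bytes(16), b"file contents to upload")
print(len(ct), tag[:16], hash_password("secret")[:16])
```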

Keywords: cloud computing, data security, SaaS, PaaS, IaaS, Blowfish

Procedia PDF Downloads 353
24339 Comparison of Statistical Methods for Estimating Missing Precipitation Data in the River Subbasin Lenguazaque, Colombia

Authors: Miguel Cañon, Darwin Mena, Ivan Cabeza

Abstract:

In this work, the applicability of statistical methods for estimating missing precipitation data was compared and evaluated in the basin of the river Lenguazaque, located in the departments of Cundinamarca and Boyacá, Colombia. The methods used were simple linear regression, the distance-ratio method, local averages, mean ratios, correlation with nearby stations, and multiple regression. The effectiveness of the methods was assessed using three statistical tools: the correlation coefficient (r²), the standard error of estimation, and the Bland and Altman test of agreement. The analysis was performed by randomly removing real rainfall values in each of the seasons and then estimating them with the methodologies mentioned, in order to complete the missing data values. It was thus determined that, under the conditions encountered, the methods with the highest performance and accuracy in estimating the data were multiple regression with three nearby stations and a random application scheme supported by the precipitation behavior of related data sets.
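
As an illustration of the best-performing approach, multiple regression against three nearby stations, the following sketch fits ordinary least squares on jointly observed days and fills the gaps; the data are synthetic and the study's exact regression scheme is not reproduced:

```python
import numpy as np

def fill_missing(target, neighbors):
    """Estimate missing values of a target rain gauge from three nearby
    stations by ordinary least squares fitted on jointly observed days.

    target: (n,) array with np.nan at missing days
    neighbors: (n, 3) array of nearby-station precipitation
    """
    obs = ~np.isnan(target)
    A = np.column_stack([np.ones(obs.sum()), neighbors[obs]])
    coef, *_ = np.linalg.lstsq(A, target[obs], rcond=None)
    filled = target.copy()
    miss = ~obs
    filled[miss] = np.column_stack([np.ones(miss.sum()), neighbors[miss]]) @ coef
    return np.clip(filled, 0, None)   # precipitation cannot be negative

rng = np.random.default_rng(1)
nb = rng.gamma(2.0, 5.0, size=(365, 3))           # three neighbor gauges
y = nb @ np.array([0.4, 0.3, 0.3]) + rng.normal(0, 1, 365)
y[rng.choice(365, 30, replace=False)] = np.nan     # 30 missing days
print(np.isnan(fill_missing(y, nb)).sum())         # 0: all gaps filled
```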

Keywords: statistical comparison, precipitation data, river subbasin, Bland and Altman

Procedia PDF Downloads 464
24338 The Buccal Fat Pad for Closure of Oroantral Communication

Authors: Stefano A. Denes, Riccardo Tieghi, Giovanni Elia

Abstract:

The buccal fat pad is a well-established tool in oral and maxillofacial surgery, and its use has proved of value for the closure of oroantral communications. Oroantral communication can be a common complication after sequestrectomy in bisphosphonate-related osteonecrosis of the jaws. We report the clinical case of a 70-year-old female patient on bisphosphonate therapy who presented with right maxillary sinusitis and oroantral communication after implant insertion. The buccal fat pad was used to close the defect. The case had an uneventful postoperative healing without dehiscence, infection, or necrosis. We postulate that primary closure of the site with the buccal fat pad may ensure a sufficient blood supply and adequate protection for an effective bone-healing response to occur.

Keywords: buccal fat pad, oroantral communication, oral surgery, dehiscence

Procedia PDF Downloads 344
24337 Hyperspectral Data Classification Algorithm Based on the Deep Belief and Self-Organizing Neural Network

Authors: Li Qingjian, Li Ke, He Chun, Huang Yong

Abstract:

In this paper, a method combining a deep belief network (DBN) with a self-organizing neural network is proposed to classify targets. The method is mainly aimed at the high nonlinearity of hyperspectral images, the high sample dimensionality, and the difficulty of designing a classifier. The main features of the original data are extracted by the deep belief network; during feature extraction, known labeled samples are added to fine-tune the network and enrich the extracted features. The extracted feature vectors are then classified by the self-organizing neural network. This method can effectively reduce the dimensionality of the data along the spectral dimension while preserving large amounts of the raw data information, addressing both the limits of traditional clustering and the long training times of deep learning algorithms when labeled samples are scarce, and improving classification accuracy and robustness. Simulation results show that the proposed network structure achieves higher classification precision when only a small number of labeled samples is available.
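
A rough sketch of the pipeline with off-the-shelf components: a stack of restricted Boltzmann machines (scikit-learn) standing in for the DBN feature extractor, and a self-organizing map (the third-party minisom package) for the final clustering. The label-based fine-tuning step of the paper is omitted here, and all sizes are illustrative:

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline
from minisom import MiniSom

rng = np.random.default_rng(0)
X = rng.random((500, 64))            # placeholder for hyperspectral pixels

# Stacked RBMs as an unsupervised stand-in for DBN feature extraction
dbn = Pipeline([
    ("rbm1", BernoulliRBM(n_components=32, n_iter=10, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=16, n_iter=10, random_state=0)),
])
features = dbn.fit_transform(X)

# Cluster the extracted features with a self-organizing map
som = MiniSom(4, 4, features.shape[1], sigma=1.0, learning_rate=0.5,
              random_seed=0)
som.train_random(features, 1000)
labels = [som.winner(f) for f in features]   # winning grid cell = cluster id
print(labels[:5])
```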

Keywords: DBN, SOM, pattern classification, hyperspectral, data compression

Procedia PDF Downloads 338
24336 Assessing Performance of Data Augmentation Techniques for a Convolutional Network Trained for Recognizing Humans in Drone Images

Authors: Masood Varshosaz, Kamyar Hasanpour

Abstract:

In recent years, there has been growing interest in recognizing humans in drone images for post-disaster search and rescue operations. Deep learning algorithms have shown great promise in this area, but they often require large amounts of labeled data to train the models. To keep data acquisition costs low, augmentation techniques can be used to create additional training data from existing images. Many such techniques can generate variations of an original image to improve the performance of deep learning algorithms. While data augmentation is generally assumed to improve the accuracy and robustness of the models, it is important to ensure that the performance gains are not outweighed by the additional computational cost or complexity of implementing the techniques. To this end, the impact of data augmentation on the performance of the deep learning models must be evaluated. In this paper, we evaluated the most common currently available 2D data augmentation techniques on a standard convolutional network trained to recognize humans in drone images. The techniques include rotation, scaling, random cropping, flipping, shifting, and their combinations. The results showed that the augmented models perform 1-3% better than a base network. However, as the augmented images only contain the human parts already visible in the original images, a new data augmentation approach is needed to include the invisible parts of the human body. We therefore suggest a new method that employs simulated 3D human models to generate new data for training the network.
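
The evaluated 2D augmentations can be expressed, for example, with torchvision transforms; the parameter values below are illustrative, not those used in the experiments:

```python
import torch
from torchvision import transforms

# The evaluated augmentation family: rotation, scaling, cropping, flip, shift
augment = transforms.Compose([
    transforms.RandomRotation(degrees=15),
    transforms.RandomResizedCrop(size=224, scale=(0.8, 1.0)),  # scale + crop
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.RandomAffine(degrees=0, translate=(0.1, 0.1)),  # shift
])

image = torch.rand(3, 256, 256)      # placeholder for a drone frame
augmented = augment(image)
print(augmented.shape)               # torch.Size([3, 224, 224])
```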

Keywords: human recognition, deep learning, drones, disaster mitigation

Procedia PDF Downloads 88
24335 The Efficiency Analysis in the Health Sector: Marmara Region

Authors: Hale Kirer Silva Lecuna, Beyza Aydin

Abstract:

Health is one of the main components of human capital and sustainable development, and it is very important for economic growth. Health economics, an indisputable part of the science of economics, has five stages in general: health and development, financing of health services, economic regulation in health, allocation of resources, and efficiency of health services. A well-developed and efficient health sector plays a major role by increasing the level of development of countries. The most crucial pillars of the health sector are hospitals, which are divided into public and private. The main purpose of hospitals is to provide more efficient services; the aim, therefore, is to meet patients' satisfaction by increasing service quality. Health-related studies in Turkey date back to the Ottoman and Seljuk Empires. In the recent past, Turkey applied 'Health Sector Transformation Programs' under different titles between 2003 and 2010. Our aim in this paper is to measure how effective these transformation programs have been for the health sector, to see how much they have increased the efficiency of hospitals over the years, to assess the return on investments, to comment on and make suggestions about the results, and to provide a new reference for the literature. Within this framework, the public and private hospitals in Balıkesir, Bilecik, Bursa, Çanakkale, Edirne, Istanbul, Kirklareli, Kocaeli, Sakarya, Tekirdağ, and Yalova are examined using data envelopment analysis (DEA) for the years 2000 to 2019. DEA is a linear programming-based technique which gives relatively good results in multivariate studies; it essentially estimates an efficiency frontier and makes comparisons against it. Constant returns to scale and variable returns to scale are the two most commonly used DEA models, and both can be input- or output-oriented. To analyze the data, the number of personnel, number of specialist physicians, number of practitioners, number of beds, and number of examinations are used as input variables, and the number of surgeries, the in-patient ratio, and the crude mortality rate as output variables. Eleven hospitals belonging to the Marmara region were included in the study. These hospitals worked effectively in only 7 provinces (Balıkesir, Bilecik, Bursa, Edirne, İstanbul, Kırklareli, Yalova) in 2001, when no transformation program was implemented. After the transformation program was implemented, for example in 2014 and 2016, 10 hospitals (Balıkesir, Bilecik, Bursa, Çanakkale, Edirne, İstanbul, Kocaeli, Kırklareli, Tekirdağ, Yalova) were found to be effective. In 2015, ineffective results were observed for Sakarya, Tekirdağ, and Yalova. However, since these values are closer to 1 after the transformation program, we can say that the program has had positive effects. For Sakarya alone, no effective results were achieved in any year. Overall, the results show that the transformation program has a positive effect on the effectiveness of hospitals.

Keywords: data envelopment analysis, efficiency, health sector, Marmara region

Procedia PDF Downloads 128
24334 Safer Staff: A Survey of Staff Experiences of Violence and Aggression at Work in Coventry and Warwickshire Partnership National Health Service Trust

Authors: Rupinder Kaler, Faith Ndebele, Nadia Saleem, Hafsa Sheikh

Abstract:

Background: Workplace-related violence and aggression seem to be considered an acceptable occupational hazard for staff in mental health services. There is literature evidence that healthcare workers in mental health settings are at higher risk of aggression from patients. Aggressive behaviours pose a physical and psychological threat to psychiatric staff and can result in stress, burnout, sickness, and exhaustion. Further evidence shows that health professionals are among the most exposed to psychological disorders such as anxiety, depression, and post-traumatic stress disorder. The fear that results from working in a dangerous environment, together with exhaustion, can have a damaging impact on patient care and the healthcare relationship. Aim: The aim of this study is to investigate the prevalence and impact of aggressive behaviour on staff working at Coventry and Warwickshire Partnership Trust (CWPT). Methodology: The study methodology included a manual, anonymised, multi-disciplinary, cross-sectional survey questionnaire across all clinical and non-clinical staff at CWPT, from both inpatient and community settings. Findings: The unsurprising finding was a higher prevalence of aggressive behaviours towards inpatient staff in comparison to community staff. Conclusion: There is a high rate of verbal and physical aggression at work, and this has a negative impact on staff emotional and physical well-being. There is also a higher reliance on colleagues for support on an informal basis than on formal organisational support systems. Recommendations: A workforce that is well and functioning is the biggest resource of an organisation. Staff safety during working hours is everyone's responsibility and sits with both individual staff members and the organisation. The authors recommend the development of preventative and practical protocols for aggression, with patient and carer involvement. Post-incident organisational support needs to be consolidated, with hands-on, timely support offered to help maintain emotionally well staff at CWPT.

Keywords: safer staff, survey of staff experiences, violence and aggression, mental health

Procedia PDF Downloads 201
24333 Robotic Logging Technology: The Future of Oil Well Logging

Authors: Nitin Lahkar, Rishiraj Goswami

Abstract:

“Oil well logging”, the practice of making a detailed record (a well log) of the geologic formations penetrated by a borehole, is an important practice in the oil and gas industry. Although a lot of research has been undertaken in this field, some basic limitations still exist. One of the main venues where a plethora of problems arises is logistically challenged areas, where access to efficient manpower, resources, and technology is time consuming, restricted, and often costly. In this regard, the main challenge is to decrease the non-productive time (NPT) involved in the conventional logging process. Thinking about a solution to this problem has given rise to a revolutionary concept called “robotic logging technology”. Robotic logging technology promises successful logging in all kinds of wells and trajectories. It consists of a wireless logging tool controlled from the surface, which eliminates the need to summon a logging truck and in turn saves precious rig time. The robotic logging tool is designed so that it can move inside the well by the different mechanisms and models proposed in the full paper as TYPE A, TYPE B, and TYPE C. These types are classified on the basis of their operational technology, movement, and the conditions/wells in which the tool is to be used. Thus, depending on subsurface conditions, available energy sources, and convenience, the TYPE of robotic model will be selected. Advantages over conventional logging techniques: reduced non-productive time, lower energy requirements, much faster operation than all other forms of logging, and good performance in all kinds of well trajectories (vertical/horizontal/inclined).

Keywords: robotic logging technology, innovation, geology, geophysics

Procedia PDF Downloads 299
24332 Emotional Artificial Intelligence and the Right to Privacy

Authors: Emine Akar

Abstract:

The majority of privacy-related regulation has traditionally focused on concepts that are perceived to be well understood or easily describable, such as certain categories of data and personal information or images. In the past century, such regulation appeared reasonably suitable for its purposes. However, technologies such as AI, combined with ever-increasing capabilities to collect, process, and store “big data”, not only require recalibration of these traditional understandings but may require rethinking of entire categories of privacy law. The presentation will explain, against the background of various emerging technologies under the umbrella term “emotional artificial intelligence”, why modern privacy law will need to embrace human emotions as potentially private subject matter. This argument can be made on a jurisprudential level, given that human emotions can plausibly be accommodated within the various concepts traditionally regarded as the underlying foundation of privacy protection, such as, for example, dignity, autonomy, and liberal values. However, the practical reasons for regarding human emotions as potentially private subject matter are perhaps more important (and very likely more convincing from the perspective of regulators). In that respect, it should be regarded as alarming that, according to most projections, the usefulness of emotional data to governments and, particularly, private companies will not only lead to radically increased processing and analysis of such data but, concerningly, to exponential growth in its collection. In light of this, it is also necessary to discuss options for how regulators could address this emerging threat.

Keywords: AI, privacy law, data protection, big data

Procedia PDF Downloads 85
24331 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform

Authors: Reza Mohammadzadeh

Abstract:

The major challenges in geotechnical engineering in underground spaces arise from uncertainties and differing probabilities. The collection, collation, and collaboration of existing data, so that they can be incorporated in analysis and design for a given prospect evaluation, is a reliable and practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence in statistical science which applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming) to data to automatically learn and improve from them without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining based on a cloud system architecture has been designed. A new approach to risk assessment using a three-dimensional risk matrix supported by the level of knowledge (LoK) is proposed in this model. The stages of the model workflow methodology are then described. To train the data and deploy the LoK models, an ML platform has been implemented: IBM Watson Studio, a leading data science tool and data-driven cloud-integrated ML platform, is employed in this study. As a use case, a data set of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.
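
Purely as an illustration of the idea of a three-dimensional risk matrix scaled by the level of knowledge, the following hypothetical sketch inflates a likelihood-consequence rating when knowledge is poor; the scales, factors, and thresholds are invented for the example and are not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Hazard:
    name: str
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    consequence: int  # 1 (negligible) .. 5 (catastrophic)
    lok: int          # level of knowledge: 1 (poor) .. 3 (well understood)

def risk_rating(h: Hazard) -> float:
    """Likelihood x consequence, inflated when knowledge is poor so the
    assessment stays conservative (illustrative factors only)."""
    lok_factor = {1: 1.5, 2: 1.2, 3: 1.0}[h.lok]
    return h.likelihood * h.consequence * lok_factor

def risk_class(score: float) -> str:
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

h = Hazard("roof fall near longwall face", likelihood=3, consequence=4, lok=1)
print(risk_class(risk_rating(h)))  # high
```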

Keywords: data model, geotechnical risks, machine learning, underground coal mining

Procedia PDF Downloads 269
24330 Classification of Poverty Level Data in Indonesia Using the Naïve Bayes Method

Authors: Anung Style Bukhori, Ani Dijah Rahajoe

Abstract:

Poverty poses a significant challenge in Indonesia, requiring an effective analytical approach to understand and address the issue. In this research, we applied the Naïve Bayes classification method to examine and classify poverty data in Indonesia. The main focus is on classifying data using RapidMiner, a powerful data analysis platform. The analysis process involves splitting the data to train and test the classification model. First, we collected and prepared a poverty dataset that includes various factors such as education, employment, and health. The experimental results indicate that the Naïve Bayes classification model can provide accurate predictions regarding the risk of poverty, and the use of RapidMiner in the analysis process offers flexibility and efficiency in evaluating the model's performance. The classification produces several values that serve as the standard for classifying poverty data in Indonesia using Naïve Bayes: the accuracy obtained is 40.26%, with a recall of 35.94% for the moderate class, 63.16% for the high class, and 38.03% for the low class, and a precision of 58.97% for the moderate class, 17.39% for the high class, and 58.70% for the low class.
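
The same split/train/test workflow can be reproduced outside RapidMiner, for example with scikit-learn's Gaussian Naïve Bayes; the features below are synthetic placeholders for the education, employment, and health indicators:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X = rng.random((1000, 3))             # placeholder indicator values
y = rng.integers(0, 3, 1000)          # 0 = low, 1 = moderate, 2 = high poverty

# Hold out a test split, fit Naive Bayes, and report per-class recall/precision
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = GaussianNB().fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te),
                            target_names=["low", "moderate", "high"]))
```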

Keywords: poverty, classification, naïve Bayes, Indonesia

Procedia PDF Downloads 49
24329 Web Search Engine Based Naming Procedure for Independent Topic

Authors: Takahiro Nishigaki, Takashi Onoda

Abstract:

In recent years, the amount of document data has been increasing with the spread of the Internet, and many methods have been studied for extracting topics from large document collections. We previously proposed Independent Topic Analysis (ITA), a method that uses Independent Component Analysis to extract topics independent of each other from large document data such as newspaper archives. A topic extracted by ITA is represented by a set of words; however, such a set can be quite different from the topic the user imagines. For example, the top five words with the highest independence for one topic are Topic1 = {"scor", "game", "lead", "quarter", "rebound"}, which can be taken to represent the topic "SPORTS"; but the topic name "SPORTS" has to be attached by the user, since ITA itself cannot name topics. In this research, we therefore propose a method to obtain topic names that are easy for people to understand by submitting the set of words given by independent topic analysis to a web search engine. In particular, we search for the set of topical words and take the title of the homepage of the top search result as the topic name. We apply the proposed method to several data sets and verify its effectiveness.
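
A sketch of the naming step, assuming a hypothetical search_urls wrapper around whichever web search engine API is available; only the title extraction uses concrete libraries (requests and BeautifulSoup):

```python
import requests
from bs4 import BeautifulSoup

def page_title(url: str) -> str:
    """Fetch a page and return its <title> text."""
    html = requests.get(url, timeout=10).text
    title = BeautifulSoup(html, "html.parser").title
    return title.get_text(strip=True) if title else ""

def name_topic(topic_words, search_urls):
    """Name a topic by the homepage title of the top search hit.

    search_urls(query) -> list[str] is a hypothetical wrapper around
    a web search engine API; it is not a real library call.
    """
    query = " ".join(topic_words)
    hits = search_urls(query)
    return page_title(hits[0]) if hits else query

# Example (with a user-supplied search function):
# name_topic(["scor", "game", "lead", "quarter", "rebound"], my_search_fn)
```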

Keywords: independent topic analysis, topic extraction, topic naming, web search engine

Procedia PDF Downloads 116
24328 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas

Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards

Abstract:

Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, and spatial analysis. A laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data consist mainly of ground points (which represent the bare earth) and non-ground points (which represent buildings, trees, cars, etc.); removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task, as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data, and the presented filter utilizes a weight function to allocate a weight to each data point. Furthermore, unlike most methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples; the average root mean square error (RMSE) is 0.35 m.
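
A sketch of the iteratively re-weighted smoothing-spline idea on synthetic data, using SciPy's SmoothBivariateSpline; the paper's specific weight function is not reproduced, only the general mechanism of down-weighting points that sit well above the fitted surface:

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

def filter_ground(x, y, z, iterations=5, s_factor=None):
    """Iteratively re-weighted smoothing-spline ground filter (sketch).

    Points far above the fitted surface (vegetation returns) get their
    weight reduced; points near or below keep full weight.
    """
    w = np.ones_like(z)
    for _ in range(iterations):
        spl = SmoothBivariateSpline(x, y, z, w=w, s=s_factor)
        resid = z - spl.ev(x, y)
        sigma = np.std(resid[w > 0.5]) + 1e-9
        w = np.where(resid > sigma, np.exp(-(resid / sigma) ** 2), 1.0)
    return w > 0.5, spl    # ground mask and final terrain surface

rng = np.random.default_rng(2)
x, y = rng.uniform(0, 100, 2000), rng.uniform(0, 100, 2000)
ground = 0.02 * x + 0.01 * y + rng.normal(0, 0.05, 2000)
canopy = rng.random(2000) < 0.3               # 30% vegetation returns
z = ground + np.where(canopy, rng.uniform(2, 15, 2000), 0.0)
mask, _ = filter_ground(x, y, z)
print(f"kept {mask.sum()} of {len(z)} points as ground")
```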

Keywords: airborne laser scanning, digital terrain models, filtering, forested areas

Procedia PDF Downloads 135
24327 Estimating the Life-Distribution Parameters of Weibull-Life PV Systems Utilizing Non-Parametric Analysis

Authors: Saleem Z. Ramadan

Abstract:

In this paper, a model is proposed to determine the life-distribution parameters of the useful-life region of PV systems, utilizing a combination of non-parametric and linear regression analysis of the failure data for these systems. Results showed that this method is dependable for analyzing failure-time data for such reliable systems when data are scarce.
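
The abstract does not spell out the procedure, but a standard combination of non-parametric ranks with linear regression is median-rank regression for the Weibull parameters, sketched below on synthetic failure times:

```python
import numpy as np

def weibull_mrr(failure_times):
    """Median-rank regression estimate of the Weibull shape (beta) and
    scale (eta) from complete failure data."""
    t = np.sort(np.asarray(failure_times, dtype=float))
    n = len(t)
    i = np.arange(1, n + 1)
    F = (i - 0.3) / (n + 0.4)            # Bernard's median-rank approximation
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))         # linearized Weibull CDF
    beta, intercept = np.polyfit(x, y, 1)
    eta = np.exp(-intercept / beta)      # y = beta*ln t - beta*ln eta
    return beta, eta

rng = np.random.default_rng(3)
sample = 20000 * rng.weibull(2.5, size=30)   # synthetic PV failure times (h)
print(weibull_mrr(sample))                    # roughly (2.5, 20000)
```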

Keywords: masking, bathtub model, reliability, non-parametric analysis, useful life

Procedia PDF Downloads 556
24326 Preliminary Design of Maritime Energy Management System: Naval Architectural Approach to Resolve Recent Limitations

Authors: Seyong Jeong, Jinmo Park, Jinhyoun Park, Boram Kim, Kyoungsoo Ahn

Abstract:

Energy management in the maritime industry is being driven by economics and by new legislative actions taken by the International Maritime Organization (IMO) and the European Union (EU). In response, various performance monitoring methodologies and data collection practices have been examined by different stakeholders. While many assorted advancements in operation and technology are applicable, their adoption in the shipping industry remains limited. This slow uptake can be attributed to many different barriers, such as data analysis problems, misreported data, and feedback problems. This study presents a conceptual design of an energy management system (EMS) and proposes a methodology to resolve these limitations (e.g., data normalization using naval architectural evaluation, management of misreported data, and feedback from shore to ship through management of performance analysis history). We expect this system to enable even short-term charterers to assess ship performance properly and to implement sustainable fleet control.

Keywords: data normalization, energy management system, naval architectural evaluation, ship performance analysis

Procedia PDF Downloads 445
24325 Kinoform Optimisation Using Gerchberg-Saxton Iterative Algorithm

Authors: M. Al-Shamery, R. Young, P. Birch, C. Chatwin

Abstract:

Computer Generated Holography (CGH) is employed to create digitally defined coherent wavefronts. A CGH can be created using different techniques, such as the detour-phase technique or direct phase modulation to create a kinoform. The detour-phase technique was one of the first techniques used to generate holograms digitally. The disadvantage of this technique is that the reconstructed image often has poor quality, due to the limited dynamic range it is possible to record using a medium with reasonable spatial resolution. The kinoform (phase-only hologram) is an alternative technique. In this method, the phase of the original wavefront is recorded but the amplitude is constrained to be constant. The original object does not need to exist physically, so the kinoform can be used to reconstruct an almost arbitrary wavefront. However, the image reconstructed by this technique contains high levels of noise and is not identical to the reference image. To improve the reconstruction quality of the kinoform, iterative techniques such as the Gerchberg-Saxton (GS) algorithm are employed. In this paper the GS algorithm is described for the optimisation of a kinoform used for the reconstruction of a complex wavefront. Iterations of the GS algorithm are applied to determine the phase at a plane (with a known amplitude distribution, often taken as uniform) that satisfies given phase and amplitude constraints in a corresponding Fourier plane. The GS algorithm can be used in this way to enhance the reconstruction quality of the kinoform. Different images are employed as the reference object and their kinoforms are synthesised using the GS algorithm. The quality of the reconstructed images is quantified to demonstrate the enhanced reconstruction quality achieved by using this method.
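
A compact sketch of the GS iteration for a kinoform, alternating between the uniform-amplitude constraint in the hologram plane and the target-amplitude constraint in the Fourier (image) plane:

```python
import numpy as np

def gerchberg_saxton(target_amp, iterations=50):
    """Optimize a kinoform (phase-only hologram) whose far-field
    reconstruction approximates target_amp."""
    phase = np.exp(1j * 2 * np.pi * np.random.rand(*target_amp.shape))
    image_field = target_amp * phase
    for _ in range(iterations):
        holo_field = np.fft.ifft2(image_field)
        # Kinoform constraint: uniform amplitude, keep only the phase
        holo_field = np.exp(1j * np.angle(holo_field))
        image_field = np.fft.fft2(holo_field)
        # Image-plane constraint: impose the target amplitude
        image_field = target_amp * np.exp(1j * np.angle(image_field))
    return np.angle(holo_field)   # the kinoform phase profile

target = np.zeros((256, 256))
target[96:160, 96:160] = 1.0                     # a bright square as target
kinoform = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * kinoform)))
print(recon.max() > recon.mean())                # energy concentrates in target
```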

Keywords: computer generated holography, digital holography, Gerchberg-Saxton algorithm, kinoform

Procedia PDF Downloads 528
24324 A Study in the Formation of a Term: Sahaba

Authors: Abdul Rahman Chamseddine

Abstract:

The Companions of the Prophet Muhammad, the Sahaba, are regarded as the first link between him and later believers who did not know him or learn from him directly. This makes the Sahaba a link in the chain between God and the ummah (community). Apart from their role in spreading the Prophet’s teachings, they came to be regarded as role models, representing the Islamic ideal of life as prescribed by the Prophet himself. According to Hadith, the Prophet had promised some Sahaba unqualified admission to paradise. It is commonly agreed that the Sahaba have the following attributes in common: God is well pleased with them; they will surely go to paradise; they are perfectly trustworthy; and they are the authorities from whom Muslims can learn all matters related to their religion. No other generation of Muslims has received the attention received by the Companions of the Prophet. In spite of the importance of the Sahaba in Islam, we still know comparatively little about them. There are at least two reasons for this. First, there is the overall scarcity of information surviving from the early period. At the death of the Prophet, it is said, there were more than 100,000 Companions. As we shall see, this is a complex issue, involving the definition of the term Sahaba. However, only a few Companions of the Prophet are known to us. Ibn Hajar al-‘Asqalani, who wrote in the fifteenth century A.D., was only able to collect facts about 11,000 of them (including those whose status as Sahaba was disputed). Ibn Sa‘d, Ibn ‘Abd al-Barr and Ibn al-Athir, all of whom lived earlier than Ibn Hajar, included in their respective works fewer lives of Sahaba than he did. If we consider Ibn Hajar’s Isaba the most complete biographical account of the Sahaba that remains available, we have information, presumably, on approximately one tenth of them; the remaining nine tenths are apparently lost from the historical record. Second, discussion of the Sahaba tends to focus on those considered the most important among them, such as ‘Uthman, ‘Ali and Mu‘awiya, while others, who together number in the thousands, are less well known. This paper will study the origins of the term Sahaba, which became exclusive to the Companions of the Prophet rather than a synonym for the word companions in general.

Keywords: companions, Hadith, Islamic history, Muhammad, Sahaba, transmission

Procedia PDF Downloads 411
24323 Geospatial Data Complexity in Electronic Airport Layout Plan

Authors: Shyam Parhi

Abstract:

The Airports GIS program collects airport data, validates and verifies it, and stores it in a specific database. Airports GIS allows authorized users to submit changes to airport data, and the verified data are used to develop several engineering applications. One of these applications is the electronic Airport Layout Plan (eALP), whose primary aim is to move from the paper to the digital form of the ALP. The first phase of development of the eALP was completed recently and was tested for a few pilot-program airports across different regions. We conducted a gap analysis and noticed that a lot of development work is needed to fine-tune at least six mandatory sheets of the eALP. It is important to note that a significant amount of programming is needed to move from out-of-the-box ArcGIS to a much-customized ArcGIS, which will be discussed. The ArcGIS viewer's capability to display essential features, such as a runway or taxiway and the perpendicular distance between them, will be discussed, and an enterprise-level workflow which incorporates the coordination process among different lines of business will be highlighted.
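
Outside ArcGIS, the kind of geometric check mentioned above (e.g., the minimum separation between runway and taxiway centerlines) reduces to a feature-distance query, sketched here with the Shapely library on invented coordinates:

```python
from shapely.geometry import LineString

# Simplified centerlines in a projected coordinate system (meters)
runway = LineString([(0, 0), (3000, 0)])
taxiway = LineString([(0, 150), (3000, 150)])

# Minimum separation between the two centerlines
print(runway.distance(taxiway))   # 150.0
```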

Keywords: geospatial data, geology, geographic information systems, aviation

Procedia PDF Downloads 414
24322 Anisotropic Total Fractional Order Variation Model in Seismic Data Denoising

Authors: Jianwei Ma, Diriba Gemechu

Abstract:

In seismic data processing, attenuation of random noise is the basic step in improving the quality of data for the further application of seismic data in exploration and development in the gas and oil industries. The signal-to-noise ratio of the data strongly determines the quality of seismic data; this factor affects the reliability as well as the accuracy of the seismic signal during interpretation for different purposes in different companies. To use seismic data for further application and interpretation, we need to improve the signal-to-noise ratio while attenuating random noise effectively. To improve the signal-to-noise ratio and attenuate seismic random noise while preserving important features and information about the seismic signal, we introduce an anisotropic total fractional-order denoising algorithm. The anisotropic total fractional-order variation model, defined in the space of fractional-order bounded variation, is proposed as a regularization in seismic denoising. The split Bregman algorithm is employed to solve the minimization problem of the anisotropic total fractional-order variation model, and the corresponding denoising algorithm for the proposed method is derived. We test the effectiveness of the proposed method on synthetic and real seismic data sets, and the denoised results are compared with F-X deconvolution and the non-local means denoising algorithm.
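
For orientation, the split Bregman iteration is sketched below for ordinary (integer-order) anisotropic TV denoising of a 2D section; the fractional-order derivatives of the proposed model are replaced by first-order differences, so this is a simplified stand-in, not the paper's algorithm:

```python
import numpy as np

def dx(u):  return np.roll(u, -1, 1) - u      # forward difference, periodic
def dy(u):  return np.roll(u, -1, 0) - u
def dxt(v): return np.roll(v, 1, 1) - v       # corresponding adjoints
def dyt(v): return np.roll(v, 1, 0) - v

def shrink(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def tv_denoise(f, mu=20.0, lam=10.0, n_iter=50):
    """Split Bregman for anisotropic TV: min |Dx u|_1 + |Dy u|_1 + mu/2 |u-f|^2."""
    ny, nx = f.shape
    wy = 2 - 2 * np.cos(2 * np.pi * np.arange(ny) / ny)
    wx = 2 - 2 * np.cos(2 * np.pi * np.arange(nx) / nx)
    denom = mu + lam * (wy[:, None] + wx[None, :])   # FFT eigenvalues of D^T D
    u = f.copy()
    dx_, dy_ = dx(u), dy(u)
    bx, by = np.zeros_like(f), np.zeros_like(f)
    for _ in range(n_iter):
        rhs = mu * f + lam * (dxt(dx_ - bx) + dyt(dy_ - by))
        u = np.real(np.fft.ifft2(np.fft.fft2(rhs) / denom))  # u-subproblem
        dx_, dy_ = shrink(dx(u) + bx, 1 / lam), shrink(dy(u) + by, 1 / lam)
        bx, by = bx + dx(u) - dx_, by + dy(u) - dy_          # Bregman update
    return u

section = np.zeros((128, 128))
section[60:68, :] = 1.0                        # synthetic flat reflector
noisy = section + 0.3 * np.random.default_rng(0).standard_normal(section.shape)
print(np.abs(tv_denoise(noisy) - section).mean()
      < np.abs(noisy - section).mean())        # True: error is reduced
```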

Keywords: anisotropic total fractional order variation, fractional order bounded variation, seismic random noise attenuation, split Bregman algorithm

Procedia PDF Downloads 205
24321 Noise Pollution in Nigerian Cities: Case Study of Bida, Nigeria

Authors: Funke Morenike Jiyah, Joshua Jiyah

Abstract:

The occurrence of various health issues has been linked to excessive noise pollution in all walks of life, as evidenced by many research efforts. This study provides an empirical analysis of the effects of noise pollution on the well-being of the residents of Bida Local Government Area, Niger State, Nigeria. The study adopted a case study research design involving a cross-sectional procedure. Field observations and medical reports were obtained to support the respondents' perceptions of the state of their well-being. The sample for the study was selected using the housing stock in the various wards: one major street in each ward was selected, a total of 1,833 buildings were counted along the sampled streets, and 10% of these were selected for the administration of a structured questionnaire. The environmental quality of the wards was determined by measuring noise levels using Testo 815 noise meters. The results revealed that Bariki ward, which houses the GRA, has the lowest noise level, at 37.8 dB(A), while the noise pollution levels recorded in the other thirteen wards were all above the recommended levels; the average ambient noise levels in sawmills, commercial centres, road junctions, and industrial areas were above 90 dB(A). Temporal records from the Federal Medical Centre, Bida revealed that, apart from malaria, hypertension (5,614 outpatients) was the most prevalent health issue in 2013 alone. The paper emphasises the need to consider compatibility in the choice of residential location, the use of ear mufflers, and the effective enforcement of zoning regulations.

Keywords: Bida, decibels, environmental quality, noise, well-being

Procedia PDF Downloads 130
24320 NSBS: Design of a Network Storage Backup System

Authors: Xinyan Zhang, Zhipeng Tan, Shan Fan

Abstract:

The first layer of defense against data loss is backup data. This paper implements an agent-based network backup system built on a tripartite construction of backup agent, storage server, and backup server, and realizes snapshots and hierarchical indexing in the NSBS. The system separates control commands from data flow and balances the system load, thereby improving the efficiency of system backup and recovery. The test results show that the agent-based network backup system can effectively improve task-based concurrency and reasonably allocate network bandwidth, that its backup performance overhead is smaller, and that it improves data recovery efficiency by 20%.

Keywords: agent, network backup system, tripartite architecture model, NSBS

Procedia PDF Downloads 455
24319 A t-SNE and UMAP Based Neural Network Image Classification Algorithm

Authors: Shelby Simpson, William Stanley, Namir Naba, Xiaodi Wang

Abstract:

Both t-SNE and UMAP are state-of-the-art tools that predominantly preserve local structure, that is, they group neighboring data points together, which provides a very informative visualization of the heterogeneity in the data. In this research, we develop a t-SNE- and UMAP-based neural network image classification algorithm that embeds the original dataset into a corresponding low-dimensional dataset as a preprocessing step, then uses this embedded dataset as the input to a specially designed neural network classifier for image classification. We use the Fashion-MNIST dataset, a labeled dataset of images of clothing objects, in our experiments. t-SNE and UMAP are used for dimensionality reduction of the dataset and thus produce low-dimensional embeddings, which are fed into two neural networks. The accuracy of the models from the two neural networks is then compared to that of a dense neural network that does not use an embedding as input, to show which model classifies the images of clothing objects more accurately.
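
A sketch of the UMAP branch of the pipeline, assuming the umap-learn package and using a small Fashion-MNIST subsample for speed; the embedding dimension and network sizes are illustrative:

```python
import umap
from sklearn.datasets import fetch_openml
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Fashion-MNIST (70000 x 784); a subsample keeps the sketch fast
X, y = fetch_openml("Fashion-MNIST", version=1, return_X_y=True, as_frame=False)
X, y = X[:10000] / 255.0, y[:10000]

# UMAP embedding as the dimensionality-reduction preprocessing step
emb = umap.UMAP(n_components=10, random_state=42).fit_transform(X)

# Feed the embedding into a simple neural network classifier
X_tr, X_te, y_tr, y_te = train_test_split(emb, y, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=200, random_state=0)
clf.fit(X_tr, y_tr)
print(f"embedded-input accuracy: {clf.score(X_te, y_te):.3f}")
```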

Keywords: t-SNE, UMAP, fashion MNIST, neural networks

Procedia PDF Downloads 193
24318 An Online Adaptive Thresholding Method to Classify Google Trends Data Anomalies for Investor Sentiment Analysis

Authors: Duygu Dere, Mert Ergeneci, Kaan Gokcesu

Abstract:

Google Trends data has gained increasing popularity in applications of behavioral finance, decision science, and risk management. Because of Google's wide range of use, the Trends statistics provide significant information about investor sentiment and intention, which can be used as decisive factors in corporate and risk management. However, an anomaly, i.e., a significant increase or decrease in a certain query, cannot be detected by state-of-the-art computational methods because of the random baseline noise of the Trends data, which is modelled as additive white Gaussian noise (AWGN). Since the baseline noise power changes gradually over time, an adaptive thresholding method is required to track and learn the baseline noise for correct classification. To this end, we introduce an online method to classify meaningful deviations in Google Trends data. Through extensive experiments, we demonstrate that our method can successfully classify various anomalies in plenty of different data.
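
A generic sketch of online adaptive thresholding, with an exponentially weighted estimate of the baseline mean and noise power and a k-sigma decision band; this illustrates the idea, not the authors' exact algorithm:

```python
import numpy as np

class OnlineAnomalyDetector:
    """EWMA tracker of baseline mean and variance that flags points
    outside an adaptive k-sigma band (generic sketch)."""

    def __init__(self, alpha=0.05, k=3.0, warmup=50):
        self.alpha, self.k, self.warmup = alpha, k, warmup
        self.mean, self.var, self.n = 0.0, 0.0, 0

    def update(self, x):
        self.n += 1
        if self.n == 1:
            self.mean = x
            return False
        d = x - self.mean
        anomaly = (self.n > self.warmup
                   and abs(d) > self.k * np.sqrt(self.var) + 1e-9)
        if not anomaly:          # learn the baseline from normal points only
            self.mean += self.alpha * d
            self.var = (1 - self.alpha) * (self.var + self.alpha * d * d)
        return anomaly

rng = np.random.default_rng(0)
series = rng.normal(50.0, 2.0, 500)
series[300] += 25.0                         # injected query-volume spike
det = OnlineAnomalyDetector()
flags = [i for i, v in enumerate(series) if det.update(v)]
print(flags)   # the spike at index 300 is flagged (plus rare 3-sigma outliers)
```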

Keywords: adaptive data processing, behavioral finance, convex optimization, online learning, soft minimum thresholding

Procedia PDF Downloads 162
24317 Prevalence and Risk Factors of Musculoskeletal Disorders among Physical Therapist's Seniors versus Internship Students

Authors: A. H. Bekhet, N. Helmy

Abstract:

Background: Physical therapists are knowledgeable in the treatment and prevention of musculoskeletal injuries; however, they suffer occupational musculoskeletal injuries themselves, because the physical therapy profession requires effort that may lead to work-related musculoskeletal disorders. No previous studies among physical therapists have been reported in Egypt. We aim to assess the prevalence and risk factors of musculoskeletal disorders among senior physical therapists versus internship students. Method: We conducted a cross-sectional study in the Faculty of Physical Therapy, Cairo University. Prevalence and risk factors of musculoskeletal injuries were assessed using a self-administered questionnaire with closed-ended questions. A senior therapist was defined as a physical therapist with more than 5 years of work experience. Data were analyzed using SPSS 22.0 for Windows. Results: The study included 106 physical therapists (junior = 72; senior = 34). The mean age of senior therapists was 30.1 (SD 6.3) years and that of junior therapists 22.8 (SD 2.4) years. Female subjects constituted 83.9% of the studied sample. The mean hours of patient contact were higher among junior therapists, at 6.4 (SD 2.6) vs. 5.7 (SD 2.1) among senior therapists. The prevalence of a musculoskeletal injury, once or more in their lifetime, was significantly higher among senior therapists (86% vs. 66.7%; p = 0.04). The risk factor most often reported to increase symptoms among junior therapists was maintaining a position for a prolonged period of time (28%), while performing manual therapy techniques was the highest risk factor among senior therapists (32%). 53% of senior therapists have limited their patient contact time as a result of their injury, compared to 25% of junior therapists (p = 0.09). Conclusion: The presented study shows that the prevalence of musculoskeletal injuries, once or more in a lifetime, is significantly higher among senior therapists.

Keywords: musculoskeletal injuries, occupational injuries, physical therapists, work related disorders

Procedia PDF Downloads 287
24316 Energy Efficient Assessment of Energy Internet Based on Data-Driven Fuzzy Integrated Cloud Evaluation Algorithm

Authors: Chuanbo Xu, Xinying Li, Gejirifu De, Yunna Wu

Abstract:

The Energy Internet (EI) is a new form of energy system that deeply integrates the Internet with the entire energy process, from production to consumption. The assessment of energy-efficiency performance is of vital importance for the long-term sustainable development of an EI project. Although the newly proposed fuzzy integrated cloud evaluation algorithm considers the randomness of uncertainty, it relies too much on the experience and knowledge of experts. Fortunately, the enrichment of EI data has enabled the utilization of data-driven methods. The main purpose of this work is therefore to assess the energy efficiency of a park-level EI by combining a data-driven method with the fuzzy integrated cloud evaluation algorithm. First, the indicators of energy efficiency are identified through a literature review. Second, an artificial neural network (ANN)-based data-driven method is employed to cluster the values of the indicators. Third, the energy efficiency of the EI project is calculated through the fuzzy integrated cloud evaluation algorithm. Finally, the applicability of the proposed method is demonstrated by a case study.
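
The building block of the cloud evaluation step is the forward normal cloud generator, sketched below; the indicator values and numerical characteristics (Ex, En, He) are invented for the example, and the paper's full fuzzy integrated evaluation and ANN clustering are beyond this sketch:

```python
import numpy as np

def forward_cloud(Ex, En, He, n=1000, rng=None):
    """Forward normal cloud generator: n drops (x, membership) for a
    qualitative concept with expectation Ex, entropy En, hyper-entropy He."""
    rng = rng or np.random.default_rng()
    En_ = rng.normal(En, He, n)              # randomized entropy per drop
    x = rng.normal(Ex, np.abs(En_))          # drop position
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_ ** 2 + 1e-12))  # membership degree
    return x, mu

# e.g. an "efficient" grade centered at score 0.8 on a 0-1 scale
x, mu = forward_cloud(Ex=0.8, En=0.05, He=0.01)
print(x.mean().round(3), mu.mean().round(3))
```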

Keywords: energy efficient, energy internet, data-driven, fuzzy integrated evaluation, cloud model

Procedia PDF Downloads 197
24315 The Role of Dynamic Ankle Foot Orthosis on Temporo-Spatial Parameters of Gait and Balance in Patients with Hereditary Spastic Paraparesis: Six-Months Follow Up

Authors: Suat Erel, Gozde Gur

Abstract:

Background: Recently, a supramalleolar type of dynamic ankle foot orthosis (DAFO) has been increasingly used to support all of the dynamic arches of the foot and to redistribute the pressure under the plantar surface of the foot in order to reduce muscle tone. A DAFO helps to maintain balance and postural control by providing stability and proprioceptive feedback in children with conditions such as cerebral palsy, muscular dystrophies, Down syndrome, and congenital hypotonia. Aim: The aim of this study was to investigate the role of a dynamic ankle foot orthosis (DAFO) on temporo-spatial parameters of gait and balance in three children with hereditary spastic paraparesis (HSP). Material and method: Three children with HSP, aged 13, 14, and 8 years, were included in the study. A DAFO was made for each child to correct weight bearing and to improve gait. Lower extremity spasticity (including the gastrocnemius, hamstring, and hip adductor muscles) was evaluated using the Modified Ashworth Scale (MAS) (0-5), along with the temporo-spatial gait parameters (walking speed, cadence, base of support, step length) and the Timed Up & Go test (TUG). All gait assessments were compared between the DAFO-plus-shoes and shoes-only conditions, and after a six-month follow-up period the assessments were repeated by the same physical therapist. Results: MAS scores for the lower extremity were 2-3 for the first child, 0-2 for the second child, and 1-2 for the third child. TUG scores (s) in the shoes-only condition decreased from 20.2 to 18 for case one, from 9.4 to 9 for case two, and from 12.4 to 12 for case three, and in the DAFO-plus-shoes condition from 15.2 to 14 for case one, from 7.2 to 7.1 for case two, and from 10 to 7.3 for case three. Gait speed (m/s) with shoes only was similar, but with DAFO and shoes it increased from 0.4 to 0.5 for case one, from 1.5 to 1.6 for case two, and from 1.0 to 1.2 for case three. Base of support (cm) with shoes only decreased from 18.5 to 14 for case one and from 13 to 12 for case three, and remained 11 for case two; with DAFO and shoes it decreased from 10 to 9 for case one and from 11.5 to 10 for case three, and remained 8 for case two. Conclusion: The use of a DAFO in patients with HSP normalized the temporo-spatial gait parameters and improved balance. Walking speed, a gold standard for evaluating gait quality, increased with the DAFO in these three children with HSP, and the better TUG scores show that functional ambulation improved. The reduction in base of support and the more symmetrical step lengths with the DAFO indicate better balance. These encouraging results warrant further study in wider series.

Keywords: dynamic ankle foot orthosis, gait, hereditary spastic paraparesis, balance in patients

Procedia PDF Downloads 351
24314 Does Clinical Guidelines Affect Healthcare Quality and Populational Health: Quebec Colorectal Cancer Screening Program

Authors: Nizar Ghali, Bernard Fortin, Guy Lacroix

Abstract:

In Quebec, colonoscopy volumes have continued to rise in recent years in the absence of an effective mechanism for monitoring the appropriateness and quality of these exams. In November 2010, the Quebec government introduced the colorectal cancer screening program with the objective of controlling volume and cost imperfections. The program is based on clinical standards and was initiated for a first group of institutions; one year later, the government added financial incentives for participating institutions. In this analysis, we assess the causal effect of the two components of this program, clinical pathways and financial incentives. In particular, we assess the reform's effect on healthcare quality and population health in a context where medical remuneration does not depend directly on the additional funding offered by the program. We have data on admission episodes and deaths for 8 years. We use a multistate model, analogous to a difference-in-differences approach, to estimate the reform's effect on the transition probabilities between different states for each patient. Our results show that the reform reduced length of stay without deterioration in hospital mortality or the readmission rate. On the other hand, the program contributed to a decrease in the hospitalization rate and to a less invasive treatment approach for colorectal surgeries. This is a sign of improvement in healthcare quality and population health. We demonstrate in this analysis that physicians' behavior can be affected by both clinical standards and financial incentives, even when these are offered to facilities.
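
For readers unfamiliar with the identification strategy, a plain two-period difference-in-differences regression on synthetic admissions data is sketched below; the paper's multistate transition model is richer than this illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic admissions data: treated institutions after the reform get
# shorter stays; the interaction term is the causal effect of interest
rng = np.random.default_rng(0)
n = 4000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # institution in the program
    "post": rng.integers(0, 2, n),      # episode after the reform
})
df["los"] = (6 + 0.5 * df.treated - 0.3 * df.post
             - 1.2 * df.treated * df.post + rng.normal(0, 1, n))

m = smf.ols("los ~ treated * post", data=df).fit()
print(m.params["treated:post"].round(2))   # approx -1.2, the reform effect
```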

Keywords: multi-state and multi-episode transition model, healthcare quality, length of stay, transition probability, difference in difference

Procedia PDF Downloads 213