Search results for: Data Assimilation
5846 Fuzzy Fingerprint Vault using Multiple Polynomials
Authors: Daesung Moon, Woo-Yong Choi, Kiyoung Moon
Abstract:
Fuzzy fingerprint vault is a recently developed cryptographic construct based on the polynomial reconstruction problem to secure critical data with fingerprint data. However, previous research is not applicable to fingerprints having few minutiae, since it uses a fixed polynomial degree without considering the number of fingerprint minutiae. To solve this problem, we use an adaptive polynomial degree that takes the number of minutiae extracted from each user into account. We also apply multiple polynomials to avoid the possible degradation of security that a simple solution (i.e., using a low-degree polynomial) would incur. Based on the experimental results, our method makes a possible attack about 2^192 times more difficult than using a low-degree polynomial, while still verifying users having few minutiae.
Keywords: Fuzzy vault, fingerprint recognition, multiple polynomials.
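For illustration, a minimal sketch of the vault-locking step with an adaptive polynomial degree is given below. The field size, chaff count, and degree rule are illustrative assumptions, not the authors' parameters, and the full scheme's use of multiple polynomials is not reproduced.

```python
import random

def adaptive_degree(n_minutiae, margin=4, max_degree=12):
    """Pick the polynomial degree from the number of minutiae (assumed rule)."""
    return max(1, min(max_degree, n_minutiae - margin))

def lock_vault(minutiae_x, secret_coeffs, field=65537, n_chaff=200):
    """Lock a secret polynomial with minutiae points (illustrative sketch).
    Genuine points are (x, p(x)); chaff points deliberately lie off the polynomial."""
    def poly(x):
        return sum(c * pow(x, i, field) for i, c in enumerate(secret_coeffs)) % field
    genuine = [(x % field, poly(x % field)) for x in minutiae_x]
    chaff = []
    while len(chaff) < n_chaff:
        x, y = random.randrange(field), random.randrange(field)
        if y != poly(x):
            chaff.append((x, y))
    vault = genuine + chaff
    random.shuffle(vault)        # hide which points are genuine
    return vault
```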
5845 An Investigative Study into Observer based Non-Invasive Fault Detection and Diagnosis in Induction Motors
Authors: Padmakumar S., Vivek Agarwal, Kallol Roy
Abstract:
A new observer-based fault detection and diagnosis scheme for predicting induction motor faults is proposed in this paper. Prediction of incipient faults using different variants of the Kalman filter is carried out, and their relative performance is evaluated. Only soft faults are considered for this work. Data generation, filter convergence issues, hypothesis testing and residue estimates are addressed. A Simulink model is used for data generation, and various types of faults are considered. A comparative assessment of the estimates of the different observers associated with these faults is included.
Keywords: Extended Kalman Filter, Fault detection and diagnosis, Induction motor model, Unscented Kalman Filter.
5844 Data-driven Multiscale Tsallis Complexity: Application to EEG Analysis
Authors: Young-Seok Choi
Abstract:
This work proposes data-driven, multiscale-based quantitative measures to reveal the underlying complexity of the electroencephalogram (EEG), applied to a rodent model of hypoxic-ischemic brain injury and recovery. Motivated by the fact that real EEG recordings are nonlinear and non-stationary over different frequencies or scales, an approach more suitable than conventional single-scale tools is needed for analyzing EEG data. Here, we present a new framework of complexity measures that considers changing dynamics over multiple oscillatory scales. The proposed multiscale complexity is obtained by calculating entropies of the probability distributions of the intrinsic mode functions extracted by empirical mode decomposition (EMD) of the EEG. To quantify EEG recordings from a rat model of hypoxic-ischemic brain injury following cardiac arrest, the multiscale version of the Tsallis entropy is examined. To validate the proposed complexity measure, actual EEG recordings from rats (n=9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Experimental results demonstrate that the multiscale Tsallis entropy leads to better discrimination of injury levels and improved correlations with the neurological deficit evaluation 72 hours after cardiac arrest, suggesting an effective metric as a prognostic tool.
Keywords: Electroencephalogram (EEG), multiscale complexity, empirical mode decomposition, Tsallis entropy.
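A minimal sketch of the multiscale Tsallis entropy described above: each intrinsic mode function (IMF) is reduced to a probability distribution (here an amplitude histogram, an assumption, since the abstract does not state the estimator), and the Tsallis entropy S_q = (1 - Σ p_i^q)/(q - 1) is computed per scale. The IMFs themselves would come from an EMD implementation such as the PyEMD package.

```python
import numpy as np

def tsallis_entropy(p, q=2.0):
    """Tsallis entropy S_q = (1 - sum(p^q)) / (q - 1) of a probability vector."""
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def multiscale_tsallis(imfs, q=2.0, bins=64):
    """One Tsallis entropy per IMF (scale); imfs has shape (n_imfs, n_samples)."""
    values = []
    for imf in imfs:
        hist, _ = np.histogram(imf, bins=bins)   # amplitude histogram as probability estimate
        p = hist / hist.sum()
        values.append(tsallis_entropy(p, q))
    return np.array(values)
```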
5843 Sequential Partitioning Brainbow Image Segmentation Using Bayesian
Authors: Yayun Hsu, Henry Horng-Shing Lu
Abstract:
This paper proposes a data-driven, biology-inspired neural segmentation method for 3D Drosophila Brainbow images. We use the Bayesian Sequential Partitioning algorithm for probabilistic modeling, which can be used to detect somas and to eliminate crosstalk effects. This work attempts to develop an automatic methodology for neuron image segmentation, which still lacks a complete solution due to the complexity of the images. The proposed method does not need any predetermined, risk-prone thresholds, since biological information is inherently included in the image processing procedure. Therefore, it is less sensitive to variations in neuron morphology; meanwhile, its flexibility is beneficial for tracing the intertwining structure of neurons.
Keywords: Brainbow, 3D imaging, image segmentation, neuron morphology, biological data mining, non-parametric learning.
5842 Nonlinear Multivariable Analysis of CO2 Emissions in China
Authors: Hsiao-Tien Pao, Yi-Ying Li, Hsin-Chia Fu
Abstract:
This paper addresses the impacts of energy consumption, economic growth, financial development, and population size on environmental degradation using grey relational analysis (GRA) for China, where foreign direct investment (FDI) inflow is the proxy variable for financial development. The more recent historical data for the period 2004–2011 are used, because very old data may not be suitable for analyzing rapidly developing countries. The results of the GRA indicate that the linkage effects of energy consumption–emissions and GDP–emissions are ranked first and second, respectively. This reveals that energy consumption and economic growth are strongly correlated with emissions. Higher economic growth requires more energy consumption and increases environmental pollution. Likewise, more efficient energy use requires a higher level of economic development. Therefore, policies that improve energy efficiency and create a low-carbon economy can reduce emissions without hurting economic growth. The FDI–emissions linkage is ranked third, which indicates that China does not apply weak environmental regulations to attract inward FDI; furthermore, China's government should strengthen environmental policy when attracting inward FDI. The population–emissions linkage effect is ranked fourth, implying that population size does not directly affect CO2 emissions, even though China has the world's largest population and Chinese people are very economical in their use of energy-related products. Overall, energy conservation, efficiency improvement, demand management, and financial development, which aim at curtailing the waste of energy and reducing both energy consumption and emissions without loss of the country's competitiveness, can be adopted by developing economies. GRA is one of the better ways to build a dynamic analysis model from limited data.
Keywords: Grey relational analysis, foreign direct investment, CO2 emissions, China.
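The grey relational grade underlying the ranking above can be sketched as follows; the min-max normalization and the distinguishing coefficient ζ = 0.5 are conventional choices, not values reported in the abstract.

```python
import numpy as np

def grey_relational_grade(reference, factor, zeta=0.5):
    """Grey relational grade between a reference series (e.g. CO2 emissions)
    and a comparison series (e.g. energy consumption)."""
    norm = lambda x: (x - x.min()) / (x.max() - x.min())      # min-max normalize to [0, 1]
    x0, xi = norm(np.asarray(reference, float)), norm(np.asarray(factor, float))
    delta = np.abs(x0 - xi)
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + zeta * dmax) / (delta + zeta * dmax)       # grey relational coefficients
    return coeff.mean()                                        # grade = mean coefficient
```

Factors would then be ranked by their grade against the emissions series, as in the linkage ordering reported above.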
5841 Hotel Design and Energy Consumption
Authors: Bin Su
Abstract:
A hotel mainly uses its energy for water heating, space heating, refrigeration, space cooling, cooking, lighting and other building services. A number of 4–5 star hotels in Auckland city were selected for this study. Compared with the energy used for other purposes, the energy used for internal space thermal control (e.g., internal space heating) is more closely related to the hotel building itself. This study not only investigates the relationships between annual energy (and winter energy) consumption and building design data, but also the relationships between winter extra energy consumption and building design data. The aim is to identify the major design factors that significantly impact hotel energy consumption, in order to improve future hotel design for energy efficiency.
Keywords: Hotel building design, building energy, building passive design, energy efficiency.
5840 Peak Data Rate Enhancement Using Switched Micro-Macro Diversity in Cellular Multiple-Input-Multiple-Output Systems
Authors: Jihad S. Daba, J. P. Dubois, Yvette Antar
Abstract:
With the exponential growth of cellular users, a new generation of cellular networks is needed to enhance the required peak data rates. The co-channel interference between neighboring base stations inhibits peak data rate increase. To overcome this interference, multi-cell cooperation known as coordinated multipoint transmission is proposed. Such a solution makes use of multiple-input-multiple-output (MIMO) systems under two different structures: Micro- and macro-diversity. In this paper, we study the capacity and bit error rate in cellular networks using MIMO technology. We analyse both micro- and macro-diversity schemes and develop a hybrid model that switches between macro- and micro-diversity in the case of hard handoff based on a cut-off range of signal-to-noise ratio values. We conclude that our hybrid switched micro-macro MIMO system outperforms classical MIMO systems at the cost of increased hardware and software complexity.
Keywords: Cooperative multipoint transmission, ergodic capacity, hard handoff, macro-diversity, micro-diversity, multiple-input-multiple-output systems, MIMO, orthogonal frequency division multiplexing, OFDM.
5839 Resettlement and Livelihood Sustainability in Sub-Saharan Africa: The Case of Bui Hydro-Power Dam Project, Ghana
Authors: Francis Z. Naab, Abraham M. Nunbogu, Romanus D. Dinye, Alfred Dongzagla
Abstract:
The study assesses the effectiveness of the Bui Dam resettlement scheme in the Tain and Bole districts of Ghana. The study adopted a mixed approach in its data collection and analysis. Of the eight communities affected by the Bui hydropower project, and thus requiring resettlement, four were purposively selected for primary data collection. Primary data were gathered through questionnaires administered to 157 heads of resettled households, focus group discussions with men and women, and in-depth interviews with key informants. The findings indicated that the affected people had been sufficiently consulted at all levels of their resettlement. In particular, the Ghana Dams Dialogue, which served as a liaison entity between the government and the resettled communities, came in for praise for its usefulness. Many tangible policies were put in place to address the socio-cultural differences of traditional authorities. The Bui Dam Authority also rigorously followed national and international laws and protocols in the design and implementation of the resettlement scheme. In assessing the effectiveness of the scheme, it was clear that the compensation in the form of infrastructural development was greatly appreciated, but much more would have to be done to satisfy livelihood empowerment requirements. It was recommended that candid efforts be made to restore the lost identities of the resettled communities, and that more dialogue be encouraged among communities living together.
Keywords: Resettlement, livelihood, hydro-power project, Bui Dam, Ghana.
5838 A Study on Method for Identifying Capacity Factor Declination of Wind Turbines
Authors: Dongheon Shin, Kyungnam Ko, Jongchul Huh
Abstract:
An investigation of wind turbine degradation was carried out using nacelle wind data. The three Vestas V80-2MW wind turbines of the Sungsan wind farm on Jeju Island, South Korea were selected for this work. Five years of SCADA data from the wind farm were analyzed to draw the power curve of each turbine. Assuming a Rayleigh wind-speed distribution, the normalized capacity factor was calculated from the power curve drawn for each of the three turbines for each year. The results show that the power output of the three wind turbines decreased every year, and the normalized capacity factor declined by 0.12%/year on average.
Keywords: Wind energy, Power curve, Capacity factor, Annual energy production.
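A sketch of the normalized-capacity-factor computation implied above: the yearly power curve is integrated against a Rayleigh wind-speed density and divided by rated power. The mean wind speed, rated power and tabulated power curve are placeholders, not the values used in the study.

```python
import numpy as np

def rayleigh_pdf(v, v_mean):
    """Rayleigh wind-speed density with mean speed v_mean [m/s]."""
    return (np.pi * v / (2 * v_mean**2)) * np.exp(-np.pi * v**2 / (4 * v_mean**2))

def capacity_factor(power_curve_v, power_curve_p, v_mean, rated_power=2000.0):
    """Capacity factor = expected power under the Rayleigh distribution / rated power."""
    v = np.linspace(0.0, 30.0, 601)                       # wind-speed grid [m/s]
    p = np.interp(v, power_curve_v, power_curve_p)        # interpolated power curve [kW]
    expected_power = np.sum(p * rayleigh_pdf(v, v_mean)) * (v[1] - v[0])   # Riemann sum
    return expected_power / rated_power
```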
5837 Characteristics of Hydraulic Jump
Authors: Sumit Gandhi
Abstract:
The effect of an abruptly expanding channel on the main characteristics of the hydraulic jump is considered experimentally. The present study was made for supercritical flow with Froude numbers varying between 2 and 9 and approach-to-expanded channel width ratios of 0.4, 0.5, 0.6 and 0.8. Physical explanations of the variation of these characteristics under varying flow conditions are discussed based on observations drawn from the experimental results. The analytical equations for the sequent depth ratio in an abruptly expanding channel given by eminent hydraulic engineers agree well with the experimental data for all expansion ratios, and the empirical relation was also verified against the present experimental data.
Keywords: Abruptly Expanding Channel, Hydraulic Jump, Efficiency, Sequent Depth Ratio.
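For reference, the classical sequent-depth relation for a hydraulic jump in a prismatic rectangular channel (the Bélanger equation) is sketched below; the abruptly-expanding-channel relations verified in the paper modify this with the expansion ratio and are not reproduced here.

```python
import math

def sequent_depth_ratio(froude1):
    """Belanger equation: y2/y1 = 0.5 * (sqrt(1 + 8*Fr1^2) - 1), valid for a
    prismatic rectangular channel (classical case, not the expanding channel)."""
    return 0.5 * (math.sqrt(1.0 + 8.0 * froude1**2) - 1.0)

# e.g. for the Froude-number range 2-9 studied above:
for fr in (2, 5, 9):
    print(fr, round(sequent_depth_ratio(fr), 2))
```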
5836 Frequency and Amplitude Measurement of a Vibrating Object in Water Using Ultrasonic Speckle Technique
Authors: Hongmao Zhu, Jun Chu, Lei Shen, Zhihua Luo
Abstract:
The principle of frequency and amplitude measurement of a vibrating object in water using the ultrasonic speckle technique is presented in this paper. Compared with other traditional techniques, the ultrasonic speckle technique can be applied to non-contact vibration measurement of a non-metal object with a rough surface in water. The relationship between speckle movement and object movement was analyzed. Based on this study, an ultrasonic speckle measurement system was set up. With this system the frequency and amplitude of an underwater vibrating cantilever beam were detected. The results show that the experimental data are in good agreement with the calibration data.
Keywords: Frequency, Amplitude, Vibration measurement, Ultrasonic speckle
5835 Straight Line Defect Detection with Feed Forward Neural Network
Authors: S. Liangwongsan, A. Oonsivilai
Abstract:
Nowadays, the hard disk is one of the most popular storage components. In the hard disk industry, a hard disk drive must pass various complex processes and test systems. In each step, there are some failures. To reduce waste from these failures, we must find their root causes. Conventional data analysis methods are not effective enough to analyze the large volume of data. In this paper, we propose the Hough method for straight line detection, which helps to detect straight-line defect patterns that occur in hard disk drives. The proposed method helps to increase speed and accuracy in failure analysis.
Keywords: Hough Transform, Failure Analysis, Media, Hard Disk Drive
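A minimal sketch of Hough-based straight-line detection as described above, using OpenCV; the edge-detection and vote thresholds are illustrative, and the defect-map input is a placeholder.

```python
import cv2
import numpy as np

def detect_line_defects(defect_map, votes=100):
    """Return (rho, theta) pairs for straight-line patterns in a defect image (uint8)."""
    edges = cv2.Canny(defect_map, 50, 150)                  # edge map of the defect image
    lines = cv2.HoughLines(edges, 1, np.pi / 180, votes)    # standard Hough transform
    return [] if lines is None else [tuple(l[0]) for l in lines]
```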
5834 Turbine Trip without Bypass Analysis of Kuosheng Nuclear Power Plant Using TRACE Coupling with FRAPTRAN
Authors: J. R. Wang, H. T. Lin, H. C. Chang, W. K. Lin, W. Y. Li, C. Shih
Abstract:
This analysis of the Kuosheng nuclear power plant (NPP) was performed mainly with TRACE, assisted by FRAPTRAN and FRAPCON. SNAP v2.2.1 and TRACE v5.0p3 were used to develop the Kuosheng NPP SPU TRACE model, which can simulate the turbine trip without bypass transient. From the TRACE analysis, important parameters such as dome pressure, coolant temperature and pressure can be determined. By comparing these parameters with the criteria formulated by the United States Nuclear Regulatory Commission (U.S. NRC), we can determine whether the Kuosheng nuclear power plant fails in the accident analysis. However, the fuel rod status cannot be determined from the TRACE data alone. With the information from TRACE and the burn-up analysis obtained from FRAPCON, FRAPTRAN analyzes the fuel rods in this transient in more detail. In addition, through the SNAP interface, the results can be presented as an animation in which the TRACE and FRAPTRAN data are merged, making them easier for readers to grasp. In this research, TRACE showed that the maximum dome pressure of the reactor reaches 8.32 MPa, which is lower than the acceptance limit of 9.58 MPa. Furthermore, FRAPTRAN reveals that the maximum strain is about 0.00165, which is below the criterion of 0.01. In addition, the cladding enthalpy is 52.44 cal/g, which is lower than the 170 cal/g specified by the U.S. NRC NUREG-0800 Standard Review Plan.
Keywords: Turbine trip without bypass, Kuosheng NPP, TRACE, FRAPTRAN, SNAP animation.
5833 Determining the Direction of Causality between Creating Innovation and Technology Market
Authors: Liubov Evstigneeva
Abstract:
In this paper an attempt is made to establish causal nexuses between innovation and international trade in Russia. The topicality of this issue is determined by the necessity of choosing policy instruments for economic modernization and the transition to innovative development. The vector autoregression (VAR) model and the Granger test are applied to Russian monthly data from 2005 until the second quarter of 2015. Both lagged import and export at the national level cause innovation, while the latter starts to stimulate foreign trade only at a remote lag. In comparison to the aggregate data, the results by patent category are more diverse. Importing technologies from foreign countries stimulates patent activity, while innovations created in Russia are a Granger cause only for imports to the Commonwealth of Independent States.
Keywords: Export, import, innovation, patents.
5832 Effect of Clustering on Energy Efficiency and Network Lifetime in Wireless Sensor Networks
Authors: Prakash G L, Chaitra K Meti, Poojitha K, Divya R.K.
Abstract:
A wireless sensor network is a multi-hop, self-configuring wireless network consisting of sensor nodes. The deployment of wireless sensor networks in many application areas, e.g., aggregation services, requires self-organization of the network nodes into clusters. An efficient way to enhance the lifetime of the system is to partition the network into distinct clusters with a high-energy node as cluster head. Different node clustering techniques have appeared in the literature, and they roughly fall into two families: those based on the construction of a dominating set and those based solely on energy considerations. Energy-optimized cluster formation for a set of randomly scattered wireless sensors is presented. Sensors within a cluster are expected to communicate with the cluster head only. The energy constraint and limited computing resources of the sensor nodes present the major challenges in gathering the data. In this paper we propose a framework to study how partially correlated data affect the performance of clustering algorithms. The total energy consumption and network lifetime can be analyzed by combining random geometry techniques and rate distortion theory. We also present the relation between compression distortion and data correlation.
Keywords: Clusters, multi hop, random geometry, rate distortion.
5831 Designing Ontology-Based Knowledge Integration for Preprocessing of Medical Data in Enhancing a Machine Learning System for Coding Assignment of a Multi-Label Medical Text
Authors: Phanu Waraporn
Abstract:
This paper discusses the design of knowledge integration of clinical information extracted from distributed medical ontologies in order to improve a machine learning-based multi-label coding assignment system. The proposed approach is implemented using a decision tree technique on university hospital data for patients with Coronary Heart Disease (CHD). The preliminary results show the satisfactory finding that the use of medical ontologies improves overall system performance.
Keywords: Medical Ontology, Knowledge Integration, Machine Learning, Medical Coding, Text Assignment.
5830 Perspectives on Neuropsychological Testimony
Authors: Valene J. Gresham, MA, Laura A. Brodie
Abstract:
Over the last decade, statistics have shown that traumatic brain injury (TBI) is a growing concern in our legal system. In an effort to obtain data regarding the influence of neuropsychological expert witness testimony in a criminal case, this study tested three hypotheses. H1: The majority of jurors will vote not guilty, due to mild head injury. H2: The jurors will give more credence to the testimony of the neuropsychologist than to that of the psychiatrist. H3: The jurors will be more lenient in their sentencing, given the neuropsychologist's testimony. The criteria for inclusion as a participant in the study are identical to those used for eligibility for jury duty in the United States. A chi-squared test was performed to analyze the data for the three hypotheses. The results supported all of the hypotheses; however, statistical significance was seen in H1 and H2 only.
Keywords: Expert witness, jury decision, neuropsychology, traumatic brain injury.
5829 Computation of Flood and Drought Years over the North-West Himalayan Region Using Indian Meteorological Department Rainfall Data
Authors: Sudip Kumar Kundu, Charu Singh
Abstract:
The climatic conditions over the Indian region are highly dependent on the monsoon. India receives the maximum amount of rainfall during the southwest monsoon. The Indian economy is highly dependent on agriculture. The occurrence of flood and drought years influences the whole cultivation system as well as the economy of the country, since Indian agriculture is still highly dependent on the monsoon rainfall. The present study investigates the flood and drought years for the north-west Himalayan region from 1951 to 2014 by using area-averaged Indian Meteorological Department (IMD) rainfall data. For this investigation the normalized index (NI) has been utilized to determine whether a particular year is a drought or flood year. The data have been extracted for the north-west Himalayan (NWH) region states, namely Uttarakhand (UK), Himachal Pradesh (HP) and Jammu and Kashmir (J&K), to find the rainy-season average rainfall for each year, the climatological mean and the standard deviation. The results are plotted in diagrams showing that some of the years are drought years, some are flood years and the rest are neutral. The flood and drought years can also be related to the large-scale phenomena El Niño and La Niña.
Keywords: Indian Meteorological Department, Rainfall, Normalized index, Flood, Drought, NWH.
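A sketch of the normalized-index classification described above: NI is the rainy-season rainfall anomaly divided by its standard deviation. The +1/-1 cut-offs used here to label flood and drought years are a common convention and an assumption, since the abstract does not state the thresholds.

```python
import numpy as np

def classify_years(years, seasonal_rainfall, threshold=1.0):
    """Label each year flood / drought / neutral from the normalized index
    NI = (R_i - R_mean) / sigma of rainy-season rainfall."""
    r = np.asarray(seasonal_rainfall, float)
    ni = (r - r.mean()) / r.std()
    labels = np.where(ni > threshold, "flood",
             np.where(ni < -threshold, "drought", "neutral"))
    return dict(zip(years, zip(ni.round(2), labels)))
```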
5828 Analyzing Keyword Networks for the Identification of Correlated Research Topics
Authors: Thiago M. R. Dias, Patrícia M. Dias, Gray F. Moita
Abstract:
The production and publication of scientific works have increased significantly in recent years, with the Internet being the main means of access and distribution of these works. Faced with this, there is a growing interest in understanding how scientific research has evolved, in order to explore this knowledge and encourage research groups to become more productive. Therefore, the objective of this work is to explore repositories containing data from scientific publications and to characterize the keyword networks of these publications, in order to identify the most relevant keywords and to highlight those that have the greatest impact on the network. To do this, the keywords of each article in the study repository are extracted and the network is characterized, after which several social network analysis metrics are applied for the identification of the highlighted keywords.
Keywords: Extraction and data integration, bibliometrics, scientometrics.
5827 Optimizing of Fuzzy C-Means Clustering Algorithm Using GA
Authors: Mohanad Alata, Mohammad Molhim, Abdullah Ramini
Abstract:
The fuzzy C-means clustering algorithm (FCM) is a method that is frequently used in pattern recognition. It has the advantage of giving good modeling results in many cases, although it is not capable of specifying the number of clusters by itself. In the FCM algorithm, most researchers fix the weighting exponent (m) to a conventional value of 2, which might not be appropriate for all applications. Consequently, the main objective of this paper is to use the subtractive clustering algorithm to provide the optimal number of clusters needed by the FCM algorithm, by optimizing the parameters of the subtractive clustering algorithm with an iterative search approach, and then to find an optimal weighting exponent (m) for the FCM algorithm. In order to obtain an optimal number of clusters, the iterative search approach is used to find the optimal single-output Sugeno-type Fuzzy Inference System (FIS) model by optimizing the parameters of the subtractive clustering algorithm that give the minimum least-squares error between the actual data and the Sugeno fuzzy model. Once the number of clusters is optimized, two approaches are proposed to optimize the weighting exponent (m) in the FCM algorithm, namely, the iterative search approach and genetic algorithms. The above-mentioned approach is tested on data generated from the original function, and optimal fuzzy models are obtained with minimum error between the real data and the obtained fuzzy models.
Keywords: Fuzzy clustering, Fuzzy C-Means, Genetic Algorithm, Sugeno fuzzy systems.
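To make the role of the weighting exponent m concrete, a minimal sketch of one fuzzy C-means update (membership and centroid steps) is given below; it is not the subtractive-clustering or GA optimization itself, only the FCM core whose parameter m those procedures tune.

```python
import numpy as np

def fcm_step(data, centers, m=2.0):
    """One FCM iteration: update memberships, then centroids.
    data: (n, d) array; centers: (c, d) array; m > 1 is the weighting exponent."""
    dist = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2) + 1e-10
    # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
    ratio = (dist[:, :, None] / dist[:, None, :]) ** (2.0 / (m - 1.0))
    u = 1.0 / ratio.sum(axis=2)
    um = u ** m
    new_centers = (um.T @ data) / um.sum(axis=0)[:, None]
    objective = np.sum(um * dist ** 2)          # quantity the search over m seeks to reduce
    return u, new_centers, objective
```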
5826 Computational Fluid Dynamics Modeling of Downward Bubbly Flows
Authors: Mahmood Reza Rahimi, Hajir Karimi
Abstract:
Downward turbulent bubbly flows in pipes were modeled using computational fluid dynamics tools. The hydrodynamics, phase distribution and turbulent structure of two-phase air-water flow in a vertical pipe of 57.15 mm diameter and 3.06 m length were modeled using the 3-D Eulerian-Eulerian multiphase flow approach. Void fraction, liquid velocity and turbulent fluctuation profiles were calculated and compared against experimental data. The CFD results are in good agreement with the experimental data.
Keywords: CFD, Bubbly flow, Vertical pipe, Population balance modeling, Gas void fraction, Liquid velocity, Normal turbulent stresses.
5825 Data Mining to Capture User-Experience: A Case Study in Notebook Product Appearance Design
Authors: Rhoann Kerh, Chen-Fu Chien, Kuo-Yi Lin
Abstract:
In the era of a rapidly growing notebook market, consumer electronics manufacturers face a highly dynamic and competitive environment. In particular, the product appearance is the first element by which users distinguish a product from the products of other brands. A notebook product should differ in its appearance to engage users and contribute to the user experience (UX). UX evaluation examines various product concepts to find the design that meets user needs; in addition, it helps the designer to further understand the product appearance preferences of different market segments. However, few studies have explored the relationship between consumer background and the reaction to product appearance. This study proposes a data mining framework to capture the users' information and the important relations between product appearance factors. The proposed framework consists of problem definition and structuring, data preparation, rule generation, and results evaluation and interpretation. An empirical study was conducted in Taiwan, recruiting 168 subjects from different backgrounds to experience the appearance of 11 different portable computers. The results assist designers in developing product strategies based on the characteristics of consumers and the product concepts related to the UX, which helps to launch products to the right customers and increase market share. The results have shown the practical feasibility of the proposed framework.
Keywords: Consumers Decision Making, Product Design, Rough Set Theory, User Experience.
5824 Evaluation of the Role of Advocacy and the Quality of Care in Reducing Health Inequalities for People with Autism, Intellectual and Developmental Disabilities at Sheffield Teaching Hospitals
Authors: Jonathan Sahu, Jill Aylott
Abstract:
Individuals with Autism, Intellectual and Developmental disabilities (AIDD) are one of the most vulnerable groups in society, hampered not only by their own limitations to understand and interact with the wider society, but also societal limitations in perception and understanding. Communication to express their needs and wishes is fundamental to enable such individuals to live and prosper in society. This research project was designed as an organisational case study, in a large secondary health care hospital within the National Health Service (NHS), to assess the quality of care provided to people with AIDD and to review the role of advocacy to reduce health inequalities in these individuals. Methods: The research methodology adopted was as an "insider researcher". Data collection included both quantitative and qualitative data i.e. a mixed method approach. A semi-structured interview schedule was designed and used to obtain qualitative and quantitative primary data from a wide range of interdisciplinary frontline health care workers to assess their understanding and awareness of systems, processes and evidence based practice to offer a quality service to people with AIDD. Secondary data were obtained from sources within the organisation, in keeping with "Case Study" as a primary method, and organisational performance data were then compared against national benchmarking standards. Further data sources were accessed to help evaluate the effectiveness of different types of advocacy that were present in the organisation. This was gauged by measures of user and carer experience in the form of retrospective survey analysis, incidents and complaints. Results: Secondary data demonstrate near compliance of the Organisation with the current national benchmarking standard (Monitor Compliance Framework). However, primary data demonstrate poor knowledge of the Mental Capacity Act 2005, poor knowledge of organisational systems, processes and evidence based practice applied for people with AIDD. In addition there was poor knowledge and awareness of frontline health care workers of advocacy and advocacy schemes for this group. Conclusions: A significant amount of work needs to be undertaken to improve the quality of care delivered to individuals with AIDD. An operational strategy promoting the widespread dissemination of information may not be the best approach to deliver quality care and optimal patient experience and patient advocacy. In addition, a more robust set of standards, with appropriate metrics, needs to be developed to assess organisational performance which will stand the test of professional and public scrutiny.
Keywords: Autism, intellectual developmental disabilities, advocacy, health inequalities, quality of care.
5823 Impacts of Financial Development and Operating Scale on Bank Efficiencies in Taiwan
Authors: Ying-Hsiu Chen, Pao-Peng Hsu
Abstract:
This paper adopts a two-stage data envelopment analysis to explore the impacts of financial development and bank operating scale on bank efficiencies. The sample comprises unbalanced panel data on 32 Taiwanese listed domestic commercial banks over the period 1998 to 2013. Empirical results show that pure technical efficiency is positively related to financial development, whereas the effect of financial development on scale efficiency is insignificant. Enlargement of bank operating scale improves bank efficiencies, but the efficiency gains decrease gradually as the scale increases. Increases in the capital adequacy ratio and the market power of loans lead to growth in bank efficiencies.
Keywords: Financial development, Operating scale, Efficiency, DEA.
5822 Comparative Study of Decision Trees and Rough Sets Theory as Knowledge Extraction Tools for Design and Control of Industrial Processes
Authors: Marcin Perzyk, Artur Soroczynski
Abstract:
General requirements for knowledge representation in the form of logic rules, applicable to the design and control of industrial processes, are formulated. The characteristic behavior of decision trees (DTs) and rough sets theory (RST) in extracting rules from recorded data is discussed and illustrated with simple examples. The significance of the models' drawbacks was evaluated using simulated and industrial data sets. It is concluded that the performance of DTs may be considerably poorer in several important respects compared to RST, particularly when not only a characterization of a problem is required, but also detailed and precise rules are needed for the actual, specific problems to be solved.
Keywords: Knowledge extraction, decision trees, rough sets theory, industrial processes.
5821 Urbanization and Income Inequality in Thailand
Authors: Acumsiri Tantiakrnpanit
Abstract:
This paper aims to examine the relationship between urbanization and income inequality in Thailand during the period 2002–2020, using a panel of data for 76 provinces collected from Thailand’s National Statistical Office (Labor Force Survey: LFS), as well as geospatial data from the U.S. Air Force Defense Meteorological Satellite Program (DMSP) and the Visible Infrared Imaging Radiometer Suite Day/Night band (VIIRS-DNB) satellite for 19 selected years. This paper employs two different definitions to identify urban areas: 1) Urban areas defined by Thailand's National Statistical Office (LFS), and 2) Urban areas estimated using nighttime light data from the DMSP and VIIRS-DNB satellite. The second method includes two sub-categories: 2.1) Determining urban areas by calculating nighttime light density with a population density of 300 people per square kilometer, and 2.2) Calculating urban areas based on nighttime light density corresponding to a population density of 1,500 people per square kilometer. The empirical analysis based on Ordinary Least Squares (OLS), fixed effects, and random effects models reveals a consistent U-shaped relationship between income inequality and urbanization. The findings from the econometric analysis demonstrate that urbanization or population density has a significant and negative impact on income inequality. Moreover, the square of urbanization shows a statistically significant positive impact on income inequality. Additionally, there is a negative association between logarithmically transformed income and income inequality. This paper also proposes the inclusion of satellite imagery, geospatial data, and spatial econometric techniques in future studies to conduct quantitative analysis of spatial relationships.
Keywords: Income inequality, nighttime light, population density, Thailand, urbanization.
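A minimal sketch of the pooled OLS specification behind the U-shaped finding, with urbanization and its square as regressors; the variable names and the statsmodels-based estimation are assumptions for illustration, and the fixed- and random-effects variants are not reproduced.

```python
import numpy as np
import statsmodels.api as sm

def inequality_ols(inequality, urbanization, log_income):
    """Regress income inequality on urbanization, urbanization^2 and log income."""
    X = np.column_stack([urbanization, urbanization**2, log_income])
    X = sm.add_constant(X)
    model = sm.OLS(inequality, X).fit()
    # U-shape: negative coefficient on urbanization, positive on its square
    return model.params, model.pvalues
```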
5820 Impact of Grade Sensitivity on Learning Motivation and Academic Performance
Authors: Salwa Aftab, Sehrish Riaz
Abstract:
The objective of this study was to examine the impact of grade sensitivity on the learning motivation and academic performance of students, to account for the differences that exist among students regarding the causes of their learning motivation, and to gain knowledge about this matter, since it has not been adequately researched. Data collection was primarily done through the academic sector of Pakistan and depended solely upon the responses given by students. A sample of 208 university students was selected. Both paper and online surveys were used to collect data from respondents. The results of the study revealed that grade sensitivity has a positive relationship with the learning motivation of students and their academic performance. These findings were obtained through systematic correlation and regression analysis.
Keywords: Academic performance, correlation, grade sensitivity, learning motivation, regression.
5819 3D Star Skeleton for Fast Human Posture Representation
Authors: Sungkuk Chun, Kwangjin Hong, Keechul Jung
Abstract:
In this paper, we propose an improved 3D star skeleton technique, a skeletonization suitable for human posture representation that reflects the 3D information of human posture. Moreover, the proposed technique is simple and can therefore be performed in real time. Existing skeleton construction techniques, such as distance transformation, Voronoi diagrams, and thinning, focus on the precision of skeleton information. Those techniques are therefore not applicable to real-time posture recognition, since they are computationally expensive and highly susceptible to boundary noise. Although a 2D star skeleton was proposed to address these problems, it also has limitations in describing the 3D information of the posture. To represent human posture effectively, the constructed skeleton should consider the 3D information of the posture. The proposed 3D star skeleton contains 3D data of the human body, and focuses on human action and posture recognition. Our 3D star skeleton uses 8 projection maps which have 2D silhouette information and depth data of the human surface. The extremal points can be extracted as the features of the 3D star skeleton without searching the whole boundary of the object. Therefore, in execution time, our 3D star skeleton is faster than the "greedy" 3D star skeleton that uses all boundary points on the surface. Moreover, our method can offer a more accurate skeleton of the posture than the existing star skeleton, since the 3D data of the object are considered. Additionally, we build a codebook, a collection of representative 3D star skeletons for 7 postures, to recognize the posture of a constructed skeleton.
Keywords: computer vision, gesture recognition, skeletonization, human posture representation.
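A simplified 2D illustration of the star-skeleton idea referenced above, where extremal points are local maxima of the centroid-to-boundary distance; the full method additionally uses eight projection maps with depth data, which this sketch does not cover, and the ordered-contour input and smoothing window are assumptions.

```python
import numpy as np

def star_skeleton_2d(contour, smooth=11):
    """Extremal points of a 2D silhouette: local maxima of the distance from
    the silhouette centroid to its boundary points (ordered along the contour)."""
    pts = np.asarray(contour, float)             # (n, 2) boundary points, ordered
    centroid = pts.mean(axis=0)
    d = np.linalg.norm(pts - centroid, axis=1)   # centroid-to-boundary distance signal
    kernel = np.ones(smooth) / smooth            # moving-average smoothing (circular pad)
    ds = np.convolve(np.r_[d[-smooth:], d, d[:smooth]], kernel, mode="same")[smooth:-smooth]
    maxima = [i for i in range(len(ds))
              if ds[i] >= ds[i - 1] and ds[i] >= ds[(i + 1) % len(ds)]]
    return centroid, pts[maxima]                 # skeleton = centroid plus extremal points
```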
5818 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective
Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou
Abstract:
The analysis of geographic inequality heavily relies on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and link to administrative units, point-based datasets are spatially aggregated to area-based statistical datasets, where only the overall status of the selected levels of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the analyzed results, a problem well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on spatial partition principles that seek homogeneity in the number of persons and households. Compared to the traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select an appropriate dissemination level for publishing statistical data. This paper compares the results of using TGSC and the township unit on mortality data and examines the spatial characteristics of their outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) ranges from 571 to 1757 per 100,000 persons at the township level, whereas the 2nd dissemination area (TGSC) shows greater variation, ranging from 0 to 2222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment). The management and analysis of the statistical data referring to the TGSC in this research is strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow is developed that consists of the processing of death certificates, the geocoding of street addresses, the quality assurance of geocoded results, the automatic calculation of statistical measures, the standardized encoding of measures and the geo-visualization of statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of mortality data and justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways to avoid wrong decision making.
Keywords: Mortality map, spatial patterns, statistical area, variation.
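A sketch of the all-cause age-standardized death rate (ASDR) quoted above: age-specific rates are weighted by a standard population and expressed per 100,000. The age grouping and the standard-population weights are placeholders, not the ones used in the study.

```python
import numpy as np

def asdr(deaths_by_age, population_by_age, standard_pop_by_age):
    """Age-standardized death rate per 100,000: sum of age-specific rates
    weighted by the standard population's age shares."""
    deaths = np.asarray(deaths_by_age, float)
    pop = np.asarray(population_by_age, float)
    std = np.asarray(standard_pop_by_age, float)
    age_specific_rates = deaths / pop      # deaths per person in each age group
    weights = std / std.sum()              # standard-population age shares
    return float(np.sum(age_specific_rates * weights) * 100_000)
```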
5817 A New Approach for Image Segmentation using Pillar-Kmeans Algorithm
Authors: Ali Ridho Barakbah, Yasushi Kiyoki
Abstract:
This paper presents a new approach for image segmentation by applying the Pillar K-means algorithm. The segmentation process includes a new mechanism for clustering the elements of high-resolution images in order to improve precision and reduce computation time. The system applies K-means clustering to the image segmentation after optimization by the Pillar algorithm. The Pillar algorithm considers the placement of pillars, which should be located as far apart as possible from each other to withstand the pressure distribution of a roof, as analogous to the placement of centroids within the data distribution. This algorithm is able to optimize K-means clustering for image segmentation with respect to precision and computation time. It designates the initial centroid positions by calculating the accumulated distance metric between each data point and all previous centroids, and then selects the data points with the maximum distance as new initial centroids. In this way the algorithm distributes all initial centroids according to the maximum accumulated distance metric. This paper evaluates the proposed approach for image segmentation by comparing it with the K-means and Gaussian Mixture Model algorithms over the RGB, HSV, HSL and CIELAB color spaces. The experimental results clarify the effectiveness of our approach in improving segmentation quality with respect to precision and computation time.
Keywords: Image segmentation, K-means clustering, Pillar algorithm, color spaces.
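A minimal sketch of the centroid-initialization rule described above, where each new centroid is the data point with the maximum accumulated distance to the centroids chosen so far; the first-centroid choice and the omission of the outlier and pressure checks of the full Pillar algorithm are simplifications.

```python
import numpy as np

def pillar_style_init(data, k):
    """Pick k initial centroids far apart, in the spirit of the Pillar algorithm:
    repeatedly take the point with the largest accumulated distance to all
    centroids selected so far (simplified: no outlier rejection)."""
    X = np.asarray(data, float)
    grand_mean = X.mean(axis=0)
    # first centroid: the point farthest from the grand mean (a simplifying choice)
    idx = [int(np.argmax(np.linalg.norm(X - grand_mean, axis=1)))]
    acc = np.zeros(len(X))
    while len(idx) < k:
        acc += np.linalg.norm(X - X[idx[-1]], axis=1)   # accumulate distance to newest centroid
        acc_masked = acc.copy()
        acc_masked[idx] = -np.inf                       # never reselect a chosen point
        idx.append(int(np.argmax(acc_masked)))
    return X[idx]                                       # seed centroids for standard K-means
```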