Search results for: missing data estimation
25493 Validation of SWAT Model for Prediction of Water Yield and Water Balance: Case Study of Upstream Catchment of Jebba Dam in Nigeria
Authors: Adeniyi G. Adeogun, Bolaji F. Sule, Adebayo W. Salami, Michael O. Daramola
Abstract:
Estimation of water yield and water balance in a river catchment is critical to the sustainable management of water resources at the watershed level in any country. In the present study, the Soil and Water Assessment Tool (SWAT), interfaced with a Geographical Information System (GIS), was applied to predict the water balance and water yield of a catchment area in Nigeria. The catchment area, which covers 12,992 km2, is located upstream of the Jebba hydropower dam in the north-central part of Nigeria. Observed flow data were collected and compared with the flow simulated by SWAT. The agreement between the two data sets was evaluated using statistical measures such as the Nash-Sutcliffe Efficiency (NSE) and the coefficient of determination (R2). The model output shows good agreement between observed and simulated flow, as indicated by NSE and R2 values greater than 0.7 for both the calibration and validation periods. A total of 42,733 mm of water was predicted by the calibrated model as the water yield potential of the basin for the simulation period 1985 to 2010. This performance suggests that SWAT is a promising tool for predicting water balance and water yield in the sustainable management of water resources, and that it could be applied to other basins in Nigeria as a decision support tool for sustainable water management.
Keywords: GIS, modeling, sensitivity analysis, SWAT, water yield, watershed level
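As a rough illustration of the goodness-of-fit measures cited in this abstract, the NSE and R2 between observed and simulated flows can be computed as in the following minimal Python sketch; the flow values are invented and not taken from the study.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit; values above 0.7 are
    commonly read as good agreement between observed and simulated flow."""
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

def coefficient_of_determination(observed, simulated):
    """R^2 taken as the squared Pearson correlation between the two series."""
    return np.corrcoef(observed, simulated)[0, 1] ** 2

# Illustrative monthly flows (m^3/s); real values would come from gauging records and SWAT output.
obs = np.array([120.0, 150.0, 90.0, 60.0, 200.0, 180.0])
sim = np.array([110.0, 160.0, 95.0, 55.0, 210.0, 170.0])
print(nash_sutcliffe(obs, sim), coefficient_of_determination(obs, sim))
```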
Procedia PDF Downloads 444
25492 Seismic Performance Assessment of Pre-70 RC Frame Buildings with FEMA P-58
Authors: D. Cardone
Abstract:
Past earthquakes have shown that seismic events may cause large economic losses in buildings. FEMA P-58 provides engineers with a practical tool for the seismic performance assessment of buildings. In this study, FEMA P-58 is applied to two typical Italian pre-1970 reinforced concrete frame buildings, characterized by plain rebars as steel reinforcement and by masonry infills and partitions. Given that suitable tools for these buildings are missing in FEMA P-58, specific fragility curves and loss functions are first developed. Next, building performance is evaluated following a time-based assessment approach. Finally, expected annual losses for the selected buildings are derived and compared with past applications to old RC frame buildings representative of the US building stock.
Keywords: FEMA P-58, RC frame buildings, plain rebars, masonry infills, fragility functions, loss functions, expected annual loss
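The fragility curves mentioned in the abstract are, in FEMA P-58 practice, commonly modelled as lognormal functions of an intensity measure; a minimal sketch with hypothetical median and dispersion values (not taken from the paper) is shown below.

```python
import numpy as np
from scipy.stats import norm

def fragility(im, median, beta):
    """Lognormal fragility: probability of reaching or exceeding a damage state
    given an intensity measure im (e.g., interstorey drift ratio)."""
    return norm.cdf(np.log(np.asarray(im, float) / median) / beta)

# Hypothetical parameters for a masonry-infill damage state (illustrative only).
drift = np.linspace(0.001, 0.02, 5)
print(fragility(drift, median=0.008, beta=0.5))
```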
Procedia PDF Downloads 326
25491 Semantic Data Schema Recognition
Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia
Abstract:
The subject covered in this paper aims at assisting users in their quality approach. The goal is to better extract, mix, interpret and reuse data. It deals with the semantic schema recognition of a data source. This enables the extraction of data semantics from all the available information, including the data and the metadata. It consists, firstly, of categorizing the data by assigning it to a category and possibly a sub-category, and secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and of the alternatives for correcting data. This approach allows automatic detection of a large number of syntactic and semantic anomalies.
Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies inter columns
Procedia PDF Downloads 419
25490 Robust Processing of Antenna Array Signals under Local Scattering Environments
Authors: Ju-Hong Lee, Ching-Wei Liao
Abstract:
An adaptive array beamformer is designed to automatically preserve the desired signals while cancelling interference and noise. Providing robustness against model mismatches and tracking possible environment changes calls for robust adaptive beamforming techniques. The design criterion yields the well-known generalized sidelobe canceller (GSC) beamformer. In practice, the knowledge of the desired steering vector can be imprecise, which often occurs due to estimation errors in the DOA of the desired signal or imperfect array calibration. In these situations, the signal of interest (SOI) is treated as interference, and the performance of the GSC beamformer is known to degrade. This undesired behavior results in a reduction of the array output signal-to-interference-plus-noise ratio (SINR). Therefore, it is worth developing robust techniques to deal with the problems caused by local scattering environments. As to the implementation of adaptive beamforming, the required computational complexity is enormous when the array beamformer is equipped with massive antenna array sensors. To alleviate this difficulty, a generalized sidelobe canceller (GSC) with partial adaptivity, offering fewer adaptive degrees of freedom and a faster adaptive response, has been proposed in the literature. Unfortunately, it has been shown that conventional GSC-based adaptive beamformers are usually very sensitive to the mismatch problems caused by local scattering situations. In this paper, we present an effective GSC-based beamformer that counters the mismatch problems mentioned above. The proposed GSC-based array beamformer adaptively estimates the actual direction of the desired signal by using the presumed steering vector and the received array data snapshots. We utilize the predefined steering vector and a presumed angle tolerance range to carry out the estimation required for obtaining an appropriate steering vector. A matrix associated with the direction vector of the signal sources is first created. Then projection matrices related to this matrix are generated and utilized to iteratively estimate the actual direction vector of the desired signal. As a result, the quiescent weight vector and the signal blocking matrix required for performing adaptive beamforming can be easily found. By utilizing the proposed GSC-based beamformer, we find that the performance degradation due to the considered local scattering environments can be effectively mitigated. To further enhance the beamforming performance, a signal subspace projection matrix is also introduced into the proposed GSC-based beamformer. Several computer simulation examples show that the proposed GSC-based beamformer outperforms the existing robust techniques.
Keywords: adaptive antenna beamforming, local scattering, signal blocking, steering mismatch
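A minimal sketch of the two GSC components named in the abstract, the quiescent weight vector and the signal blocking matrix, is given below; it assumes a simple uniform linear array and a presumed steering vector, and does not reproduce the authors' iterative direction estimation.

```python
import numpy as np

def gsc_components(steering):
    """Quiescent weight w_q = a / (a^H a) and a blocking matrix B whose columns
    span the orthogonal complement of the steering vector (so B^H a = 0)."""
    a = steering.reshape(-1, 1)
    w_q = a / (a.conj().T @ a)
    # Orthonormal basis of the null space of a^H obtained via the SVD.
    _, _, vh = np.linalg.svd(a.conj().T)
    B = vh[1:].conj().T            # N x (N-1) blocking matrix
    return w_q, B

# Presumed steering vector of an 8-element half-wavelength ULA, 10 degrees off broadside.
N, theta = 8, np.deg2rad(10.0)
a = np.exp(1j * np.pi * np.arange(N) * np.sin(theta))
w_q, B = gsc_components(a)
print(np.allclose(B.conj().T @ a, 0))   # the blocking matrix rejects the presumed direction
```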
Procedia PDF Downloads 116
25489 Access Control System for Big Data Application
Authors: Winfred Okoe Addy, Jean Jacques Dominique Beraud
Abstract:
Access control systems (ACs) are some of the most important components in safety areas. Inaccuracies in regulatory frameworks make personalized policies and remedies more appropriate than standard models or protocols. This problem is exacerbated by the increasing complexity of software, such as integrated Big Data (BD) software for controlling large volumes of encrypted data and resources embedded in a dedicated BD production system. This paper proposes a general access control strategy for the diffusion of Big Data domains, since it is crucial to secure the data provided to data consumers (DC). We present a general access control circulation strategy for the Big Data domain, describing the benefit of using designated access control for BD units and taking into consideration the performance requirements of BD and AC systems. We then present a generic Big Data access control system to improve the dissemination of Big Data.
Keywords: access control, security, Big Data, domain
Procedia PDF Downloads 137
25488 Tests for Zero Inflation in Count Data with Measurement Error in Covariates
Authors: Man-Yu Wong, Siyu Zhou, Zhiqiang Cao
Abstract:
In quality of life research, health service utilization is an important determinant of medical resource expenditure on colorectal cancer (CRC) care. A better understanding of increased utilization of health services is essential for optimizing the allocation of healthcare resources and thus for enhancing service quality, especially in regions with high expenditure on CRC care such as Hong Kong. In assessing the association between health-related quality of life (HRQOL) and health service utilization in patients with colorectal neoplasm, count data models can be used, which account for overdispersion or extra zero counts. In our data, the HRQOL evaluation is a self-reported measure obtained from a questionnaire completed by the patients, so misreports and variations in the data are inevitable. Besides, there are more zero counts in the observed number of clinical consultations (observed frequency of zero counts = 206) than expected from a Poisson distribution with mean equal to 1.33 (expected frequency of zero counts = 156). This suggests that an excess of zero counts may exist. Therefore, we study tests for detecting zero inflation in models with measurement error in covariates. Method: Under the classical measurement error model, the approximate likelihood function for the zero-inflated Poisson (ZIP) regression model can be obtained, and the Approximate Maximum Likelihood Estimate (AMLE) can be derived accordingly, which is consistent and asymptotically normally distributed. By calculating the score function and Fisher information based on the AMLE, a score test is proposed to detect the zero-inflation effect in the ZIP model with measurement error. The proposed test follows an asymptotically standard normal distribution under H0, and it is consistent with the test proposed for the zero-inflation effect when there is no measurement error. Results: Simulation results show that the empirical power of our proposed test is the highest among existing tests for zero inflation in the ZIP model with measurement error. In real data analysis, with or without considering measurement error in covariates, existing tests and our proposed test all imply that H0 should be rejected with a p-value less than 0.001, i.e., the zero-inflation effect is very significant and the ZIP model is superior to the Poisson model for analyzing these data. However, if measurement error in covariates is not considered, only one covariate is significant; if measurement error is considered, only another covariate is significant. Moreover, the direction of the coefficient estimates for these two covariates differs in the ZIP regression model with and without considering measurement error. Conclusion: In our study, compared to the Poisson model, the ZIP model should be chosen when assessing the association between condition-specific HRQOL and health service utilization in patients with colorectal neoplasm, and models taking measurement error into account will result in statistically more reliable and precise information.
Keywords: count data, measurement error, score test, zero inflation
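The excess-zero comparison quoted above can be reproduced with a short sketch; the sample size is back-calculated from the reported figures and is therefore an assumption, and the formal score test (which weighs this discrepancy against its asymptotic variance) is not shown.

```python
from scipy.stats import poisson

# Figures quoted in the abstract: mean consultation count 1.33 and 156 expected zeros,
# which implies roughly n = 156 / exp(-1.33) patients (n is inferred here, not reported).
lam, expected_zeros_reported, observed_zeros = 1.33, 156, 206
n = round(expected_zeros_reported / poisson.pmf(0, lam))
expected_zeros = n * poisson.pmf(0, lam)
print(n, round(expected_zeros), observed_zeros)   # observed zeros exceed the Poisson expectation
```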
Procedia PDF Downloads 289
25487 Understanding Chronic Pain: Missing the Mark
Authors: Rachid El Khoury
Abstract:
Chronic pain is perhaps the most burdensome health issue facing the planet. Our understanding of the pathophysiology of chronic pain has increased substantially over the past 25 years, including but not limited to changes in the brain. However, we still do not know why chronic pain develops in some people and not in others. Most of the recent developments in pain science that have direct relevance to clinical management relate to our understanding of the role of the brain, the role of the immune system, or the role of cognitive and behavioral factors. Although the biopsychosocial model of pain management was presented decades ago, the bio-reductionist model unfortunately remains at the heart of many practices across professional and geographic boundaries. A large body of evidence shows that nociception is neither sufficient nor necessary for pain. Pain is a conscious experience that can certainly be, and often is, associated with nociception; however, it is always modulated by countless neurobiological, environmental, and cognitive factors. This study clarifies current misconceptions about chronic pain concepts and their misperception by clinicians. It also attempts to bridge the considerable gap between what we already know about pain but somehow disregard, the developments in pain science, and clinical practice.
Keywords: chronic pain, nociception, biopsychosocial, neuroplasticity
Procedia PDF Downloads 65
25486 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier
Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh
Abstract:
This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems
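A minimal sketch of the general idea, merging TF-IDF features from report text with image-derived features and feeding them to a scikit-learn Random Forest, is shown below; the toy reports, labels and feature dimensions are invented and do not reflect the Indiana University dataset or the authors' pipeline.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier

reports = ["no acute cardiopulmonary abnormality",
           "right lower lobe opacity concerning for pneumonia",
           "clear lungs, no effusion",
           "cardiomegaly with pulmonary edema"]
labels = [0, 1, 0, 1]                       # 0 = normal, 1 = abnormal (toy labels)
image_features = np.random.rand(4, 8)       # stand-in for image-derived descriptors

text_features = TfidfVectorizer().fit_transform(reports).toarray()
X = np.hstack([text_features, image_features])   # merged text + image feature matrix

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
print(clf.predict(X[:1]))
```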
Procedia PDF Downloads 48
25485 A Data Envelopment Analysis Model in a Multi-Objective Optimization with Fuzzy Environment
Authors: Michael Gidey Gebru
Abstract:
Most Data Envelopment Analysis models operate in a static environment with input and output parameters that are given as deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work aims to extend crisp Data Envelopment Analysis to Data Envelopment Analysis in a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. The Data Envelopment Analysis model with a fuzzy environment is then solved using a multi-objective method to gauge the efficiency of the Decision Making Units. Finally, the developed Data Envelopment Analysis model is illustrated with an application to real data from 50 educational institutions.
Keywords: efficiency, Data Envelopment Analysis, fuzzy, higher education, input, output
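For reference, a crisp input-oriented CCR envelopment model can be solved as a linear program as sketched below with invented data; the fuzzy variant described in the abstract would replace each crisp value with triangular bounds and solve a multi-objective version of the same program.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, unit):
    """Input-oriented CCR efficiency of one DMU: minimise theta subject to
    sum_j lambda_j * x_j <= theta * x_unit and sum_j lambda_j * y_j >= y_unit."""
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    c = np.zeros(n + 1); c[0] = 1.0                      # variables: [theta, lambda_1..lambda_n]
    A_in = np.hstack([-X[unit].reshape(-1, 1), X.T])     # X^T lambda - theta * x_unit <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])          # -Y^T lambda <= -y_unit
    A = np.vstack([A_in, A_out])
    b = np.concatenate([np.zeros(m), -Y[unit]])
    bounds = [(None, None)] + [(0, None)] * n
    return linprog(c, A_ub=A, b_ub=b, bounds=bounds, method="highs").x[0]

# Toy data: 4 DMUs, 2 inputs, 1 output (e.g., staff and budget vs. graduates).
X = np.array([[20, 300.0], [30, 200.0], [40, 100.0], [20, 200.0]])
Y = np.array([[100.0], [80.0], [90.0], [95.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])
```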
Procedia PDF Downloads 64
25484 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring
Authors: Zheng Wang, Zhenhong Li, Jon Mills
Abstract:
Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring towards various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise from processing high temporal-resolution continuous GBSAR data, including the extreme cost of computational random-access-memory (RAM), the delay of displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and it processes continuous GBSAR images unit by unit. Images within a window form a basic unit. By taking this strategy, the RAM requirement is reduced to only one unit of images and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected as it keeps temporarily coherent pixels which are present only in some certain units but not in the whole observation period. The chain supports real-time processing of the continuous data and the delay of creating displacement maps can be shortened without waiting for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on a stack of images in a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence. The temporal-averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially-coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the level of a few sub-millimetres are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring of a wide range of scientific and practical applications.
Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring
Procedia PDF Downloads 164
25483 Stability Indicating RP-HPLC Method Development, Validation and Kinetic Study for Amiloride Hydrochloride and Furosemide in Pharmaceutical Dosage Form
Authors: Jignasha Derasari, Patel Krishna M, Modi Jignasa G.
Abstract:
Chemical stability of pharmaceutical molecules is a matter of great concern, as it affects the safety and efficacy of the drug product. Stability testing data provide the basis to understand how the quality of a drug substance and drug product changes with time under the influence of various environmental factors. Besides this, it also helps in selecting proper formulation and packaging as well as proper storage conditions and shelf life, which is essential for regulatory documentation. The ICH guideline states that stress testing is intended to identify the likely degradation products, which further helps in determining the intrinsic stability of the molecule, establishing degradation pathways, and validating the stability indicating procedures. A simple, accurate and precise stability indicating RP-HPLC method was developed and validated for the simultaneous estimation of Amiloride Hydrochloride and Furosemide in tablet dosage form. Separation was achieved on a Phenomenex Luna ODS C18 column (250 mm × 4.6 mm i.d., 5 µm particle size) using a mobile phase consisting of orthophosphoric acid: acetonitrile (50:50 %v/v) at a flow rate of 1.0 ml/min (pH 3.5 adjusted with 0.1% TEA in water) in isocratic pump mode, with an injection volume of 20 µl; the wavelength of detection was kept at 283 nm. Retention times for Amiloride Hydrochloride and Furosemide were 1.810 min and 4.269 min, respectively. Linearity of the proposed method was obtained in the ranges of 40-60 µg/ml and 320-480 µg/ml, and the correlation coefficients were 0.999 and 0.998 for Amiloride Hydrochloride and Furosemide, respectively. A forced degradation study was carried out on the combined dosage form under various stress conditions, such as hydrolysis (acid and base hydrolysis), oxidative and thermal conditions, as per ICH guideline Q2 (R1). The RP-HPLC method has shown adequate separation of Amiloride Hydrochloride and Furosemide from their degradation products. The proposed method was validated as per ICH guidelines for specificity, linearity, accuracy, precision and robustness for the estimation of Amiloride Hydrochloride and Furosemide in a commercially available tablet dosage form, and the results were found to be satisfactory and significant. The developed and validated stability indicating RP-HPLC method can be used successfully for marketed formulations. Forced degradation studies help in generating degradants in a much shorter span of time, mostly a few weeks, and can be used to develop the stability indicating method, which can be applied later for the analysis of samples generated from accelerated and long-term stability studies. Further, a kinetic study was also performed for different forced degradation parameters of the same combination, which helps in determining the order of reaction.
Keywords: amiloride hydrochloride, furosemide, kinetic study, stability indicating RP-HPLC method validation
Procedia PDF Downloads 467
25482 Integrating Nursing Informatics to Improve Patient-Centered Care: A Project to Reduce Patient Waiting Time at the Blood Pressure Counter
Authors: Pi-Chi Wu, Tsui-Ping Chu, Hsiu-Hung Wang
Abstract:
Background: The ability to provide immediate medical service in outpatient departments is one of the keys to patient satisfaction. Objectives: This project used electronic equipment to integrate nursing care information into patient care at a blood pressure diagnostic counter. Through process reengineering, the average patient waiting time decreased from 35 minutes to 5 minutes, while service satisfaction increased from a score of 2.7 to 4.6. Methods: Data were collected from a local hospital in Southern Taiwan with a daily average of 2,200 patients in the outpatient department. Previous waiting times were affected by (1) space limitations, (2) the need to help guide patient mobility, (3) the need for nurses to appease irate patients and give instructions, (4) the need for patients to replace lost counter tickets, (5) the need to re-enter information, and (6) the replacement of missing patient information. An ad hoc group was established to enhance patient satisfaction and shorten waiting times for patients to see a doctor. A four-step strategy consisting of (1) counter relocation, (2) queue reorganization, (3) electronic information integration, and (4) process reengineering was implemented. Results: Implementation of the developed strategy decreased patient waiting time from 35 minutes to an average of 5 minutes and increased patient satisfaction scores from 2.7 to 6.4. Conclusion: Through the integration of information technology and process transformation, waiting times were drastically reduced, patient satisfaction increased, and nurses were allowed more time to engage in more cost-effective services. This strategy was simultaneously enacted in separate hospitals throughout Taiwan.
Keywords: process reengineering, electronic information integration, patient satisfaction, patient waiting time
Procedia PDF Downloads 380
25481 A 3Y/3Y Pole-Changing Winding of High-Power Asynchronous Motors
Authors: Gábor Kovács
Abstract:
The requirement for pole-changing motors emerged at the very beginning of asynchronous motor design. Different solutions have been elaborated, and some of them are in general use. An alternative is the so-called 3Y/3Y pole-changing winding. This paper deals with the high-power application of this solution. A complete and comprehensive study is introduced, including features and design guidelines. The method presented in this paper is especially suitable for pole numbers that are close to each other. The study also reveals that the method is more advantageous than the existing solutions for high-power motors with a 1:3 pole ratio. Using this motor, a new and complete drive supply system has been proposed as the most appropriate arrangement for a high-power main naval propulsion drive. Further, the method makes it possible to extend the pole ratio to 1:6, 1:9, 1:12, etc. Finally, the proposal is further extended to the so far missing 1:4, 1:5, 1:7, etc. pole ratios. A complete proposal for the theoretically infinite range has been given in this way.
Keywords: induction motor, pole changing 3Y/3Y, pole phase modulation, pole changing 1:3, 1:6
Procedia PDF Downloads 169
25480 Estimating Solar Irradiance on a Tilted Surface Using Artificial Neural Networks with Differential Outputs
Authors: Hsu-Yung Cheng, Kuo-Chang Hsu, Chi-Chang Chan, Mei-Hui Tseng, Chih-Chang Yu, Ya-Sheng Liu
Abstract:
Photovoltaic modules are usually not installed horizontally, to avoid water or dust accumulation. However, measured irradiance data on tilted surfaces are rarely available, since installing pyranometers with various tilt angles incurs high costs. Therefore, estimating solar irradiance on tilted surfaces is an important research topic. In this work, artificial neural networks (ANNs) are utilized to construct a transfer model to estimate solar irradiance on tilted surfaces. Instead of predicting tilted irradiance directly, the proposed method estimates the differences between the horizontal irradiance and the irradiance on a tilted surface. The outputs of the ANNs in the proposed design are differential values. The experimental results have shown that the proposed ANNs with differential outputs can substantially improve the estimation accuracy compared to ANNs that estimate the tilted irradiance directly.
Keywords: photovoltaics, artificial neural networks, tilted irradiance, solar energy
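A minimal sketch of the differential-output idea, using synthetic data and a small scikit-learn MLP in place of the authors' networks, is shown below; the model is trained on the difference (tilted minus horizontal) and the horizontal irradiance is added back at prediction time.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 500
horizontal = rng.uniform(100, 1000, n)              # measured horizontal irradiance (W/m^2)
solar_elev = rng.uniform(10, 80, n)                 # solar elevation angle (degrees)
tilted = horizontal * (0.8 + 0.4 * np.sin(np.deg2rad(solar_elev)))   # synthetic "ground truth"

X = np.column_stack([horizontal, solar_elev])
diff = tilted - horizontal                          # the differential target

model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0).fit(X, diff)
tilted_estimate = horizontal + model.predict(X)     # reconstruct the tilted irradiance
print(np.mean(np.abs(tilted_estimate - tilted)))    # mean absolute error on the toy data
```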
Procedia PDF Downloads 400
25479 Detecting Heartbeat Architectural Tactic in Source Code Using Program Analysis
Authors: Ananta Kumar Das, Sujit Kumar Chakrabarti
Abstract:
Architectural tactics such as heartbeat, ping-echo, encapsulate, and encrypt-data are techniques used to achieve quality attributes of a system. Detecting architectural tactics has several benefits: it can aid system comprehension (e.g., of legacy systems) and the estimation of quality attributes such as safety, security, maintainability, etc. Architectural tactics are typically spread over the source code and are implicit. For large codebases, manual detection is often not feasible. Therefore, there is a need for automated methods for the detection of architectural tactics. This paper presents a formalization of the heartbeat architectural tactic and a program-analytic approach to detect this tactic in source code. The proposed method is evaluated on a set of Java applications. The outcome of the experiment strongly suggests that the method compares well with a manual approach in terms of its sensitivity and specificity, and far surpasses a manual exercise in terms of its scalability.
Keywords: software architecture, architectural tactics, detecting architectural tactics, program analysis, AST, alias analysis
Procedia PDF Downloads 162
25478 Sparsity Order Selection and Denoising in Compressed Sensing Framework
Authors: Mahdi Shamsi, Tohid Yousefi Rezaii, Siavash Eftekharifar
Abstract:
Compressed sensing (CS) is a new and powerful mathematical theory concentrating on sparse signals, which is widely used in signal processing. The main idea is to sense sparse signals with far fewer measurements than the Nyquist sampling rate requires, but the reconstruction process becomes nonlinear and more complicated. A common dilemma in sparse signal recovery in CS is the lack of knowledge about the sparsity order of the signal, which can be viewed as a model order selection procedure. In this paper, we address the problem of sparsity order estimation in sparse signal recovery. This is of main interest in situations where the signal sparsity is unknown or the signal to be recovered is only approximately sparse. It is shown that the proposed method also leads to a kind of signal denoising when the observations are contaminated with noise. Finally, the performance of the proposed approach is evaluated in different scenarios and compared to an existing method, which shows the effectiveness of the proposed method in terms of order selection as well as denoising.
Keywords: compressed sensing, data denoising, model order selection, sparse representation
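One common way to couple sparse recovery with sparsity order estimation is a greedy pursuit whose stopping rule doubles as the order estimate; the following sketch (orthogonal matching pursuit with a rough noise-floor threshold, not the authors' method) illustrates the idea with invented sizes and thresholds.

```python
import numpy as np

def omp_order_estimate(A, y, noise_std, max_order=20):
    """Greedy OMP recovery; the iteration at which the residual drops to the
    noise floor is taken as the estimated sparsity order."""
    residual, support = y.copy(), []
    threshold = noise_std * np.sqrt(len(y))          # rough noise-floor threshold
    for _ in range(max_order):
        if np.linalg.norm(residual) <= threshold:
            break
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        x_ls, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_ls
    return len(support), support

rng = np.random.default_rng(1)
m, n, k, sigma = 64, 256, 5, 0.01
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n); x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x + sigma * rng.standard_normal(m)
print(omp_order_estimate(A, y, sigma)[0])            # ideally close to the true order k = 5
```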
Procedia PDF Downloads 484
25477 Fatigue Life Prediction under Variable Loading Based on a Non-Linear Energy Model
Authors: Aid Abdelkrim
Abstract:
A method of fatigue damage accumulation based upon the application of energy parameters of the fatigue process is proposed in the paper. The model is simple to use: it has no parameters to be determined and requires only the knowledge of the W–N curve (W: strain energy density; N: number of cycles at failure) determined from the experimental Wöhler curve. To examine the performance of the proposed nonlinear models in the estimation of fatigue damage and fatigue life of components under random loading, a batch of specimens made of 6082-T6 aluminium alloy has been studied, and some of the results are reported in the present paper. The paper describes an algorithm and suggests a fatigue cumulative damage model, especially for the case of random loading. This work contains the results of uniaxial random-load fatigue tests with different mean and amplitude values performed on 6082-T6 aluminium alloy specimens. The proposed model has been formulated to take into account the damage evolution at different load levels, and it allows the effect of the loading sequence to be included by means of a recurrence formula derived for multilevel loading, considering complex load sequences. It is concluded that the 'damaged stress interaction damage rule' proposed here allows a better fatigue damage prediction than the widely used Palmgren–Miner rule, and that a formula derived for random fatigue can be used to predict the fatigue damage and fatigue lifetime very easily. The results obtained by the model are compared with the experimental results and with those calculated by the fatigue damage model most widely used in fatigue (Miner's model). The comparison shows that the proposed model presents a good estimation of the experimental results. Moreover, the error is minimized in comparison to Miner's model.
Keywords: damage accumulation, energy model, damage indicator, variable loading, random loading
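The following sketch contrasts the linear Palmgren–Miner sum with a generic nonlinear, sequence-dependent damage indicator built on an assumed power-law W–N curve; the constants and the accumulation exponent are illustrative only and do not reproduce the model proposed in the paper.

```python
# Assumed power-law W-N curve: N(W) = C * W**(-b), with illustrative constants.
C, b = 1.0e6, 2.0
def cycles_to_failure(W):                        # W: strain energy density per cycle (arbitrary units)
    return C * W ** (-b)

# A two-level loading block: n_i cycles applied at energy level W_i.
levels = [(2.0, 5_000), (1.0, 50_000)]           # (W_i, n_i)

miner = sum(n / cycles_to_failure(W) for W, n in levels)    # linear Palmgren-Miner sum

# A simple nonlinear, sequence-dependent accumulation: damage from earlier blocks is
# converted into equivalent cycles at the next level before the new cycles are added.
damage, exponent = 0.0, 1.5                      # exponent > 1 gives a nonlinear indicator (assumed)
for W, n in levels:
    Nf = cycles_to_failure(W)
    n_equivalent = damage ** (1.0 / exponent) * Nf
    damage = ((n_equivalent + n) / Nf) ** exponent

print(round(miner, 3), round(damage, 3))         # failure is predicted when the indicator reaches 1
```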
Procedia PDF Downloads 397
25476 Predicting Shot Making in Basketball Learnt from Adversarial Multiagent Trajectories
Authors: Mark Harmon, Abdolghani Ebrahimi, Patrick Lucey, Diego Klabjan
Abstract:
In this paper, we predict the likelihood of a player making a shot in basketball from multiagent trajectories. Previous approaches to similar problems center on hand-crafting features to capture domain-specific knowledge. Although intuitive, this approach is prone to missing important predictive features, as recent work in deep learning has shown. To circumvent this issue, we present a convolutional neural network (CNN) approach in which we initially represent the multiagent behavior as an image. To encode the adversarial nature of basketball, we use a multichannel image, which we then feed into a CNN. Additionally, to capture the temporal aspect of the trajectories, we use "fading." We find that this approach is superior to a traditional feed-forward network (FFN) model. By using gradient ascent, we were able to discover what the CNN filters look for during training. Last, we find that a combined FFN+CNN is the best performing network, with an error rate of 39%.
Keywords: basketball, computer vision, image processing, convolutional neural network
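A minimal sketch of how multiagent trajectories can be rasterized into a multichannel image with fading (older positions drawn at lower intensity), ready to be fed to a CNN, is given below; the court dimensions, channel layout, and toy frames are illustrative assumptions rather than the authors' exact representation.

```python
import numpy as np

def trajectories_to_image(frames, height=50, width=94):
    """frames: list of dicts mapping 'offense', 'defense', 'ball' to lists of (x, y)
    court positions for one time step. Returns a (3, height, width) fading image."""
    img = np.zeros((3, height, width), dtype=np.float32)
    T = len(frames)
    for t, frame in enumerate(frames):
        intensity = (t + 1) / T                       # older frames fade toward zero
        for ch, key in enumerate(("offense", "defense", "ball")):
            for x, y in frame[key]:
                col = min(int(x / 94.0 * (width - 1)), width - 1)
                row = min(int(y / 50.0 * (height - 1)), height - 1)
                img[ch, row, col] = max(img[ch, row, col], intensity)
    return img

# Two toy frames with a single offensive player, one defender, and the ball.
frames = [{"offense": [(10.0, 25.0)], "defense": [(12.0, 25.0)], "ball": [(10.5, 25.0)]},
          {"offense": [(14.0, 26.0)], "defense": [(13.0, 25.5)], "ball": [(14.2, 26.0)]}]
print(trajectories_to_image(frames).shape)            # (3, 50, 94)
```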
Procedia PDF Downloads 156
25475 Characterization of Agroforestry Systems in Burkina Faso Using an Earth Observation Data Cube
Authors: Dan Kanmegne
Abstract:
Africa will become the most populated continent by the end of the century, with around 4 billion inhabitants. Food security and climate change will become continental issues, since agricultural practices depend on climate but also contribute to global emissions and land degradation. Agroforestry has been identified as a cost-efficient and reliable strategy to address these two issues. It is defined as the integrated management of trees and crops/animals in the same land unit. Agroforestry provides benefits in terms of goods (fruits, medicine, wood, etc.) and services (windbreaks, fertility, etc.), and is acknowledged to have great potential for carbon sequestration; therefore, it can be integrated into mechanisms for the reduction of carbon emissions. Particularly in sub-Saharan Africa, the constraint lies in the lack of information about both the areas under agroforestry and the characterization (composition, structure, and management) of each agroforestry system at the country level. This study describes and quantifies "what is where?" as a first step towards the quantification of carbon stock in the different systems. Remote sensing (RS) is the most efficient approach to map such a dynamic practice as agroforestry, since it gives relatively adequate and consistent information over a large area at nearly no cost. RS data fulfil the good practice guidelines of the Intergovernmental Panel on Climate Change (IPCC) that are to be used in carbon estimation. Satellite data are becoming more and more accessible, and the archives are growing exponentially. To retrieve useful information that supports decision-making out of this large amount of data, satellite data need to be organized so as to ensure fast processing, quick accessibility, and ease of use. A new solution is a data cube, which can be understood as a multi-dimensional stack (space, time, data type) of spatially aligned pixels used for efficient access and analysis. A data cube for Burkina Faso has been set up through the cooperation project between the international service provider WASCAL and Germany, which provides an accessible exploitation architecture for multi-temporal satellite data. The aim of this study is to map and characterize agroforestry systems using the Burkina Faso earth observation data cube. The approach in its initial stage is based on an unsupervised image classification of a normalized difference vegetation index (NDVI) time series from 2010 to 2018, to stratify the country based on the vegetation. Fifteen strata were identified, and four samples per location were randomly assigned to define the sampling units. For safety reasons, the northern part will not be part of the fieldwork. A total of 52 locations will be visited by the end of the dry season in February-March 2020. The field campaigns will consist of identifying and describing different agroforestry systems and of qualitative interviews. A multi-temporal supervised image classification will be done with a random forest algorithm, and the field data will be used both for training the algorithm and for accuracy assessment. The expected outputs are (i) map(s) of agroforestry dynamics, (ii) characteristics of different systems (main species, management, area, etc.), and (iii) an assessment report of the Burkina Faso data cube.
Keywords: agroforestry systems, Burkina Faso, earth observation data cube, multi-temporal image classification
Procedia PDF Downloads 150
25474 The Educational, Social and Cultural Significance of Boys Choirs
Authors: Johannes Van Der Sandt
Abstract:
Worldwide, there are many boys choirs, but the Drakensberg Boys Choir is one of only a few of its kind: selected from a residential boys choir school that uses choral music as a significant vehicle for holistic education. With ongoing debates as to whether single-gender education is advantageous for boys, and research on the problem of missing males in choirs, this presentation's purpose is to explore the perceived benefits and values for boys singing in the world-renowned Drakensberg Boys Choir, and to establish educational grounds for the existence of boys choirs. Semi-structured questionnaires were given to choristers, known as Drakies, to ascertain their perceptions of their choir membership. Their experiences are noted in terms of musical, social and behavioral skills gained. The main emerging themes in each category are discussed in order to support the claim that boys choirs exist not only to entertain, nor are their goals purely musical or pedagogical, but that they can be regarded as unique cultural artifacts that aid boys' development into well-equipped and well-rounded young men.
Keywords: boys, choirs, choral, education, skills, values
Procedia PDF Downloads 203
25473 Metrology in Egyptian Architecture, Interrelation with Archaeology
Authors: Monica M. Marcos
Abstract:
Within the framework of archaeological research, heritage conservation and restoration, the object of study is metrology as applied in the composition of religious architecture in ancient Egypt, and its usefulness in archaeology. The objective is the determination of the geometric and metrological relations in architectural models and of the module used in the initial project of the buildings. The study and data collection of religious buildings, tombs and temples of ancient Egypt is completed with plans. The systematization of measurements and the modulation of buildings make it possible to establish common compositional parameters, with a module determined by the measurement unit used. The measurement system corresponding to the main period of Egyptian history was the Egyptian royal cubit. The analysis of the units of measurement used in architectural design provides exact figures for the dimensions of buildable spaces. It allows proportional relationships to be established between them, and a geometric composition module to be found, on which the original project was based. This responds to a philosophical and functional concept of the projected spaces. In the field of heritage rehabilitation and restoration, knowledge of metrology helps in the excavation, reconstruction and restoration of construction elements. The correct use of metrology contributes to the identification of possible work areas, helping to locate the damaged or missing areas. Also, in restoration projects, metrology is useful for reordering and locating decontextualized parts of buildings. The conversion of measurements taken in the current International System into the ancient Egyptian measurements allows their conceptual purpose and functionality to be understood, which makes archaeological intervention easier to carry out. In work carried out in archaeological excavations, metrology is an essential tool for locating sites and establishing work zones.
Keywords: egyptology, metrology, archaeology, measurements, Egyptian cubit
Procedia PDF Downloads 27
25472 Design and Implementation of Testable Reversible Sequential Circuits Optimized Power
Authors: B. Manikandan, A. Vijayaprabhu
Abstract:
Conservative reversible gates are used to design reversible sequential circuits. The sequential circuits considered are flip-flops and latches. The conservative logic gates used are the Feynman, Toffoli, and Fredkin gates. We present the design of two-vector testable sequential circuits based on conservative logic gates. All sequential circuits based on conservative logic gates can be tested for classical unidirectional stuck-at faults using only two test vectors: all 1s and all 0s. The designs of two-vector testable latches, master-slave flip-flops, and double edge triggered (DET) flip-flops are presented. We also show the application of the proposed approach toward 100% fault coverage for single missing/additional cell defects in the quantum-dot cellular automata (QCA) layout of the Fredkin gate. The conservative logic gate designs are also evaluated in terms of complexity, speed, and area.
Keywords: DET, QCA, reversible logic gates, POS, SOP, latches, flip flops
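A small sketch of the Fredkin (controlled-swap) gate illustrates the conservative property and the role of the all-0s/all-1s test vectors; it is a behavioural simulation only, not a QCA layout or the paper's circuit designs.

```python
from itertools import product

def fredkin(c, a, b):
    """Fredkin gate: if the control c is 1 the two targets are swapped, otherwise passed through."""
    return (c, b, a) if c == 1 else (c, a, b)

# Conservative property: every output has the same Hamming weight (number of 1s) as its input.
print(all(sum(fredkin(*bits)) == sum(bits) for bits in product((0, 1), repeat=3)))

# Two-vector test idea: the all-0s and all-1s patterns propagate unchanged, so any
# stuck-at-0 or stuck-at-1 line shows up on one of the two test vectors.
print(fredkin(0, 0, 0), fredkin(1, 1, 1))    # (0, 0, 0) and (1, 1, 1)
```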
Procedia PDF Downloads 307
25471 Evaluating the Implementation of a Quality Management System in the COVID-19 Diagnostic Laboratory of a Tertiary Care Hospital in Delhi
Authors: Sukriti Sabharwal, Sonali Bhattar, Shikhar Saxena
Abstract:
Introduction: The COVID-19 molecular diagnostic laboratory is the cornerstone of COVID-19 diagnosis, as the patient's treatment and management protocol depend on the molecular results. For this purpose, it is extremely important that the laboratory producing these results adheres to quality management processes to increase the accuracy and validity of the reports generated. We started our own molecular diagnostic setup at the onset of the pandemic and therefore conducted this study to generate our quality management data and help us improve on our weak points. Materials and Methods: A total of 14561 samples were evaluated by a retrospective observational method. The quality variables analysed were classified into pre-analytical, analytical, and post-analytical variables, and the results are presented as percentages. Results: Among the pre-analytical variables, sample leakage was the most common cause of rejection of samples (134/14561, 0.92%), followed by non-generation of SRF ID (76/14561, 0.52%) and non-compliance with triple packaging (44/14561, 0.3%). The other pre-analytical aspects assessed were incomplete patient identification (17/14561, 0.11%), insufficient quantity of sample (12/14561, 0.08%), missing forms/samples (7/14561, 0.04%), samples in the wrong vials/empty VTM tubes (5/14561, 0.03%), and LIMS entry not done (2/14561, 0.01%). We were unable to obtain internal quality control in 0.37% of samples (55/14561). We also experienced two incidents of cross-contamination among the samples, resulting in false-positive results. Among the post-analytical factors, a total of 0.07% of samples (11/14561) could not be dispatched within the stipulated time frame. Conclusion: Adherence to quality control processes is foremost for the smooth running of any diagnostic laboratory, especially those involved in critical reporting. The indicators not only help keep laboratory parameters in check but also allow comparison with other laboratories.
Keywords: laboratory quality management, COVID-19, molecular diagnostics, healthcare
Procedia PDF Downloads 167
25470 Performance Evaluation of a Small Microturbine Cogeneration Functional Model
Authors: Jeni A. Popescu, Sorin G. Tomescu, Valeriu A. Vilag
Abstract:
The paper focuses on potential methods of increasing the performance of a microturbine by combining additional elements available for utilization in a cogeneration plant. The activity is carried out within the framework of a project aiming to develop, manufacture, and test a microturbine functional model with high potential for utilization in the energy industry. The main goal of the analysis is to determine the parameters of the fluid flow passing through each section of the turbine, based on the limited data available in the literature for the targeted output power range or provided by experimental studies, starting from a reference cycle and considering different cycle options, including simple, intercooled, and recuperated options, in order to optimize the operation of a small cogeneration plant. The studied configurations operate under the same initial thermodynamic conditions and are based on a series of assumptions in terms of the individual performance of the components, pressure/velocity losses, compression ratios, and efficiencies. The thermodynamic analysis evaluates the expected performance of the microturbine cycle, while providing a series of input data and limitations to be included in the development of the experimental plan. To simplify the calculations and to allow a clear estimation of the effect of heat transfer between fluids, the working fluid for all the thermodynamic evolutions is, initially, air, with combustion modelled as simple heat addition to the system. The theoretical results, along with preliminary experimental results, are presented, aiming for a correlation in terms of microturbine performance.
Keywords: cogeneration, microturbine, performance, thermodynamic analysis
Procedia PDF Downloads 171
25469 Estimation of the Drought Index Based on the Climatic Projections of Precipitation of the Uruguay River Basin
Authors: José Leandro Melgar Néris, Claudinéia Brazil, Luciane Teresa Salvi, Isabel Cristina Damin
Abstract:
The impact of climate change is not a recent phenomenon. A key variable in the hydrological cycle is the occurrence and duration of drought, which has a significant impact on the socioeconomic, agricultural and environmental spheres. This study aims to characterize and quantify, based on climatic projections of precipitation, the rainy and dry events in the region of the Uruguay River Basin, through the Standardized Precipitation Index (SPI). The database consists of projections from the Coupled Model Intercomparison Project, Phase 5 (CMIP5), which provides climate prediction models organized according to the Representative Concentration Pathways (RCPs). Compared to the climatological normals of the Uruguay River Basin, the precipitation projections indicate that seasonal precipitation increases for all proposed scenarios, with a low climatic trend. Based on the data from this research, the idea is that this article can be used to support further studies and that the responsible bodies can use it as a basis for mitigation measures in other hydrographic basins.
Keywords: climate change, climatic model, dry events, precipitation projections
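For reference, the SPI computation itself follows a standard recipe: aggregate precipitation over the chosen time scale, fit a gamma distribution, and map the cumulative probability to a standard normal quantile. A simplified sketch with synthetic monthly totals (not the CMIP5 projections used in the study) is shown below.

```python
import numpy as np
from scipy.stats import gamma, norm

def spi(precip, scale=3):
    """SPI at a given aggregation scale (in time steps, e.g. months); negative values mean drier than normal."""
    series = np.convolve(precip, np.ones(scale), mode="valid")        # running sums
    nonzero = series[series > 0]
    shape, _, scl = gamma.fit(nonzero, floc=0)                        # fit gamma to non-zero sums
    q = np.count_nonzero(series == 0) / len(series)                   # probability mass at zero
    cdf = q + (1 - q) * gamma.cdf(series, shape, loc=0, scale=scl)
    return norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))                     # equiprobability transform to N(0, 1)

rng = np.random.default_rng(0)
monthly_precip = rng.gamma(shape=2.0, scale=60.0, size=240)           # 20 years of synthetic monthly totals (mm)
print(spi(monthly_precip)[:5].round(2))
```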
Procedia PDF Downloads 147
25468 The Economic Limitations of Defining Data Ownership Rights
Authors: Kacper Tomasz Kröber-Mulawa
Abstract:
This paper will address the topic of data ownership from an economic perspective, and examples of the economic limitations of data property rights will be provided, identified using the methods and approaches of the economic analysis of law. To properly build a background for the economic focus, a short overview of data and data ownership in the EU's legal system will first be provided. It will include a short introduction to their political and social importance and will highlight relevant viewpoints. This will stress the importance of a Single Market for data, but also the far-reaching regulations of data governance and privacy (including the distinction between personal and non-personal data, and between data held by public bodies and by private businesses). The main discussion of this paper will build upon the legal basis briefly referred to above, as well as the methods and approaches of the economic analysis of law.
Keywords: antitrust, data, data ownership, digital economy, property rights
Procedia PDF Downloads 87
25467 Protecting the Cloud Computing Data Through the Data Backups
Authors: Abdullah Alsaeed
Abstract:
Virtualized computing and cloud computing infrastructures are no longer buzzwords or marketing terms. They are a core reality in today's corporate Information Technology (IT) organizations. Hence, developing effective and efficient methodologies for data backup and data recovery is required more than ever. The purpose of data backup and recovery techniques is to assist organizations in strategizing their business continuity and disaster recovery approaches. In order to accomplish this strategic objective, a variety of mechanisms have been proposed in recent years. This research paper explores and examines the latest techniques and solutions for providing data backup and restoration for cloud computing platforms.
Keywords: data backup, data recovery, cloud computing, business continuity, disaster recovery, cost-effective, data encryption
Procedia PDF Downloads 90
25466 Infrastructure Change Monitoring Using Multitemporal Multispectral Satellite Images
Authors: U. Datta
Abstract:
The main objective of this study is to find a suitable approach to monitor land infrastructure growth over a period of time using multispectral satellite images. Bi-temporal change detection methods are unable to indicate continuous change occurring over a long period of time. To achieve this objective, the approach used here estimates a statistical model from a series of multispectral image data acquired over a long period of time, assuming there is no considerable change during that time period, and then compares it with the multispectral image data obtained at a later time. The change is estimated pixel-wise. A statistical composite hypothesis technique is used for pixel-based change detection in a defined region. The generalized likelihood ratio test (GLRT) is used to detect a changed pixel from the probabilistic model estimated for the corresponding pixel. The changed pixel is detected assuming that the images have been co-registered prior to estimation. To minimize errors due to co-registration, the 8-neighborhood pixels around the pixel under test are also considered. Multispectral images from Sentinel-2 and Landsat-8 from 2015 to 2018 are used for this purpose. There are different challenges in this method. The first and foremost challenge is to obtain a sufficiently large number of datasets for multivariate distribution modelling, since a large number of images are always discarded due to cloud coverage. Due to imperfect modelling, there will be a high probability of false alarms. The overall conclusion that can be drawn from this work is that the probabilistic method described in this paper has given some promising results, which need to be pursued further.
Keywords: co-registration, GLRT, infrastructure growth, multispectral, multitemporal, pixel-based change detection
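A pixel-wise sketch of the idea is given below: estimate a per-pixel Gaussian model from a change-free stack and threshold a Mahalanobis-type (GLRT-flavoured) statistic against a chi-square quantile. The 8-neighborhood handling and the real Sentinel-2/Landsat-8 data are omitted, and the toy values are invented.

```python
import numpy as np
from scipy.stats import chi2

def change_map(stack, new_image, alpha=0.01):
    """stack: (T, H, W, B) multispectral time series assumed change-free;
    new_image: (H, W, B) later acquisition. Returns a boolean change mask."""
    T, H, W, B = stack.shape
    mean = stack.mean(axis=0)                                   # per-pixel mean spectrum
    diff = stack - mean
    # Per-pixel covariance (B x B), slightly regularised to stay invertible.
    cov = np.einsum("thwi,thwj->hwij", diff, diff) / (T - 1) + 1e-6 * np.eye(B)
    d = new_image - mean
    stat = np.einsum("hwi,hwij,hwj->hw", d, np.linalg.inv(cov), d)   # Mahalanobis-type statistic
    return stat > chi2.ppf(1 - alpha, df=B)

rng = np.random.default_rng(0)
stack = rng.normal(0.3, 0.02, size=(24, 10, 10, 4))             # 24 dates, 10x10 pixels, 4 bands
new = stack.mean(axis=0); new[5, 5] += 0.2                      # inject a change at one pixel
print(np.argwhere(change_map(stack, new)))                      # ideally flags pixel (5, 5) only
```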
Procedia PDF Downloads 138
25465 Board Gender Diversity and Firm Sustainable Investment: An Empirical Evidence
Authors: Muhammad Atif, M. Samsul Alam
Abstract:
The purpose of this study is to investigate the effects of boardroom gender diversity on firm sustainable investment. We test the extent to which sustainable investment is affected by the presence of female directors on U.S. corporate boards. Using data on S&P 1500 indexed firms collected from Bloomberg and covering the period 2004-2016, we estimate a baseline model of the effects of boardroom gender diversity on firm sustainable investment. We find a positive relationship between board gender diversity and sustainable investment. We also find that boards with two or more women have a more pronounced impact on sustainable investment, consistent with critical mass theory. Female independent directors have a stronger impact on sustainable investment than female executive directors. Our findings are robust to different identification and estimation techniques. The study offers another perspective on the ongoing debate in the social responsibility literature about the accountability relationships between business and society.
Keywords: sustainable investment, gender diversity, environmental protection, social responsibility
Procedia PDF Downloads 165
25464 Development of Industry Sector Specific Factory Standards
Authors: Peter Burggräf, Moritz Krunke, Hanno Voet
Abstract:
Due to shortening product and technology lifecycles, many companies use standardization approaches in product development and factory planning to reduce costs and time to market. Unlike large companies, where modular systems are already widely used, small and medium-sized companies often show a much lower degree of standardization due to lower scale effects and missing capacities for the development of these standards. To overcome these challenges, the development of industry-sector-specific standards in cooperations or by third parties is an interesting approach. This paper analyzes which branches of German industry, mainly those dominated by small or medium-sized companies, might be especially interesting for the development of factory standards. For this purpose, a key-performance-indicator-based approach was developed, which is presented in detail together with its specific results for the German industry structure.
Keywords: factory planning, factory standards, industry sector specific standardization, production planning
Procedia PDF Downloads 397