Search results for: metrics
Comparative Usability Study of the Websites of Top Universities in Three Continents: A Case Study of the University of Cape Town, Oxford University, and Harvard University
Authors: Stephen Akuma, Racheal Aluma, Abraham Undu
Abstract:
Academic websites play an important role in promoting education for all. They allow universities to provide users with digital academic services to save time and resources. A university website is not only a cost-effective and timely way to communicate with a variety of stakeholders, such as students, faculty, and visitors, but it is also a vehicle for the university to shape its image. The quality of a website is a major factor that universities consider in cyberspace. Potential students can easily apply to universities whose websites provide useful and clear information. This has made the usability of websites an important area in meeting the needs and expectations of website users. In this paper, a comparative usability study of the University of Cape Town, Oxford University, and Harvard University academic websites (http://www.uct.ac.za/, https://www.ox.ac.uk/, and https://www.harvard.edu/) was carried out. The proactive user feedback technique was adopted for the comparative usability assessment of the aforementioned universities. The method was used by the researchers to collect and log records from the participants in real time. The results show that the average dwell times on the websites of Harvard University, Oxford University, and the University of Cape Town for the three tasks were 51.58, 33.28, and 54.82 seconds, respectively. The System Usability Scale (SUS) scores for Harvard, Oxford, and the University of Cape Town were 49.81, 69.43, and 54.14, respectively. The Analysis of Variance on the dwell time data shows a significant difference (p = .009) across the three websites. Our findings show that Oxford University has the most usable website in terms of usability factors and other metrics among the websites investigated. Practical implications are highlighted, and recommendations for improved website usability are suggested.
Keywords: Usability factors, user feedback, university websites, University of Cape Town, Harvard University, Oxford University.
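As a language-neutral illustration of the measures reported above, the short Python sketch below computes a System Usability Scale score from ten Likert-scale responses and runs a one-way ANOVA on per-participant dwell times. All numbers are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch (not the authors' code): computing SUS scores and a
# one-way ANOVA on dwell times, using hypothetical per-participant data.
from scipy import stats

def sus_score(responses):
    """responses: ten 1-5 Likert answers to the standard SUS items."""
    odd = sum(responses[i] - 1 for i in range(0, 10, 2))   # items 1,3,5,7,9
    even = sum(5 - responses[i] for i in range(1, 10, 2))  # items 2,4,6,8,10
    return (odd + even) * 2.5  # 0-100 scale

# Hypothetical dwell times (seconds) per participant for each website.
harvard = [48.2, 55.1, 50.7, 52.3]
oxford = [30.5, 35.2, 33.9, 33.5]
cape_town = [56.0, 52.1, 57.4, 53.8]

f_stat, p_value = stats.f_oneway(harvard, oxford, cape_town)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")
print("Example SUS score:", sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))
```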
Performance Analysis of Search Medical Imaging Service on Cloud Storage Using Decision Trees
Authors: González A. Julio, Ramírez L. Leonardo, Puerta A. Gabriel
Abstract:
Telemedicine services use a large amount of data, most of which are diagnostic images in Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) formats. Metadata is generated from each related image to support its identification. This study presents the use of decision trees to optimize information search processes for diagnostic images hosted on a cloud server. To analyze the performance of the server, the following quality of service (QoS) metrics are evaluated: delay, bandwidth, jitter, latency and throughput, in five test scenarios for a total of 26 experiments during the loading and downloading of DICOM images hosted by the telemedicine group server of the Universidad Militar Nueva Granada, Bogotá, Colombia. By applying decision trees as a data mining technique and comparing them with a sequential search, it was possible to evaluate the search times of diagnostic images on the server. The results show that by using the metadata in decision trees, the search times are substantially improved, the computational resources are optimized and the request management of the telemedicine image service is improved. Based on the experiments carried out, search efficiency increased by 45% relative to the sequential search, given that, when downloading a diagnostic image, false positives are avoided in the management and acquisition of that information. It is concluded that, for diagnostic image services in telemedicine, the decision tree technique guarantees accessibility and robustness in the acquisition and manipulation of medical images, improving diagnoses and medical procedures for patients.
Keywords: Cloud storage, decision trees, diagnostic image, search, telemedicine.
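The following Python sketch is only a schematic illustration of the idea described above: a decision tree trained on DICOM header metadata can answer a query directly instead of scanning records sequentially. The metadata fields, values, and series identifiers are invented for the example.

```python
# Minimal sketch (assumed data layout, not the study's code): a decision tree
# over DICOM metadata used to locate a series instead of scanning records.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Hypothetical metadata extracted from DICOM headers.
records = pd.DataFrame({
    "modality":  ["CT", "MR", "CT", "US", "MR", "US"],
    "body_part": ["CHEST", "HEAD", "ABDOMEN", "ABDOMEN", "SPINE", "PELVIS"],
    "series_id": [101, 102, 103, 104, 105, 106],
})
X = pd.get_dummies(records[["modality", "body_part"]])
y = records["series_id"]

tree = DecisionTreeClassifier().fit(X, y)

# Query: which series matches a CT chest study?
query = pd.get_dummies(pd.DataFrame({"modality": ["CT"], "body_part": ["CHEST"]}))
query = query.reindex(columns=X.columns, fill_value=0)
print("Predicted series:", tree.predict(query)[0])
```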
Accuracy of Peak Demand Estimates for Office Buildings Using eQUEST
Authors: Mahdiyeh Zafaranchi, Ethan S. Cantor, William T. Riddell, Jess W. Everett
Abstract:
The New Jersey Department of Military and Veteran’s Affairs (NJ DMAVA) operates over 50 facilities throughout the state of New Jersey, US. NJ DMAVA is under a mandate to move toward decarbonization, which will eventually include eliminating the use of natural gas and other fossil fuels for heating. At the same time, the organization requires increased resiliency regarding electric grid disruption. These competing goals necessitate the adoption of on-site renewables such as photovoltaic and geothermal power, as well as the implementation of power control strategies through microgrids. Planning for these changes requires a detailed understanding of current and future electricity use on yearly, monthly, and shorter time scales, as well as a breakdown of consumption by heating, ventilation, and air conditioning (HVAC) equipment. This paper discusses case studies of two buildings that were simulated using the QUick Energy Simulation Tool (eQUEST). Both buildings use electricity from the grid and photovoltaics. One building also uses natural gas. While electricity use data are available in hourly intervals and natural gas data are available in monthly intervals, the simulations were developed using monthly and yearly totals. This approach was chosen to reflect the information available for most NJ DMAVA facilities. The simulation results are then compared to metrics recommended by several organizations for validating energy use simulations. In addition to yearly and monthly totals, the simulated peak demands are compared to actual monthly peak demand values. The simulations resulted in monthly peak demand values that were within 30% of the measured values. These benchmarks will help to assess future energy planning efforts for NJ DMAVA.
Keywords: Building Energy Modeling, eQUEST, peak demand, smart meters.
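A minimal sketch of the benchmark check described above, using made-up monthly peak demand figures: each simulated value is compared against the measured value and flagged if the error exceeds 30%.

```python
# Simple sketch with made-up numbers: checking whether simulated monthly peak
# demands fall within 30% of measured values, as in the paper's benchmark.
measured_kw  = [410, 395, 430, 520, 610, 705, 740, 735, 640, 510, 430, 415]
simulated_kw = [455, 420, 400, 480, 650, 760, 700, 690, 600, 560, 470, 390]

for month, (m, s) in enumerate(zip(measured_kw, simulated_kw), start=1):
    err = 100 * (s - m) / m
    flag = "OK" if abs(err) <= 30 else "exceeds 30%"
    print(f"Month {month:2d}: measured {m} kW, simulated {s} kW, "
          f"error {err:+.1f}% ({flag})")
```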
An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia
Authors: Carol Anne Hargreaves
Abstract:
A key issue in stock investment is how to select representative features for stock selection. The first objective of this paper is to determine whether an automated stock investment system, using machine learning techniques, may be used to identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock’s price is likely to go up, and to identify the most important factors and their contribution to predicting the likelihood of the stock price going up. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks that was likely to go up in price – portfolio 1. Next, the principal component analysis technique was used to select stocks that were rated high on component one and component two – portfolio 2. Thirdly, a supervised machine learning technique, the logistic regression method, was used to select stocks with a high probability of their price going up – portfolio 3. The predictive models were validated with metrics such as sensitivity (recall), specificity and overall accuracy for all models. All accuracy measures were above 70%. All portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three stock portfolios and traded in the market for one month. After one month, the return for each stock portfolio was computed and compared with the stock market index returns. The returns were 23.87% for the principal component analysis stock portfolio, 11.65% for the logistic regression portfolio and 8.88% for the K-means cluster portfolio, while the stock market performance was 0.38%. This study confirms that an automated stock investment system using machine learning techniques can identify top performing stock portfolios that outperform the stock market.
Keywords: Machine learning, stock market trading, logistic regression, principal component analysis, automated stock investment system.
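The sketch below illustrates, on synthetic data, the three portfolio construction routes described in the abstract: k-means clustering, ranking on the first two principal components, and logistic regression probabilities of a price rise. It is not the author's system; the features, labels, and portfolio sizes are assumptions.

```python
# Hedged sketch (synthetic data, not the paper's dataset): three portfolio
# routes: k-means clustering, PCA ranking, and logistic regression probability
# of a price rise.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                 # hypothetical technical features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

# Portfolio 1: cluster stocks and keep the cluster with the best mean label.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
best_cluster = max(range(3), key=lambda c: y[clusters == c].mean())

# Portfolio 2: rank stocks on the first two principal components.
scores = PCA(n_components=2).fit_transform(X)
portfolio2 = np.argsort(scores[:, 0] + scores[:, 1])[-3:]

# Portfolio 3: keep stocks with the highest predicted probability of rising.
proba = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
portfolio3 = np.argsort(proba)[-3:]

print("Cluster chosen:", best_cluster,
      "| PCA picks:", portfolio2, "| LGR picks:", portfolio3)
```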
An Overview of Project Management Application in Computational Fluid Dynamics
Authors: Sajith Sajeev
Abstract:
The application of Computational Fluid Dynamics (CFD) is widespread in engineering and industry, including aerospace, automotive, and energy. CFD simulations necessitate the use of intricate mathematical models and a substantial amount of computational power to accurately describe the behavior of fluids. The implementation of CFD projects can be difficult, and a well-structured approach to project management is required to assure the timely and cost-effective delivery of high-quality results. The objective of this paper is to provide an overview of project management in CFD, including its problems, methodologies, and best practices. The study opens with a discussion of the difficulties connected with CFD project management, such as the complexity of the mathematical models, the need for extensive computational resources, and the difficulties associated with validating and verifying the results. In addition, the study examines the project management methodologies typically employed in CFD, such as the Traditional/Waterfall model, Agile, and Scrum. Comparisons are made between the advantages and disadvantages of each methodology, and suggestions are made for their effective implementation in CFD projects. The study concludes with a discussion of the best practices for project management in CFD, including the utilization of a well-defined project scope, a clear project plan, and effective teamwork. In addition, it highlights the significance of continuous process improvement and the utilization of metrics to monitor progress and discover improvement opportunities. This paper is intended as a resource for project managers, researchers, and practitioners who wish to implement efficient project management methods in CFD; it can aid in enhancing project outcomes, reducing risks, and increasing the productivity of CFD projects.
Keywords: Project management, Computational Fluid Dynamics, Traditional/Waterfall methodology, agile methodology, scrum methodology.
Maximization of Lifetime for Wireless Sensor Networks Based on Energy Efficient Clustering Algorithm
Authors: Frodouard Minani
Abstract:
Over the last decade, wireless sensor networks (WSNs) have been used in many areas such as health care, agriculture, defense, the military, and disaster-hit areas. A WSN consists of a Base Station (BS) and a number of wireless sensors that monitor temperature, pressure, and motion in different environmental conditions. The key parameter in designing a protocol for WSNs is energy efficiency, since energy is the scarcest resource of sensor nodes and determines their lifetime. Maximizing the sensor nodes’ lifetime is therefore an important issue in the design of applications and protocols for WSNs, and clustering sensor nodes is an effective topology control approach for achieving this goal. In this paper, the researcher presents an energy-efficient protocol to prolong the network lifetime based on an energy-efficient clustering algorithm. The Low Energy Adaptive Clustering Hierarchy (LEACH) is a cluster-based routing protocol used to lower energy consumption and improve the lifetime of WSNs. The proposed system maximizes the lifetime of the WSN by choosing the farthest cluster head (CH) instead of the closest CH and by forming clusters using parameters such as node density, residual energy, and inter-cluster distance. Comparisons between the proposed protocol and comparative protocols in different scenarios have been carried out, and the simulation results show that the proposed protocol performs well over the comparative protocols in various scenarios.
Keywords: Base station, clustering algorithm, energy efficient, wireless sensor networks.
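The snippet below is an illustrative sketch, not the proposed protocol itself: it scores candidate cluster heads by residual energy, local node density, and distance, in the spirit of the clustering criteria listed above. The node coordinates, energies, weights, and neighbourhood radius are all assumed values.

```python
# Illustrative sketch only (parameters and weights are assumptions): scoring
# candidate cluster heads by residual energy, node density, and distance.
import math

nodes = [  # (node_id, x, y, residual_energy_J)
    (1, 10, 20, 0.48), (2, 14, 22, 0.35), (3, 60, 80, 0.50), (4, 62, 78, 0.29),
]
base_station = (50, 50)

def density(node, radius=10):
    nid, x, y, _ = node
    return sum(1 for m in nodes
               if m[0] != nid and math.hypot(m[1] - x, m[2] - y) <= radius)

def score(node, w_energy=0.5, w_density=0.3, w_dist=0.2):
    _, x, y, energy = node
    dist = math.hypot(x - base_station[0], y - base_station[1])
    # Higher residual energy and density are favoured; distance is normalised.
    return w_energy * energy + w_density * density(node) - w_dist * dist / 100

cluster_head = max(nodes, key=score)
print("Selected cluster head:", cluster_head[0])
```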
Evaluation of the IMERG Product Performance at Estimating the Rainfall Properties in a Semi-Arid Region of Mexico
Authors: Eric Muñoz de la Torre, Julián González Trinidad, Efrén González Ramírez
Abstract:
Rain varies greatly in its duration, intensity, and spatial coverage, so it is important to have sub-daily rainfall data for various applications, including risk prevention; however, ground measurements are limited by the low and irregular density of rain gauges. An alternative is satellite precipitation products (SPPs), such as IMERG, which use passive microwave and infrared sensors to estimate rainfall; however, these SPPs have to be validated before their application. The aim of this study is to evaluate the performance of the IMERG (Integrated Multi-satellitE Retrievals for Global Precipitation Measurement) final run V06B SPP in a semi-arid region of Mexico, using sub-daily data from four rain gauges for October 2019 and June to September 2021. The Minimum Inter-event Time (MIT) criterion, with a dry period of 10 hrs, was used to separate unique rain events in order to evaluate the rainfall properties (depth, duration, and intensity). A point-to-pixel analysis was performed using continuous, categorical, and volumetric statistical metrics. Results show that IMERG is capable of estimating rainfall depth with a slight overestimation, but is unable to identify the real duration and intensity of the rain events, showing moderate overestimations and underestimations, respectively. The study zone presented 80 to 85% convective rain events, with the remainder being stratiform rain events, classified by the variation in depth magnitude between IMERG pixels and rain gauges. IMERG showed poorer performance at detecting the former, but performed well at estimating stratiform rain events originating from cold fronts.
Keywords: IMERG, rainfall, rain gauge, remote sensing, statistical evaluation.
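As an illustration of the event-separation step described above, the Python sketch below splits a hypothetical hourly rainfall record into unique events using a minimum inter-event time (MIT) of 10 hours and reports each event's depth, duration, and mean intensity.

```python
# Minimal sketch (hypothetical record): separating rain events with a 10-hour
# minimum inter-event time and computing depth, duration, and intensity.
rain_mm = [0, 0, 1.2, 3.4, 0.8, 0] + [0] * 10 + [0.5, 2.1, 0, 0]  # hourly record
MIT_HOURS = 10

events, current, dry = [], [], 0

def close_event(ev):
    while ev and ev[-1] == 0.0:   # drop trailing dry hours
        ev.pop()
    if ev:
        events.append(ev)

for value in rain_mm:
    if value > 0:
        current.append(float(value))
        dry = 0
    elif current:
        dry += 1
        current.append(0.0)
        if dry >= MIT_HOURS:      # dry spell long enough: event is over
            close_event(current)
            current, dry = [], 0
close_event(current)

for i, ev in enumerate(events, 1):
    depth, duration = sum(ev), len(ev)
    print(f"Event {i}: depth {depth:.1f} mm, duration {duration} h, "
          f"intensity {depth / duration:.2f} mm/h")
```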
Load Forecasting in Microgrid Systems with R and Cortana Intelligence Suite
Authors: F. Lazzeri, I. Reiter
Abstract:
Energy production optimization has traditionally been very important for utilities in order to improve resource consumption. However, load forecasting is a challenging task, as there are a large number of relevant variables that must be considered, and several strategies have been used to deal with this complex problem. This is especially true in microgrids, where many elements have to adjust their performance depending on future generation and consumption conditions. The goal of this paper is to present a solution for short-term load forecasting in microgrids, based on three machine learning experiments developed in R and web services built and deployed with different components of Cortana Intelligence Suite: Azure Machine Learning, a fully managed cloud service that enables users to easily build, deploy, and share predictive analytics solutions; SQL Database, a Microsoft database service for app developers; and Power BI, a suite of business analytics tools to analyze data and share insights. Our results show that the Boosted Decision Tree and Fast Forest Quantile regression methods can be very useful for predicting hourly short-term consumption in microgrids; moreover, we found that for these types of forecasting models, weather data (temperature, wind, humidity and dew point) can play a crucial role in improving the accuracy of the forecasting solution. The data cleaning and feature engineering methods performed in R and the different types of machine learning algorithms (Boosted Decision Tree, Fast Forest Quantile and ARIMA) are presented, and the results and performance metrics discussed.
Keywords: Time-series, feature engineering methods for forecasting, energy demand forecasting, Azure Machine Learning.
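The paper's experiments were built in R and Azure Machine Learning; purely as an illustration of a boosted-tree short-term load model driven by weather and calendar features, the following Python sketch trains and scores such a model on synthetic data.

```python
# Language-neutral sketch (the paper's models were built in R / Azure ML):
# a boosted-tree hourly load forecast with weather and calendar features,
# trained and scored on synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
hours = np.arange(24 * 60)                       # 60 days of hourly records
temp = 15 + 8 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)
humidity = rng.uniform(30, 90, hours.size)
load = 200 + 3 * temp + 0.5 * humidity + 20 * np.sin(2 * np.pi * hours / 24)
load += rng.normal(0, 5, hours.size)

X = np.column_stack([temp, humidity, hours % 24])
split = 24 * 53                                  # last week held out
model = GradientBoostingRegressor().fit(X[:split], load[:split])
pred = model.predict(X[split:])
print("Hold-out MAE:", round(mean_absolute_error(load[split:], pred), 2), "kW")
```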
Towards an Enhanced Quality of IPTV Media Server Architecture over Software Defined Networking
Authors: Esmeralda Hysenbelliu
Abstract:
The aim of this paper is to present an enhanced QoE (Quality of Experience) IPTV SDN-based media streaming server architecture for configuring, controlling, managing, and provisioning improved delivery of the IPTV service application with low cost, low bandwidth, and high security. Furthermore, a virtual QoE IPTV SDN-based topology is presented to provide an improved IPTV service based on QoE control and management of multimedia service functionalities. Inside the OpenFlow SDN controller, highly flexible and efficient service load-balancing systems are enabled, based on a load-balancing module and a GeoIP service. These two load-balancing systems greatly improve the IPTV end-users’ Quality of Experience (QoE) through optimal management of resources. Through the key functionalities of the OpenFlow SDN controller, this approach produces several important features and opportunities for meeting the critical QoE metrics for the IPTV service, such as achieving a fast zapping time (channel switching time) of less than 0.1 seconds. The approach also enables a powerful transcoding system via the FFmpeg encoder, with the ability to customize streaming dimensions, bitrates, latency management, and maximum transfer rates, ensuring delivery of IPTV streaming services (audio and video) with high flexibility, low bandwidth, and the required performance. Unlike other architectures, this QoE IPTV SDN-based media streaming architecture provides the possibility of channel exchange between several IPTV service providers all over the world. This new functionality brings many benefits, such as increasing the number of TV channels received by end-users at low cost, decreasing the stream failure time (channel failure time below 0.1 seconds), and improving the quality of the streaming services.
Keywords: Improved QoE, OpenFlow SDN controller, IPTV service application, softwarization.
Multi-Sensor Image Fusion for Visible and Infrared Thermal Images
Authors: Amit Kr. Happy
Abstract:
This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on infrared (IR) and visible image (VI) fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (IR) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image and a thermal IR camera acquires the thermal source image. In this paper, image fusion algorithms based upon the Multi-Scale Transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes an implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of levels for the MST and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the suggested method's validity. Experiments show that the proposed approach is capable of producing good fusion results. While deploying our image fusion approaches, we observed several challenges with popular image fusion methods: although their high computational cost and complex processing steps provide accurate fused results, they also make them hard to deploy in systems and applications that require real-time operation, high flexibility and low computational capability. The methods presented in this paper therefore offer good results with minimum time complexity.
Keywords: Image fusion, IR thermal imager, multi-sensor, Multi-Scale Transform.
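The proposed algorithms were implemented in MATLAB; the Python sketch below only illustrates the general multi-scale fusion idea, using a 2-D wavelet decomposition with an absolute-maximum rule on the detail coefficients and averaging of the approximation band. The images, wavelet, and decomposition level are placeholders, and the fusion rule is simpler than the region-based rule described above.

```python
# Hedged sketch (not the authors' implementation): basic multi-scale fusion of
# a visible and an IR image via a 2-D wavelet transform with a max-abs rule on
# detail coefficients and averaging of the approximation band.
import numpy as np
import pywt

rng = np.random.default_rng(0)
visible = rng.random((128, 128))      # stand-in for the visible-band image
infrared = rng.random((128, 128))     # stand-in for the thermal image

def fuse(img_a, img_b, wavelet="db1", level=3):
    ca = pywt.wavedec2(img_a, wavelet, level=level)
    cb = pywt.wavedec2(img_b, wavelet, level=level)
    fused = [(ca[0] + cb[0]) / 2]                      # average approximation
    for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in ((ha, hb), (va, vb), (da, db))))
    return pywt.waverec2(fused, wavelet)

fused_image = fuse(visible, infrared)
print("Fused image shape:", fused_image.shape)
```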
Rigorous Electromagnetic Model of Fourier Transform Infrared (FT-IR) Spectroscopic Imaging Applied to Automated Histology of Prostate Tissue Specimens
Authors: Rohith K Reddy, David Mayerich, Michael Walsh, P Scott Carney, Rohit Bhargava
Abstract:
Fourier transform infrared (FT-IR) spectroscopic imaging is an emerging technique that provides both chemically and spatially resolved information. The rich chemical content of data may be utilized for computer-aided determinations of structure and pathologic state (cancer diagnosis) in histological tissue sections for prostate cancer. FT-IR spectroscopic imaging of prostate tissue has shown that tissue type (histological) classification can be performed to a high degree of accuracy [1] and cancer diagnosis can be performed with an accuracy of about 80% [2] on a microscopic (≈ 6 μm) length scale. In performing these analyses, it has been observed that there is large variability (more than 60%) between spectra from different points on tissue that is expected to consist of the same essential chemical constituents. Spectra at the edges of tissues are characteristically and consistently different from chemically similar tissue in the middle of the same sample. Here, we explain these differences using a rigorous electromagnetic model for light-sample interaction. Spectra from FT-IR spectroscopic imaging of chemically heterogeneous samples are different from bulk spectra of individual chemical constituents of the sample. This is because spectra not only depend on chemistry, but also on the shape of the sample. Using coupled wave analysis, we characterize and quantify the nature of spectral distortions at the edges of tissues. Furthermore, we present a method of performing histological classification of tissue samples. Since the mid-infrared spectrum is typically assumed to be a quantitative measure of chemical composition, classification results can vary widely due to spectral distortions. However, we demonstrate that the selection of localized metrics based on chemical information can make our data robust to the spectral distortions caused by scattering at the tissue boundary.
Keywords: Infrared, spectroscopy, imaging, tissue classification.
Dynamic Web-Based 2D Medical Image Visualization and Processing Software
Authors: Abdelhalim. N. Mohammed, Mohammed. Y. Esmail
Abstract:
Over recent decades, medical imaging has been dominated by the use of costly film media for the review and archival of medical investigations; however, due to developments in network technologies and the common acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard, another approach based on the World Wide Web has been developed. Web technologies have been successfully used in telemedicine applications, and here web technologies are combined with DICOM to design a web-based, open-source DICOM viewer. The web server allows the query and retrieval of images, and the images are viewed and manipulated inside a web browser without the need to preinstall any software. The dynamic page for medical image visualization and processing was created using JavaScript and HTML5. The XAMPP ‘Apache server’ is used to create a local web server for testing and deployment of the dynamic site. The web-based viewer is connected to multiple devices through a local area network (LAN) to distribute the images inside healthcare facilities. The system offers several advantages over ordinary picture archiving and communication systems (PACS): it is easy to install and maintain, platform independent, allows images to be displayed and manipulated efficiently, and is user-friendly and easy to integrate with existing systems that already make use of web technologies. A wavelet-based image compression technique is applied, in which a 2-D discrete wavelet transform decomposes the image and the thresholded wavelet coefficients are transmitted after entropy encoding, to decrease transmission time and storage cost. Compression performance was estimated using image quality metrics such as the mean square error (MSE), peak signal-to-noise ratio (PSNR) and compression ratio (CR), which reached 83.86% when the ‘coif3’ wavelet filter was used.
Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN.
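As a rough illustration of the compression and quality-metric steps described above, the following Python sketch applies a 2-D 'coif3' wavelet decomposition to a synthetic image, hard-thresholds the coefficients, and reports MSE, PSNR, and the fraction of coefficients discarded. The image and threshold are arbitrary, and the metric definitions may differ in detail from those used in the paper.

```python
# Illustrative sketch (synthetic image, arbitrary threshold): 'coif3' wavelet
# decomposition, hard thresholding, and MSE / PSNR / coefficient-discard ratio.
import numpy as np
import pywt

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(256, 256)).astype(float)  # stand-in slice

coeffs = pywt.wavedec2(image, "coif3", level=3)
arr, slices = pywt.coeffs_to_array(coeffs)
threshold = np.percentile(np.abs(arr), 90)        # keep the largest 10 percent
arr_t = np.where(np.abs(arr) >= threshold, arr, 0.0)
reconstructed = pywt.waverec2(
    pywt.array_to_coeffs(arr_t, slices, output_format="wavedec2"),
    "coif3")[:256, :256]

mse = np.mean((image - reconstructed) ** 2)
psnr = 10 * np.log10(255 ** 2 / mse) if mse > 0 else float("inf")
discarded = 100 * (1 - np.count_nonzero(arr_t) / arr_t.size)
print(f"MSE {mse:.2f}, PSNR {psnr:.2f} dB, coefficients discarded {discarded:.1f}%")
```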
Development of Requirements Analysis Tool for Medical Autonomy in Long-Duration Space Exploration Missions
Authors: Lara Dutil-Fafard, Caroline Rhéaume, Patrick Archambault, Daniel Lafond, Neal W. Pollock
Abstract:
Improving resources for medical autonomy of astronauts in prolonged space missions, such as a Mars mission, requires not only technology development, but also decision-making support systems. The Advanced Crew Medical System - Medical Condition Requirements study, funded by the Canadian Space Agency, aimed to create knowledge content and a scenario-based query capability to support medical autonomy of astronauts. The key objective of this study was to create a prototype tool for identifying medical infrastructure requirements in terms of medical knowledge, skills and materials. A multicriteria decision-making method was used to prioritize the highest risk medical events anticipated in a long-term space mission. Starting with those medical conditions, event sequence diagrams (ESDs) were created in the form of decision trees where the entry point is the diagnosis and the end points are the predicted outcomes (full recovery, partial recovery, or death/severe incapacitation). The ESD formalism was adapted to characterize and compare possible outcomes of medical conditions as a function of available medical knowledge, skills, and supplies in a given mission scenario. An extensive literature review was performed and summarized in a medical condition database. A PostgreSQL relational database was created to allow query-based evaluation of health outcome metrics with different medical infrastructure scenarios. Critical decision points, skill and medical supply requirements, and probable health outcomes were compared across chosen scenarios. The three medical conditions with the highest risk rank were acute coronary syndrome, sepsis, and stroke. Our efforts demonstrate the utility of this approach and provide insight into the effort required to develop appropriate content for the range of medical conditions that may arise.
Keywords: Decision support system, event sequence diagram, exploration mission, medical autonomy, scenario-based queries, space medicine.
Electron Density Discrepancy Analysis of Energy Metabolism Coenzymes
Authors: Alan Luo, Hunter N. B. Moseley
Abstract:
Many macromolecular structure entries in the Protein Data Bank (PDB) have a range of regional (localized) quality issues, be it derived from X-ray crystallography, Nuclear Magnetic Resonance (NMR) spectroscopy, or other experimental approaches. However, most PDB entries are judged by global quality metrics like R-factor, R-free, and resolution for X-ray crystallography or backbone phi-psi distribution statistics and average restraint violations for NMR. Regional quality is often ignored when PDB entries are re-used for a variety of structurally based analyses. The binding of ligands, especially ligands involved in energy metabolism, is of particular interest in many structurally focused protein studies. Using a regional quality metric that provides chemically interpretable information from electron density maps, a significant number of outliers in regional structural quality was detected across X-ray crystallographic PDB entries for proteins bound to biochemically critical ligands. In this study, a series of analyses was performed to evaluate both specific and general potential factors that could promote these outliers. In particular, these potential factors were the minimum distance to a metal ion, the minimum distance to a crystal contact, and the isotropic atomic b-factor. To evaluate these potential factors, Fisher’s exact tests were performed, using regional quality criteria of outlier (top 1%, 2.5%, 5%, or 10%) versus non-outlier compared to a potential factor metric above versus below a certain outlier cutoff. The results revealed a consistent general effect from region-specific normalized b-factors but no specific effect from metal ion contact distances and only a very weak effect from crystal contact distance as compared to the b-factor results. These findings indicate that no single specific potential factor explains a majority of the outlier ligand-bound regions, implying that human error is likely as important as these other factors. Thus, all factors, including human error, should be considered when regions of low structural quality are detected. Also, the downstream re-use of protein structures for studying ligand-bound conformations should screen the regional quality of the binding sites. Doing so prevents misinterpretation due to the presence of structural uncertainty or flaws in regions of interest.
Keywords: Biomacromolecular structure, coenzyme, electron density discrepancy analysis, X-ray crystallography.
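The following sketch shows, with made-up counts, the kind of 2x2 Fisher's exact test described above: regional-quality outliers versus non-outliers, cross-tabulated against a potential factor above or below a cutoff.

```python
# Hedged sketch with made-up counts: a 2x2 Fisher's exact test contrasting
# regional-quality outliers against a potential factor (e.g. a normalized
# b-factor) above versus below a cutoff.
from scipy.stats import fisher_exact

#                       factor above cutoff   factor below cutoff
table = [[38, 12],    # outlier regions (e.g. top 5%)
         [410, 540]]  # non-outlier regions

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
```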
Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison
Authors: Xiangtuo Chen, Paul-Henry Cournéde
Abstract:
Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict the yield of corn based on meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their modeling methodologies. The model-driven approaches are based on crop mechanistic modeling; they describe crop growth in interaction with the environment as dynamical systems. However, calibrating such dynamical systems is difficult, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data and final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach to yield prediction is free of the complex biophysical process, but it has strict requirements regarding the dataset. A second contribution of the paper is the comparison of these model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso Regression, Principal Components Regression or Partial Least Squares Regression) and machine learning methods (Random Forest, k-Nearest Neighbor, Artificial Neural Network and SVM regression). The dataset consists of 720 records of corn yield at county scale provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, the root mean square error of prediction (RMSEP) and the mean absolute error of prediction (MAEP), were used to evaluate the crop prediction capacity. The results show that among the data-driven approaches, Random Forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method for calibrating the mechanistic model from easily accessible datasets offers several side perspectives: the mechanistic model can potentially help to underline the stresses suffered by the crop or to identify the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.
Keywords: Crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest.
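As an illustration of the data-driven evaluation protocol above, the Python sketch below runs 5-fold cross-validation of a Random Forest and reports RMSEP and MAEP. Synthetic records stand in for the USDA county yield and climate data.

```python
# Minimal data-driven sketch (synthetic records, not the USDA dataset):
# 5-fold cross-validation of a Random Forest with RMSEP and MAEP metrics.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
X = rng.normal(size=(720, 10))                       # climatic predictors
y = 8 + 0.7 * X[:, 0] - 0.4 * X[:, 3] + rng.normal(0, 0.3, 720)  # yield (t/ha)

rmsep, maep = [], []
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    pred = model.fit(X[train], y[train]).predict(X[test])
    rmsep.append(np.sqrt(np.mean((pred - y[test]) ** 2)))
    maep.append(np.mean(np.abs(pred - y[test])))

print(f"RMSEP {np.mean(rmsep):.3f}, MAEP {np.mean(maep):.3f}")
```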
Landscape Pattern Evolution and Optimization Strategy in Wuhan Urban Development Zone, China
Abstract:
With the rapid urbanization process in China, environmental protection is under severe pressure, so analyzing and optimizing the landscape pattern is an important measure to ease the pressure on the ecological environment. This paper takes the Wuhan Urban Development Zone as the research object and studies its landscape pattern evolution and quantitative optimization strategy. First, remote sensing image data from 1990 to 2015 were interpreted using Erdas software. Next, the landscape pattern indices at the landscape, class, and patch levels were studied based on Fragstats. Then, five ecological environment indicators based on the National Environmental Protection Standard of China were selected to evaluate the impact of landscape pattern evolution on the ecological environment. In addition, the cost-distance analysis of ArcGIS was applied to simulate wildlife migration, thus indirectly measuring the improvement in ecological environment quality. The results show that the area of construction land increased by 491%, while bare land, sparse grassland, forest, farmland, and water decreased by 82%, 47%, 36%, 25%, and 11%, respectively; they were mainly converted into construction land. At the landscape level, the landscape indices all showed a downward trend: the number of patches (NP), landscape shape index (LSI), connection index (CONNECT), Shannon's diversity index (SHDI), and aggregation index (AI) decreased by 2778, 25.7, 0.042, 0.6, and 29.2%, respectively, indicating that the NP, the degree of aggregation, and the landscape connectivity declined. At the class level, CPLAND, TCA, AI, and LSI of construction land and forest increased, but the distribution statistics of core area (CORE_AM) decreased; for farmland, water, sparse grassland, and bare land, CPLAND, TCA, DIVISION, patch density (PD), and LSI decreased, yet patch fragmentation and CORE_AM increased. At the patch level, the patch area, patch perimeter, and shape index of water, farmland, and bare land continued to decline; the three indices of forest patches increased overall, those of sparse grassland decreased as a whole, and those of construction land increased. It is obvious that urbanization greatly influenced the landscape evolution: the ecological diversity and landscape heterogeneity of ecological patches clearly dropped, and the Habitat Quality Index continuously declined by 14%. Therefore, an optimization strategy based on greenway network planning is raised for discussion. This paper contributes to the study of landscape pattern evolution in planning and design and to research on the spatial layout of urbanization.
Keywords: Landscape pattern, optimization strategy, ArcGIS, Erdas, landscape metrics, landscape architecture.
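Landscape metrics were computed with Fragstats in the study; purely for illustration, the sketch below computes two of the indices mentioned above, the number of patches (NP) per class and Shannon's diversity index (SHDI), on a tiny synthetic land-cover raster with 4-connectivity.

```python
# Rough sketch (tiny synthetic raster, 4-connectivity): number of patches (NP)
# per class and Shannon's diversity index (SHDI), computed outside Fragstats.
import numpy as np
from scipy import ndimage

landcover = np.array([[1, 1, 2, 2],
                      [1, 3, 3, 2],
                      [4, 4, 3, 2],
                      [4, 4, 4, 2]])   # class codes, e.g. 1=water, 2=forest...

total_np = 0
for cls in np.unique(landcover):
    _, n_patches = ndimage.label(landcover == cls)
    total_np += n_patches
    print(f"class {cls}: {n_patches} patch(es)")

proportions = np.bincount(landcover.ravel())[1:] / landcover.size
p = proportions[proportions > 0]
shdi = -np.sum(p * np.log(p))          # SHDI = -sum(p_i * ln p_i)
print(f"NP = {total_np}, SHDI = {shdi:.3f}")
```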
Evaluation of the Role of Advocacy and the Quality of Care in Reducing Health Inequalities for People with Autism, Intellectual and Developmental Disabilities at Sheffield Teaching Hospitals
Authors: Jonathan Sahu, Jill Aylott
Abstract:
Individuals with Autism, Intellectual and Developmental disabilities (AIDD) are one of the most vulnerable groups in society, hampered not only by their own limitations in understanding and interacting with the wider society, but also by societal limitations in perception and understanding. Communication to express their needs and wishes is fundamental to enable such individuals to live and prosper in society. This research project was designed as an organisational case study, in a large secondary health care hospital within the National Health Service (NHS), to assess the quality of care provided to people with AIDD and to review the role of advocacy in reducing health inequalities for these individuals. Methods: The research methodology adopted was that of an “insider researcher”. Data collection included both quantitative and qualitative data, i.e., a mixed-methods approach. A semi-structured interview schedule was designed and used to obtain qualitative and quantitative primary data from a wide range of interdisciplinary frontline health care workers, to assess their understanding and awareness of the systems, processes and evidence-based practice used to offer a quality service to people with AIDD. Secondary data were obtained from sources within the organisation, in keeping with “case study” as a primary method, and organisational performance data were then compared against national benchmarking standards. Further data sources were accessed to help evaluate the effectiveness of the different types of advocacy present in the organisation, gauged by measures of user and carer experience in the form of retrospective survey analysis, incidents and complaints. Results: Secondary data demonstrate near compliance of the organisation with the current national benchmarking standard (Monitor Compliance Framework). However, primary data demonstrate poor knowledge of the Mental Capacity Act 2005 and poor knowledge of the organisational systems, processes and evidence-based practice applied for people with AIDD. In addition, frontline health care workers had poor knowledge and awareness of advocacy and advocacy schemes for this group. Conclusions: A significant amount of work needs to be undertaken to improve the quality of care delivered to individuals with AIDD. An operational strategy promoting the widespread dissemination of information may not be the best approach to delivering quality care, optimal patient experience and patient advocacy. In addition, a more robust set of standards, with appropriate metrics, needs to be developed to assess organisational performance in a way that will stand the test of professional and public scrutiny.
Keywords: Autism, intellectual developmental disabilities, advocacy, health inequalities, quality of care.
Using Statistical Significance and Prediction to Test Long/Short Term Public Services and Patients Cohorts: A Case Study in Scotland
Authors: Sotirios Raptis
Abstract:
Health and Social care (HSc) services planning and scheduling are facing unprecedented challenges due to pandemic pressure, and also suffer from unplanned spending that is negatively impacted by the global financial crisis. Data-driven approaches can help to improve policies and to plan and design service provision schedules using algorithms that assist healthcare managers in facing unexpected demands with fewer resources. The paper discusses service packing using statistical significance tests and machine learning (ML) to evaluate demand similarity and coupling. This is achieved by predicting the range of the demand (class) using ML methods such as Classification and Regression Trees (CART), Random Forests (RF), and Logistic Regression (LGR). The Chi-squared and Student’s significance tests are used on data over a 39-year span for which records exist for services delivered in Scotland. The demands are associated using probabilities and form parts of statistical hypotheses. These hypotheses, in their null form, assume that the target demand is statistically dependent on other services’ demands, and this linking is checked against the data. In addition, ML methods are used to linearly predict the target demands from the statistically found associations and to extend the linear dependence of the target demand to independent demands, thus forming groups of services. The statistical tests confirmed the ML coupling, made the prediction statistically meaningful, and proved that a target service can be matched reliably to other services, while ML showed that such marked relationships can also be linear ones. Zero padding was used for missing yearly records and better illustrated such relationships, both for limited years and for the entire span, offering long-term data visualizations; limited-year periods explained how well patient numbers can be related over short periods of time, or how they can change over time, as opposed to behaviours across more years. The prediction performance of the associations was measured using metrics such as the Receiver Operating Characteristic (ROC), Area Under the Curve (AUC) and Accuracy (ACC), as well as the Chi-squared and Student’s statistical tests. Co-plots and comparison tables for the RF, CART, and LGR methods, the p-values from the tests, and Information Exchange (IE/MIE) measures are provided, showing the relative performance of the ML methods and of the statistical tests, as well as the behaviour at different learning ratios. The impact of initial groupings by k-nearest-neighbours classification (k-NN), Cross-Correlation (CC) and C-Means (CM) was also studied over limited years and over the entire span. It was found that CART was generally behind RF and LGR, but in some interesting cases LGR reached an AUC = 0, falling below CART, while the ACC was as high as 0.912, showing that ML methods can be confused by zero padding, by irregularities in the data, or by outliers. On average, three linear predictors were sufficient; LGR was found to compete well with RF, and CART followed with the same performance at higher learning ratios. Services were packed only when the significance level (p-value) of their association coefficient was more than 0.05. Social-factor relationships were observed between home care services and the treatment of old people, low birth weights, alcoholism, drug abuse, and emergency admissions.
The work found that different HSc services can be well packed as plans of limited duration, across various service sectors and learning configurations, as confirmed by using statistical hypotheses.
Keywords: Class, cohorts, data frames, grouping, prediction, probabilities, services.
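A minimal sketch of the coupling idea described above, on random data rather than the Scottish HSc records: a Chi-squared test links a target service's demand class to another service's binned demand, and logistic regression then predicts the class, scored with ROC AUC and accuracy.

```python
# Hedged sketch (random counts, not HSc data): Chi-squared association between
# demands, plus logistic regression prediction scored with ROC AUC and ACC.
import numpy as np
from scipy.stats import chi2_contingency
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, accuracy_score

rng = np.random.default_rng(0)
other_demands = rng.normal(size=(300, 3))                      # other services
target_class = (other_demands[:, 0] + 0.8 * other_demands[:, 1]
                + rng.normal(0, 0.5, 300) > 0).astype(int)     # high/low demand

# Significance test on a contingency table of binned demands.
binned = (other_demands[:, 0] > 0).astype(int)
table = np.array([[np.sum((binned == i) & (target_class == j)) for j in (0, 1)]
                  for i in (0, 1)])
chi2, p, _, _ = chi2_contingency(table)

# ML coupling: predict the target class from the other services' demands.
model = LogisticRegression().fit(other_demands, target_class)
proba = model.predict_proba(other_demands)[:, 1]
print(f"chi2 p = {p:.3g}, AUC = {roc_auc_score(target_class, proba):.3f}, "
      f"ACC = {accuracy_score(target_class, proba > 0.5):.3f}")
```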
Auto Rickshaw Impacts with Pedestrians: A Computational Analysis of Post-Collision Kinematics and Injury Mechanics
Authors: A. J. Al-Graitti, G. A. Khalid, P. Berthelson, A. Mason-Jones, R. Prabhu, M. D. Jones
Abstract:
Motor vehicle related pedestrian road traffic collisions are a major road safety challenge, since they are a leading cause of death and serious injury worldwide, contributing to a third of the global disease burden. The auto rickshaw, which is a common form of urban transport in many developing countries, plays a major transport role, both as a vehicle for hire and for private use. The most common auto rickshaws are quite unlike ‘typical’ four-wheel motor vehicles, being typically characterised by three wheels, a non-tilting sheet-metal body or open frame construction, a canvas roof and side curtains, a small drivers’ cabin, handlebar controls and a passenger space at the rear. Given the propensity, in developing countries, for auto rickshaws to be used in mixed cityscapes, where pedestrians and vehicles share the roadway, the potential for auto rickshaw impacts with pedestrians is relatively high. Whilst auto rickshaws are used in some Western countries, their limited number and spatial separation from pedestrian walkways, as a result of city planning, have not resulted in significant accident statistics. Thus, auto rickshaws have not been subject to the vehicle impact related pedestrian crash kinematic analyses and/or injury mechanics assessments typically associated with motor vehicle development in Western Europe, North America and Japan. This study presents a parametric analysis of auto rickshaw related pedestrian impacts by computational simulation, using a Finite Element model of an auto rickshaw and an LS-DYNA 50th percentile male Hybrid III Anthropometric Test Device (dummy). Parametric variables include auto rickshaw impact velocity, auto rickshaw impact region (front, centre or offset) and relative pedestrian impact position (front, side and rear). The output data of each impact simulation were correlated against reported injury metrics: the Head Injury Criterion (front, side and rear), the Neck Injury Criterion (front, side and rear), the Abbreviated Injury Scale and the reported risk level. This adds greater understanding to the issue of auto rickshaw related pedestrian injury risk. The parametric analyses suggest that pedestrians are subject to a relatively high risk of injury during impacts with an auto rickshaw at velocities of 20 km/h or greater, which, in some of the impact simulations, may even risk fatalities. The present study provides valuable evidence for informing a series of recommendations and guidelines for making the auto rickshaw safer during collisions with pedestrians. Whilst it is acknowledged that the present research findings are based in the field of safety engineering and may over-represent injury risk compared to “Real World” accidents, many of the simulated interactions produced injury response values significantly greater than current threshold curves and thus justify their inclusion in the study. To reduce the injury risk level and increase the safety of the auto rickshaw, the velocity of the auto rickshaw should be reduced and/or engineering solutions considered, such as retrofitting injury mitigation technologies to those auto rickshaw contact regions which present the greatest risk of producing pedestrian injury.
Keywords: Auto Rickshaw, finite element analysis, injury risk level, LS-DYNA, pedestrian impact.
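For illustration, the sketch below computes one of the injury metrics referenced above, the Head Injury Criterion (HIC15), from a synthetic resultant head acceleration pulse rather than from LS-DYNA output.

```python
# Illustrative sketch (synthetic acceleration pulse, not LS-DYNA output):
# Head Injury Criterion over a 15 ms window,
# HIC = max over (t1, t2) of (t2 - t1) * [ (1/(t2 - t1)) * integral of a dt ]^2.5
import numpy as np

dt = 0.0001                                         # 0.1 ms sampling
t = np.arange(0, 0.05, dt)                          # 50 ms window
accel_g = 80 * np.exp(-((t - 0.02) / 0.005) ** 2)   # synthetic pulse in g

def hic(acc, dt, window=0.015):
    cum = np.concatenate(([0.0], np.cumsum(acc) * dt))  # running integral of a(t)
    n, max_w = len(acc), int(round(window / dt))
    best = 0.0
    for i in range(n):
        for j in range(i + 1, min(i + max_w, n) + 1):
            duration = (j - i) * dt
            avg = (cum[j] - cum[i]) / duration
            best = max(best, duration * avg ** 2.5)
    return best

print(f"HIC15 = {hic(accel_g, dt):.1f}")
```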