Search results for: missing data estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26522

25292 A Small-Scale Flexible Test Bench for the Investigation of Fertigation Strategies in Soilless Culture

Authors: Giacomo Barbieri

Abstract:

In soilless culture, the management of the nutrient solution is the most important aspect of crop growing. Fertigation dose, frequency and nutrient concentration must be planned with the objective of reaching optimal crop growth while limiting the resources used and the associated costs. The definition of efficient fertigation strategies is a complex problem, since fertigation requirements vary on the basis of different factors and crops are sensitive to small variations in fertigation parameters. To the best of the author's knowledge, a small-scale test bench that is flexible with respect to both nutrient solution preparation and precise irrigation is currently missing, limiting investigations to standard practices for soilless culture. Starting from an analysis of the state of the art, this paper proposes a small-scale system that is potentially able to test different fertigation strategies concurrently. The system will be designed and implemented throughout a three-year project started in August 2018. However, given the importance of the topic within current challenges such as food security and climate change, this work is disseminated in the hope that it may inspire other universities and organizations.

Keywords: soilless culture, fertigation, test bench, small-scale, automation

Procedia PDF Downloads 177
25291 Cessna Citation X Business Aircraft Stability Analysis Using Linear Fractional Representation LFRs Model

Authors: Yamina Boughari, Ruxandra Mihaela Botez, Florian Theel, Georges Ghazi

Abstract:

Clearance of the flight control laws of a civil aircraft is a long and expensive process in the aerospace industry. Thousands of flight combinations in terms of speeds, altitudes, gross weights, centers of gravity and angles of attack have to be investigated and proved to be safe. Nonetheless, with this method a worst-case flight condition can easily be missed, and missing it could lead to a critical situation. Indeed, it would be impossible to analyze a model for the infinite number of cases contained within its flight envelope, which would require more time and therefore more design cost. Therefore, in industry, the technique of meshing the flight envelope is commonly used: for each point of the flight envelope, simulation of the associated model determines whether or not the specifications are satisfied. In order to perform fast, comprehensive and effective analysis, varying-parameter models were developed by incorporating variations, or uncertainties, into the nominal models, known as Linear Fractional Representation (LFR) models; these LFR models are able to describe the aircraft dynamics by taking uncertainties over the flight envelope into account. In this paper, the LFR models are developed using the speeds and altitudes as varying parameters; they were built from several flight conditions expressed in terms of speeds and altitudes. The use of such a method has gained great interest from aeronautical companies, which see a promising future for it in modeling, and particularly in the design and certification of control laws. In this research paper, we focus on the Cessna Citation X open-loop stability analysis. The data were provided by a Level D Research Aircraft Flight Simulator, which corresponds to the highest flight dynamics certification level; this simulator was developed by CAE Inc., and its development was based on the research requirements of the LARCASE laboratory. These data were used to develop a linear model of the airplane in its longitudinal and lateral motions, and further to create the LFR models for 12 XCG/weight conditions, and thus for the whole flight envelope, using a friendly graphical user interface developed during this study. Then, the LFR models are analyzed using an interval analysis method based upon a Lyapunov function, as well as the 'stability and robustness analysis' toolbox. The results are presented in the form of graphs, which offer good readability and are easily exploitable. The weakness of this method lies in a relatively long computation time, equal to about four hours for the entire flight envelope.

Keywords: flight control clearance, LFR, stability analysis, robustness analysis

Procedia PDF Downloads 352
25290 A Conceptual Framework for Vulnerability Assessment of Climate Change Impact on Oil and Gas Critical Infrastructures in the Niger Delta

Authors: Justin A. Udie, Subhes C. Bhatthacharyya, Leticia Ozawa-Meida

Abstract:

The impact of climate change is severe in the Niger Delta, and critical oil and gas infrastructures are vulnerable. This is partly due to the lack of a specific impact assessment framework to assess impact indices on both existing and new infrastructures. The purpose of this paper is to develop a framework for the assessment of climate change impact on critical oil and gas infrastructure in the region. Comparative and documentary methods, as well as analysis of existing frameworks, were used to develop a flexible, integrated and conceptual four-dimensional framework comprising: 1. Scoping - the theoretical identification of inherent climate burdens, review of exposure and adaptive capacities, and delineation of critical infrastructure; 2. Vulnerability assessment - a systematic procedure for the assessment of infrastructure vulnerability, providing real-time re-scoping and the practical requirements for data collection, analysis and review; physical examination of systems is encouraged to complement the scoped data and ascertain the level of exposure to relevant climate risks in the area; 3. New infrastructure - consideration of infrastructures that are still at the development stage, suggesting the inclusion of flexible adaptive capacities in the original design of infrastructures in line with climate threats and projections; 4. Mainstreaming climate impact assessment into the government's environmental decision-making approach. Though this framework is designed specifically for the estimation of the exposure, adaptive capacities and criticality of vulnerable oil and gas infrastructures in the Niger Delta to climate burdens, it is recommended to researchers and experts as a first-hand, generic and practicable tool that can be used for the assessment of other infrastructures perceived as critical and vulnerable. The paper does not provide further tools that tie into the methodological approach but presents pointers upon which a pragmatic methodology can be developed.

Keywords: adaptation, assessment, conceptual, climate, change, framework, vulnerability

Procedia PDF Downloads 318
25289 An Intelligent Traffic Management System Based on the WiFi and Bluetooth Sensing

Authors: Hamed Hossein Afshari, Shahrzad Jalali, Amir Hossein Ghods, Bijan Raahemi

Abstract:

This paper introduces an automated clustering solution that is applied to WiFi/Bluetooth sensing data and later used for traffic management applications. The paper first summarizes a number of clustering approaches and then shows their performance for noise removal. In this context, clustering is used to recognize WiFi and Bluetooth MAC addresses that belong to passengers traveling on a public urban transit bus. The main objective is to build an intelligent system that automatically filters out MAC addresses belonging to persons located outside the bus, for different routes in the city of Ottawa. The proposed intelligent system alleviates the need for defining restrictive thresholds, which would otherwise reduce the accuracy as well as the range of applicability of the solution across routes. The paper moreover discusses the performance of the presented clustering approaches in terms of accuracy, time and space complexity, and ease of use. Note that the clustering results can further be used for origin-destination estimation of individual passengers, predicting the traffic load, and intelligent management of urban bus schedules.
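A minimal sketch of the kind of clustering-based filtering described above, not the authors' implementation: the per-MAC features, thresholds and DBSCAN parameters below are illustrative assumptions only.

```python
# Sketch: cluster MAC-address detection features to separate on-bus passengers
# from devices outside the bus. Feature names and values are hypothetical.
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical per-MAC features: [detection count along the route,
# mean received signal strength (dBm), detection time span (minutes)]
features = np.array([
    [42, -55.0, 35.0],   # likely a passenger: many strong, long-lived detections
    [38, -60.0, 30.0],
    [3,  -85.0, 2.0],    # likely outside the bus: few weak, short detections
    [2,  -90.0, 1.0],
    [40, -58.0, 33.0],
])

# Standardise features so that DBSCAN's distance threshold is meaningful.
scaled = (features - features.mean(axis=0)) / features.std(axis=0)
labels = DBSCAN(eps=1.0, min_samples=2).fit_predict(scaled)

# MACs in the dense "passenger" cluster are kept; noise points (-1) and sparse
# clusters are filtered out as external devices.
for mac_idx, label in enumerate(labels):
    print(f"MAC {mac_idx}: cluster {label}")
```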

Keywords: WiFi-Bluetooth sensing, cluster analysis, artificial intelligence, traffic management

Procedia PDF Downloads 242
25288 Corporate Performance and Balance Sheet Indicators: Evidence from Indian Manufacturing Companies

Authors: Hussain Bohra, Pradyuman Sharma

Abstract:

This study highlights the significance of balance sheet indicators for corporate performance in the case of Indian manufacturing companies. Balance sheet indicators show the actual financial health of a company; they help external investors to choose the right company for their investment and help external financing agencies to extend finance to manufacturing companies. The period of study is 2000 to 2014, covering 813 manufacturing companies for which continuous data are available throughout the study period. The data are collected from the PROWESS database maintained by the Centre for Monitoring Indian Economy Pvt. Ltd. Panel data methods, namely fixed effects and random effects, are used for the analysis. The likelihood ratio test, Lagrange multiplier test and Hausman test results prove the suitability of the fixed effect model for the estimation. Return on assets (ROA) is used as the proxy to measure corporate performance; it is a suitable proxy as it is already used by most authors who have worked on corporate performance, and it reflects the return on firms' long-term investment projects. Different ratios, such as the current ratio, debt-equity ratio, receivable turnover ratio and solvency ratio, are used as proxies for the balance sheet indicators, with other firm-specific variables such as firm size and sales as control variables in the model. From the empirical analysis, it was found that all selected financial ratios have a significant and positive impact on corporate performance. Firm sales and firm size also have a significant and positive impact on corporate performance. To check the robustness of the results, the sample was divided on the basis of the different ratios: firms with high versus low debt-equity ratios, firms with high versus low current ratios, firms with high versus low receivable turnover, and firms with high versus low solvency ratios. We find that the results are robust across all types of companies with different levels of the selected balance sheet indicator ratios, and the results for the other variables are in line with those for the whole sample. These findings confirm that balance sheet indicators play a significant role in corporate performance in India. The findings of this study have implications for corporate managers, who should monitor the different ratios to maintain the minimum expected level of performance. Apart from that, they should also maintain adequate sales and total assets to improve corporate performance.
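A hedged sketch of the fixed-effect ("within") estimation used in studies of this kind, run on a simulated toy panel rather than the PROWESS data; the variable names and coefficients are illustrative assumptions.

```python
# Within (fixed-effect) estimator: demean each variable by firm, then run OLS.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_firms, n_years = 50, 15
panel = pd.DataFrame({
    "firm": np.repeat(np.arange(n_firms), n_years),
    "current_ratio": rng.normal(1.5, 0.3, n_firms * n_years),
    "debt_equity": rng.normal(1.0, 0.4, n_firms * n_years),
    "log_sales": rng.normal(6.0, 1.0, n_firms * n_years),
})
firm_effect = rng.normal(0, 0.05, n_firms)[panel["firm"]]
panel["roa"] = (0.03 * panel["current_ratio"] - 0.02 * panel["debt_equity"]
                + 0.01 * panel["log_sales"] + firm_effect
                + rng.normal(0, 0.02, len(panel)))

# The within transformation sweeps out time-invariant firm effects.
cols = ["roa", "current_ratio", "debt_equity", "log_sales"]
demeaned = panel[cols] - panel.groupby("firm")[cols].transform("mean")
y = demeaned["roa"].to_numpy()
X = demeaned[["current_ratio", "debt_equity", "log_sales"]].to_numpy()
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["current_ratio", "debt_equity", "log_sales"], beta.round(4))))
```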

Keywords: balance sheet, corporate performance, current ratio, panel data method

Procedia PDF Downloads 267
25287 A Comparison of Methods for Estimating Dichotomous Treatment Effects: A Simulation Study

Authors: Jacqueline Y. Thompson, Sam Watson, Lee Middleton, Karla Hemming

Abstract:

Introduction: The odds ratio (estimated via logistic regression) is a well-established and common approach for estimating covariate-adjusted binary treatment effects when comparing a treatment and a control group with dichotomous outcomes. Its popularity is primarily due to its stability and robustness to model misspecification. However, the situation is different for the relative risk and the risk difference, which are arguably easier to interpret and better suited to specific designs such as non-inferiority studies. So far, there is no equivalent, widely accepted approach to estimate an adjusted relative risk or risk difference when conducting clinical trials. This is partly due to the lack of a comprehensive evaluation of the available candidate methods. Methods/Approach: A simulation study is designed to evaluate the performance of relevant candidate methods for estimating relative risks, representing conditional and marginal estimation approaches. We consider the log-binomial generalised linear model (GLM) with iteratively weighted least squares (IWLS) and model-based standard errors (SEs); the log-binomial GLM with convex optimisation and model-based SEs; the log-binomial GLM with convex optimisation and permutation tests; the modified-Poisson GLM with IWLS and robust SEs; log-binomial generalised estimating equations (GEE) with robust SEs; marginal standardisation with delta-method SEs; and marginal standardisation with permutation-test SEs. Independent and identically distributed datasets are simulated from a randomised controlled trial to evaluate these candidate methods. Simulations are replicated 10,000 times for each scenario across all possible combinations of sample sizes (200, 1,000, and 5,000), outcome rates (10%, 50%, and 80%), and covariate effects (ranging from -0.05 to 0.7), representing weak, moderate or strong relationships. Treatment effects (0, -0.5, and 1 on the log scale) cover null (H0) and alternative (H1) hypotheses to evaluate coverage and power in realistic scenarios. Performance measures (bias, mean square error (MSE), relative efficiency, and convergence rates) are evaluated across scenarios covering a range of sample sizes, event rates, covariate prognostic strengths, and model misspecifications. Potential Results, Relevance & Impact: There are several methods for estimating unadjusted and adjusted relative risks. However, it is unclear which method(s) is the most efficient, preserves the type-I error rate, is robust to model misspecification, or is the most powerful when adjusting for non-prognostic and prognostic covariates. GEE estimates may be biased when the outcome distributions are not marginal binary data. Also, it appears that marginal standardisation and convex optimisation may perform better than the GLM IWLS log-binomial approach.
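As a small illustration of one of the candidate methods listed above, the sketch below fits a modified-Poisson model of the relative risk with robust (sandwich) standard errors on simulated trial data; the data-generating values are assumptions, not the study's scenarios.

```python
# Modified-Poisson relative risk estimation with robust SEs (a sketch).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1000
treatment = rng.integers(0, 2, n)
covariate = rng.normal(0, 1, n)
# True log relative risk of 0.5 for treatment on a binary outcome.
p = np.clip(0.2 * np.exp(0.5 * treatment + 0.3 * covariate), 0, 1)
outcome = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([treatment, covariate]))
# Poisson working model on a binary outcome; robust SEs correct the variance.
fit = sm.GLM(outcome, X, family=sm.families.Poisson()).fit(cov_type="HC1")
print("estimated log RR (treatment):", fit.params[1].round(3))
print("robust SE:", fit.bse[1].round(3))
```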

Keywords: binary outcomes, statistical methods, clinical trials, simulation study

Procedia PDF Downloads 115
25286 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications

Authors: H. Hruschka

Abstract:

This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables. Note that variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine which then generates second-layer hidden variables. Finally, in the third layer hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves. One half is used for estimation, the other serves as holdout data. Each model is evaluated by the log likelihood for the holdout data. Performance of the topic models is disappointing as the holdout log likelihood of the correlated topic model – which is better than Dirichlet allocation - is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net whose holdout log likelihoods are higher by more than 23,000. Overall, the deep belief net performs best. We also interpret hidden variables discovered by binary factor analysis, the restricted Boltzmann machine and the deep belief net. Hidden variables characterized by the product categories to which they are related differ strongly between these three models. To derive managerial implications we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all the other categories. The investigated models lead to very different implications as they disagree about which categories are associated with higher basket size increases due to a promotion. Of course, recommendations based on better performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing research by appropriate extensions. To include predictors, especially marketing variables such as price, seems to be an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of purchases of product categories.
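A rough sketch of the first-layer model described above, fitted with scikit-learn on synthetic binary baskets rather than the grocery data; note that sklearn's BernoulliRBM reports a pseudo-log-likelihood, not the exact holdout log likelihood used in the paper.

```python
# Restricted Boltzmann machine on binary basket data (illustrative only).
import numpy as np
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(1)
n_baskets, n_categories = 2000, 169        # mirrors the grocery data dimensions
baskets = (rng.random((n_baskets, n_categories)) < 0.05).astype(float)

train, holdout = baskets[:1000], baskets[1000:]
rbm = BernoulliRBM(n_components=10, learning_rate=0.05, n_iter=20, random_state=0)
rbm.fit(train)

# The weights (rbm.components_) show which product categories each hidden
# variable is related to, supporting the interpretation step in the paper.
print("mean holdout pseudo-log-likelihood:", rbm.score_samples(holdout).mean())
print("top categories for hidden unit 0:", np.argsort(rbm.components_[0])[-5:])
```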

Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models

Procedia PDF Downloads 202
25285 Cryptographic Protocol for Secure Cloud Storage

Authors: Luvisa Kusuma, Panji Yudha Prakasa

Abstract:

Cloud storage, as a sub-service of infrastructure as a service (IaaS) in cloud computing, is a model of networked storage where data can be stored on servers. In this paper, we propose a secure cloud storage system consisting of two main components: the client, a user of the cloud storage service, and the server, which provides the cloud storage service. In this system, we propose protocol schemes to guard against security attacks on the data transmission. The protocols are a login protocol, an upload protocol, a download protocol, and a push data protocol, which implement a hybrid cryptographic mechanism based on encrypting data before they are sent to the cloud, so the cloud storage provider does not know the users' data and cannot analyse them, because there is no correspondence between data and user.
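A minimal sketch, under our own assumptions rather than the authors' protocol, of client-side hybrid encryption before upload: the file is encrypted with a fresh AES-GCM key and that key is wrapped with the owner's RSA public key, so the storage provider never sees plaintext.

```python
# Hybrid encryption sketch for the upload/download path (illustrative only).
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Key pair held by the data owner (generated once, stored safely in practice).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

def encrypt_for_upload(plaintext: bytes):
    data_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)
    wrapped_key = public_key.encrypt(
        data_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    # Only these three values go to the cloud; none reveals the plaintext.
    return wrapped_key, nonce, ciphertext

def decrypt_after_download(wrapped_key, nonce, ciphertext):
    data_key = private_key.decrypt(
        wrapped_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    return AESGCM(data_key).decrypt(nonce, ciphertext, None)

blob = encrypt_for_upload(b"user document")
assert decrypt_after_download(*blob) == b"user document"
```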

Keywords: cloud storage, security, cryptographic protocol, artificial intelligence

Procedia PDF Downloads 358
25284 Decentralized Data Marketplace Framework Using Blockchain-Based Smart Contract

Authors: Meshari Aljohani, Stephan Olariu, Ravi Mukkamala

Abstract:

Data is essential for enhancing the quality of life, and its value creates opportunities for users to profit from data sales and purchases. Users in data marketplaces, however, must share and trade data in a secure and trusted environment while maintaining their privacy. The first main contribution of this paper is to identify enabling technologies and challenges facing the development of decentralized data marketplaces. The second main contribution is to propose a decentralized data marketplace framework based on blockchain technology. The proposed framework enables sellers and buyers to transact with more confidence. Using a security deposit, the system implements a unique approach for enforcing honesty in data exchange among anonymous individuals. The system also imposes a time frame before a transaction is considered complete; within it, users can submit disputes to the arbitrators, who review them and respond with their decision. Use cases are presented to demonstrate how these technologies help data marketplaces handle issues and challenges.
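A plain-Python sketch of the deposit-and-dispute flow described above; it only models the protocol states for illustration and is neither a smart contract nor the authors' implementation, and all amounts and window lengths are made up.

```python
# Escrow-style transaction with security deposits and a dispute window (sketch).
from dataclasses import dataclass, field
import time

@dataclass
class DataTransaction:
    price: float
    seller_deposit: float
    buyer_deposit: float
    dispute_window_s: float = 3600.0
    created_at: float = field(default_factory=time.time)
    state: str = "pending"                 # pending -> disputed -> settled

    def raise_dispute(self):
        """A party can open a dispute only inside the time frame."""
        if self.state == "pending" and time.time() - self.created_at < self.dispute_window_s:
            self.state = "disputed"
            return True
        return False

    def settle(self, arbitrators_favour_buyer=None):
        """Release funds; the dishonest party forfeits its security deposit."""
        if self.state == "pending" and time.time() - self.created_at >= self.dispute_window_s:
            self.state = "settled"         # no dispute: pay seller, refund deposits
            return {"seller": self.price + self.seller_deposit, "buyer": self.buyer_deposit}
        if self.state == "disputed" and arbitrators_favour_buyer is not None:
            self.state = "settled"
            pot = self.price + self.seller_deposit + self.buyer_deposit
            return ({"buyer": pot, "seller": 0.0} if arbitrators_favour_buyer
                    else {"seller": pot, "buyer": 0.0})
        return None

tx = DataTransaction(price=10.0, seller_deposit=2.0, buyer_deposit=2.0, dispute_window_s=0.0)
print(tx.settle())   # window already elapsed: seller paid, deposits returned
```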

Keywords: blockchain, data, data marketplace, smart contract, reputation system

Procedia PDF Downloads 159
25283 Power Quality Modeling Using Recognition Learning Methods for Waveform Disturbances

Authors: Sang-Keun Moon, Hong-Rok Lim, Jin-O Kim

Abstract:

This paper presents power quality (PQ) modeling and filtering processes for distribution system disturbances using recognition learning methods. Typical PQ waveforms with mathematical descriptions, together with gathered field data, are applied to the proposed models. The objective of this paper is to analyze PQ data with respect to monitoring, discriminating, and evaluating the waveforms of power disturbances, in order to support preventive protection against system failures and the estimation of complex system problems. The examined signal filtering techniques are used for field waveform noise removal and feature extraction. Using extraction and learning classification techniques, the efficiency of recognizing PQ disturbances was verified, with a focus on interactive modeling methods. The waveforms of eight selected disturbances are modeled with parameters randomized within the IEEE 1159 PQ ranges; the ranges, parameters, and weights are updated with respect to the field waveforms obtained. Current waveforms are processed in the same way as voltages to obtain waveform features, apart from some ratings and filters. Changing loads cause distortion in the voltage waveform by drawing different patterns of current variation. In conclusion, PQ disturbances in voltage and current waveforms exhibit different patterns of variation, and a modified technique based on symmetrical components in the time domain is proposed in this paper for PQ disturbance detection and subsequent classification. Our method is based on the fact that waveforms obtained from the suggested trigger conditions contain potential information for abnormality detection. The extracted features are sequentially applied to estimation and recognition learning modules for further studies.
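An illustrative sketch, not the paper's method, of the kind of trigger condition discussed above: a voltage sag waveform is generated in the spirit of the IEEE 1159 ranges and a simple per-cycle RMS feature flags the disturbed cycles. Signal parameters and thresholds are assumptions.

```python
# Per-cycle RMS feature extraction and sag/swell trigger (illustrative only).
import numpy as np

fs, f0, cycles = 3200, 50.0, 10           # samples/s, fundamental, number of cycles
t = np.arange(int(fs * cycles / f0)) / fs
v = np.sqrt(2) * 230.0 * np.sin(2 * np.pi * f0 * t)

# Apply a 40% sag between cycles 4 and 7 (parameters are illustrative).
sag = (t >= 4 / f0) & (t < 7 / f0)
v[sag] *= 0.6

samples_per_cycle = int(fs / f0)
rms = np.array([
    np.sqrt(np.mean(v[i:i + samples_per_cycle] ** 2))
    for i in range(0, len(v) - samples_per_cycle + 1, samples_per_cycle)
])

# Trigger condition: per-unit RMS outside the 0.9-1.1 band flags a sag/swell.
pu = rms / 230.0
print("per-cycle RMS (p.u.):", pu.round(2))
print("disturbance cycles:", np.where((pu < 0.9) | (pu > 1.1))[0])
```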

Keywords: power quality recognition, PQ modeling, waveform feature extraction, disturbance trigger condition, PQ signal filtering

Procedia PDF Downloads 188
25282 Deterministic and Stochastic Modeling of a Micro-Grid Management for Optimal Power Self-Consumption

Authors: D. Calogine, O. Chau, S. Dotti, O. Ramiarinjanahary, P. Rasoavonjy, F. Tovondahiniriko

Abstract:

Mafate is a natural cirque in the north-western part of Reunion Island, without an electrical grid or road network. A micro-grid concept is being experimented with in this area, composed of photovoltaic production combined with electrochemical batteries, in order to meet the local population's electricity demands through self-consumption. This work develops a discrete model as well as a stochastic model in order to reach an optimal equilibrium between production and consumption for a cluster of houses. The management of the energy flows leads to a large linear programming system, where the time interval of interest is 24 hours. The experimental data are the solar production, the stored energy, and the parameters of the different electrical devices and batteries. The unknown variables to evaluate are the consumption of the various electrical services, the energy drawn from and stored in the batteries, and the inhabitants' planning wishes. The objective is to fit the solar production to the electrical consumption of the inhabitants, with an optimal use of the energy in the batteries, while satisfying the users' planning requirements as widely as possible. In the discrete model, the parameters and solutions of the linear programming system are deterministic scalars, whereas in the stochastic approach the data parameters and the linear programming solutions become random variables, whose distributions can be imposed or established by estimation from samples of real observations or from samples of optimal discrete equilibrium solutions.
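A simplified sketch of a deterministic 24-hour formulation of this kind, solved as a linear program; the solar and demand profiles, battery limits and objective below are illustrative assumptions, not the project's model.

```python
# Toy 24-hour PV/battery schedule: minimise unmet demand subject to the
# energy balance and battery dynamics (deterministic LP sketch).
import numpy as np
from scipy.optimize import linprog

T = 24
solar = np.clip(3.0 * np.sin(np.pi * (np.arange(T) - 6) / 12), 0, None)  # kWh
demand = np.full(T, 1.2); demand[18:23] += 1.0                            # kWh
cap, p_max, s0 = 10.0, 2.0, 5.0   # battery capacity, hourly power limit, initial SOC

# Variables: charge c, discharge d, unmet u, spilled solar x (each length T),
# plus state of charge s (length T).
KIND = {"c": 0, "d": 1, "u": 2, "x": 3, "s": 4}
def idx(kind, t):
    return KIND[kind] * T + t

nv = 5 * T
c_obj = np.zeros(nv)
c_obj[[idx("u", t) for t in range(T)]] = 1.0       # minimise total unmet demand

A_eq, b_eq = [], []
for t in range(T):
    # Energy balance: solar + discharge - charge - spill + unmet = demand
    row = np.zeros(nv)
    row[idx("d", t)] = 1; row[idx("c", t)] = -1
    row[idx("x", t)] = -1; row[idx("u", t)] = 1
    A_eq.append(row); b_eq.append(demand[t] - solar[t])
    # Battery dynamics: s[t] = s[t-1] + charge - discharge (s[-1] = s0)
    row = np.zeros(nv)
    row[idx("s", t)] = 1; row[idx("c", t)] = -1; row[idx("d", t)] = 1
    if t > 0:
        row[idx("s", t - 1)] = -1
        A_eq.append(row); b_eq.append(0.0)
    else:
        A_eq.append(row); b_eq.append(s0)

bounds = ([(0, p_max)] * T + [(0, p_max)] * T + [(0, None)] * T
          + [(0, None)] * T + [(0, cap)] * T)
res = linprog(c_obj, A_eq=np.array(A_eq), b_eq=np.array(b_eq), bounds=bounds)
print("total unmet demand (kWh):", round(res.fun, 2))
```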

Keywords: photovoltaic production, power consumption, battery storage resources, random variables, stochastic modeling, estimations of probability distributions, mixed integer linear programming, smart micro-grid, self-consumption of electricity.

Procedia PDF Downloads 110
25281 Provision of the Amenities Lacking in the Annapur Village Through a Different Government Scheme to Become Self Sustainable Village

Authors: Shalaka Sharad Dixit

Abstract:

Rural development is an important part of India's rural economy and empowerment. Almost 70 percent of the Indian population lives in villages; hence rural development is important for villages to become self-sustainable, and it is the process that aims at the self-reliance of people living in rural areas. Maharashtra is one of the leading states in rural development; hence, different villages in the five regions of Maharashtra, i.e. Konkan, Western Maharashtra, Marathwada, Khandesh, and Vidarbha, were studied further. The study shows the major amenities lacking in these villages. A case study of Annapur village has been carried out. The results show that the villagers face major problems such as unemployment, load shedding, missing education facilities, and the unavailability of a bank and ATM, and that they face many difficulties because of the scarcity of required amenities. Therefore, the aim is to provide the amenities lacking in Annapur village through different government schemes. Government plans devoted to development include the PMGSY, MGNREGA, and GRAM UJALA schemes. The study concludes with a proposal to provide and fulfill the amenities lacking in Annapur village with the help of these government initiatives.

Keywords: self sustainable rural development, government policies, Annapurna village, amenities, smart village

Procedia PDF Downloads 100
25280 Absorbed Dose Estimation of 68Ga-EDTMP in Human Organs

Authors: S. Zolghadri, H. Yousefnia, A. R. Jalilian

Abstract:

Bone metastases are observed in a wide range of cancers, leading to intolerable pain. While early detection can help physicians decide on the type of treatment, various radiopharmaceuticals using phosphonates, such as 68Ga-EDTMP, have been developed. In this work, owing to the importance of the absorbed dose, the human absorbed dose of this new agent was calculated for the first time based on biodistribution data in wild-type rats. 68Ga was obtained from a 68Ge/68Ga generator with radionuclidic and radiochemical purity higher than 99%. The radiolabeled complex was prepared under optimized conditions. Its radiochemical purity was checked by the instant thin layer chromatography (ITLC) method using Whatman No. 2 paper and saline; the results indicated a radiochemical purity higher than 99%. The radiolabeled complex was injected into wild-type rats and its biodistribution was studied for up to 120 min. As expected, the major accumulation was observed in the bone. The absorbed dose for each human organ was calculated from the rat biodistribution using the RADAR method. The bone surface and bone marrow, with 0.112 and 0.053 mSv/MBq, respectively, received the highest absorbed doses. According to these results, the radiolabeled complex is a suitable and safe option for PET bone imaging.
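A schematic example of the dose-calculation step in the RADAR approach described above: the time-activity curve of a source organ is integrated to give the cumulated activity, which is multiplied by a dose factor. All numbers below, including the dose factor, are illustrative placeholders, not the study's data.

```python
# RADAR-style dose estimate: absorbed dose = cumulated activity x dose factor.
import numpy as np

t_min = np.array([0, 30, 60, 120], dtype=float)        # sampling times (min)
bone_activity = np.array([0.10, 0.25, 0.30, 0.28])      # fraction of injected activity

# Cumulated activity (MBq*h per MBq injected) by trapezoidal integration,
# ignoring the contribution beyond the last time point for simplicity.
t_h = t_min / 60.0
cumulated = 0.5 * np.sum((bone_activity[1:] + bone_activity[:-1]) * np.diff(t_h))

dose_factor = 0.4   # hypothetical mGy per MBq*h for the target organ
print(f"cumulated activity: {cumulated:.3f} MBq*h/MBq")
print(f"absorbed dose estimate: {cumulated * dose_factor:.3f} mGy/MBq")
```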

Keywords: absorbed dose, EDTMP, ⁶⁸Ga, rats

Procedia PDF Downloads 195
25279 Real-Time Scheduling and Control of Supply Chain Networks: Challenges and Graph-Based Solution Approach

Authors: Jens Ehm

Abstract:

Manufacturing in supply chains requires an efficient organisation of production and transport processes in order to guarantee that all partners within the chain are supplied with the material needed for the reliable fulfilment of their tasks. If one partner is not able to supply products for a certain period, these products may be missing as the working material the customer needs to perform the next manufacturing step, and potentially as supply for further manufacturing steps. In this way, local disruptions can influence the whole supply chain. In order to avoid material shortages, an efficient scheduling of tasks is necessary. However, the occurrence of unexpected disruptions cannot be eliminated, so a modification of the schedule should be arranged as quickly as possible. This paper discusses the challenges for the implementation of real-time scheduling and control methods and presents a graph-based approach that enables the integrated scheduling of production and transport processes for multiple supply chain partners and offers the potential for quick adaptations of parts of the initial schedule.
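A toy sketch of the graph idea: production and transport tasks of several partners are nodes of a precedence graph, and earliest start times are propagated along the edges; after a disruption only the affected part of the graph needs re-propagation. Task names and durations are hypothetical, and this is not the paper's algorithm.

```python
# Earliest-start scheduling over a production/transport precedence DAG (sketch).
import networkx as nx

g = nx.DiGraph()
tasks = {                                # task -> duration (hours)
    "A_produce_part": 4, "transport_A_to_B": 2,
    "B_assemble": 3, "transport_B_to_C": 2, "C_final_step": 1,
}
g.add_nodes_from(tasks)
g.add_edges_from([
    ("A_produce_part", "transport_A_to_B"),
    ("transport_A_to_B", "B_assemble"),
    ("B_assemble", "transport_B_to_C"),
    ("transport_B_to_C", "C_final_step"),
])

earliest = {}
for task in nx.topological_sort(g):
    earliest[task] = max((earliest[p] + tasks[p] for p in g.predecessors(task)),
                         default=0)
print(earliest)
```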

Keywords: production, logistics, integrated scheduling, real-time scheduling

Procedia PDF Downloads 375
25278 On Stochastic Models for Fine-Scale Rainfall Based on Doubly Stochastic Poisson Processes

Authors: Nadarajah I. Ramesh

Abstract:

Much of the research on stochastic point process models for rainfall has focused on Poisson cluster models constructed from either the Neyman-Scott or Bartlett-Lewis processes. The doubly stochastic Poisson process provides a rich class of point process models, especially for fine-scale rainfall modelling. This paper provides an account of recent development on this topic and presents the results based on some of the fine-scale rainfall models constructed from this class of stochastic point processes. Amongst the literature on stochastic models for rainfall, greater emphasis has been placed on modelling rainfall data recorded at hourly or daily aggregation levels. Stochastic models for sub-hourly rainfall are equally important, as there is a need to reproduce rainfall time series at fine temporal resolutions in some hydrological applications. For example, the study of climate change impacts on hydrology and water management initiatives requires the availability of data at fine temporal resolutions. One approach to generating such rainfall data relies on the combination of an hourly stochastic rainfall simulator, together with a disaggregator making use of downscaling techniques. Recent work on this topic adopted a different approach by developing specialist stochastic point process models for fine-scale rainfall aimed at generating synthetic precipitation time series directly from the proposed stochastic model. One strand of this approach focused on developing a class of doubly stochastic Poisson process (DSPP) models for fine-scale rainfall to analyse data collected in the form of rainfall bucket tip time series. In this context, the arrival pattern of rain gauge bucket tip times N(t) is viewed as a DSPP whose rate of occurrence varies according to an unobserved finite state irreducible Markov process X(t). Since the likelihood function of this process can be obtained, by conditioning on the underlying Markov process X(t), the models were fitted with maximum likelihood methods. The proposed models were applied directly to the raw data collected by tipping-bucket rain gauges, thus avoiding the need to convert tip-times to rainfall depths prior to fitting the models. One advantage of this approach was that the use of maximum likelihood methods enables a more straightforward estimation of parameter uncertainty and comparison of sub-models of interest. Another strand of this approach employed the DSPP model for the arrivals of rain cells and attached a pulse or a cluster of pulses to each rain cell. Different mechanisms for the pattern of the pulse process were used to construct variants of this model. We present the results of these models when they were fitted to hourly and sub-hourly rainfall data. The results of our analysis suggest that the proposed class of stochastic models is capable of reproducing the fine-scale structure of the rainfall process, and hence provides a useful tool in hydrological modelling.
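An illustrative simulation of the DSPP idea described above: bucket tips arrive as a Poisson process whose rate is modulated by an unobserved two-state Markov process X(t). The rates and switching intensities below are assumptions for illustration, not fitted estimates.

```python
# Simulate a Markov-modulated (doubly stochastic) Poisson process of tip times.
import numpy as np

rng = np.random.default_rng(7)
rates = np.array([0.2, 5.0])        # bucket tips per minute in each state
switch = np.array([[0.0, 0.05],     # switching intensities of X(t): mean sojourn
                   [0.20, 0.0]])    # of 20 min in state 0 and 5 min in state 1

def simulate_dspp(t_end=600.0):
    t, state, tips = 0.0, 0, []
    while t < t_end:
        # Time until the Markov chain leaves the current state.
        dwell = rng.exponential(1.0 / switch[state].sum())
        span = min(dwell, t_end - t)
        # Poisson number of tips at the state-dependent rate during the sojourn,
        # placed uniformly within the sojourn interval.
        n = rng.poisson(rates[state] * span)
        tips.extend(t + np.sort(rng.uniform(0, span, n)))
        t += dwell
        state = 1 - state
    return np.array(tips)

tips = simulate_dspp()
print("simulated bucket tips over 10 hours:", len(tips))
```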

Keywords: fine-scale rainfall, maximum likelihood, point process, stochastic model

Procedia PDF Downloads 279
25277 Challenges to Safe and Effective Prescription Writing in the Environment Where Digital Prescribing is Absent

Authors: Prashant Neupane, Asmi Pandey, Mumna Ehsan, Katie Davies, Richard Lowsby

Abstract:

Introduction/Background & aims: Safe and effective prescribing in hospitals directly and indirectly impacts the health of patients. Even though digital prescribing in the National Health Service (NHS), UK, has been adopted in many tertiary centres as well as district general hospitals, a significant number of NHS trusts are still using paper prescribing. We came across numerous irregularities in our daily clinical practice with paper prescribing. The main aim of the study was to assess how safely and effectively we prescribe at our hospital, where there is no access to digital prescribing. Method/Summary of work: We conducted a prospective audit in the critical care department at Mid Cheshire Hospitals NHS Foundation Trust, in which 20 prescription charts from different patients were randomly selected over a period of one month. We assessed 16 categories on each prescription chart and compared them to the standard trust guidelines on prescribing. Results/Discussion: We collected data from 20 different prescription charts, evaluating 16 categories within each chart. The results showed an urgent need for improvement in 8 sections. In 85% of the prescription charts, not all prescribers who prescribed medications were identified: name, GMC number and signature were absent from the required prescriber identification section. In 70% of charts, either the indication or the review date of antimicrobials was absent. Units of medication were not documented correctly in 65% of charts, and the patient's allergy status was absent in 30%. The start date of medications was missing and alterations of medications were not made properly in 35% of charts. The patient's name was not recorded in all required sections of the chart in 50% of cases, and cancellations of medications were not made properly in 45% of the prescription charts. Conclusion(s): From the audit and data analysis, we identified the areas in which prescription writing in the critical care department needs improvement. However, during meetings and conversations with experts from the pharmacy department, we realised that this audit represents only a specialised department of the hospital, where prescribing is limited to a certain number of prescribers; in larger departments of the hospital, where patient turnover is much higher, the results could be much worse. The findings were discussed in the critical care MDT meeting, where suggestions regarding digital/electronic prescribing were raised. A poster and presentation on safe and effective prescribing were prepared, and an awareness poster was displayed at every bedside in critical care, where it is visible to prescribers. We consider this a temporary measure to improve the quality of prescribing; however, we strongly believe digital prescribing would help to a greater extent to address the weak areas seen in paper prescribing.

Keywords: safe prescribing, NHS, digital prescribing, prescription chart

Procedia PDF Downloads 121
25276 Climate Change and Migration in the Semi-arid Tropic and Eastern Regions of India: Exploring Alternative Adaptation Strategies

Authors: Gauri Sreekumar, Sabuj Kumar Mandal

Abstract:

Contributing about 18% of India's gross domestic product, the agricultural sector plays a significant role in the Indian rural economy. Despite agriculture being the primary source of livelihood for more than half of India's population, most farmers are marginal and small farmers facing several challenges due to agro-climatic shocks. Climate change is expected to increase the risk in regions that are highly dependent on agriculture. With systematic and scientific evidence of changes in rainfall, temperature and other extreme climate events, migration has started to emerge as a survival strategy for farm households. Against this backdrop, our study combines these two strands of literature and explores whether migration is the only adaptation strategy for farmers once they experience crop failures due to adverse climatic conditions. Combining temperature and rainfall information from the weather data provided by the India Meteorological Department with household-level panel data on Indian states belonging to the eastern and semi-arid tropics regions from the Village Dynamics in South Asia (VDSA) database, collected by the International Crops Research Institute for the Semi-Arid Tropics, we form a rich panel dataset for the years 2010-2014. A recursive econometric model is used to establish the three-way nexus between climate change, yield and migration, while addressing the role of irrigation and local non-farm income diversification. Using the three-stage least squares estimation method, we find that climate-change-induced yield loss is a major driver of farmers' migration. However, irrigation and local non-farm income diversification are found to mitigate the adverse impact of climate change on migration. Based on our empirical results, we suggest enhancing irrigation facilities and making local non-farm income diversification opportunities available in order to increase farm productivity and thereby reduce farmers' migration.
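A stylised two-equation illustration of the recursive structure described above (climate shocks drive yield, yield drives migration, irrigation cushions the loss), estimated step by step with OLS on simulated data; the paper itself uses three-stage least squares on the VDSA/IMD panel, so this is only a sketch of the logic with made-up coefficients.

```python
# Recursive climate -> yield -> migration system, estimated in two OLS steps.
import numpy as np

rng = np.random.default_rng(3)
n = 500
rainfall_shock = rng.normal(0, 1, n)          # deviation from normal rainfall
temperature_shock = rng.normal(0, 1, n)
irrigation = rng.integers(0, 2, n)

# Eq. 1: yield falls with adverse climate shocks; irrigation cushions the loss.
yield_ = (2.0 + 0.5 * rainfall_shock - 0.4 * temperature_shock
          + 0.3 * irrigation * temperature_shock + rng.normal(0, 0.3, n))

# Eq. 2: migration responds to yield rather than to the raw shocks directly.
migration = 1.0 - 0.6 * yield_ + rng.normal(0, 0.3, n)

def ols(y, X):
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

Z = np.column_stack([rainfall_shock, temperature_shock,
                     irrigation * temperature_shock])
stage1 = ols(yield_, Z)
yield_hat = np.column_stack([np.ones(n), Z]) @ stage1
stage2 = ols(migration, yield_hat)
print("stage 1 (yield on climate):", stage1.round(2))
print("stage 2 (migration on predicted yield):", stage2.round(2))
```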

Keywords: climate change, migration, adaptation, mitigation

Procedia PDF Downloads 65
25275 Finite Element Simulation of Embankment Bumps at Bridge Approaches, Comparison Study

Authors: F. A. Hassona, M. D. Hashem, R. I. Melek, B. M. Hakeem

Abstract:

A differential settlement at the end of a bridge, near the interface between the abutment and the embankment, is a persistent problem for highway agencies. The differential settlement produces the common 'bump at the end of the bridge'. Reduction in steering response, distraction of the driver, added risk and expense in maintenance operations, and damage to a transportation agency's public image are all undesirable effects of these uneven and irregular transitions. This paper attempts to simulate the bump at the end of the bridge using the PLAXIS 2D finite element program. PLAXIS was used to simulate a laboratory model called the Bridge to Embankment Simulator of Transition (B.E.S.T.) device, which was built by others to investigate this problem. A total of six numerical simulations were conducted using the hardening-soil model, with rational assumptions for missing soil parameters, to estimate the bump at the end of the bridge. The results show good agreement between the numerical and laboratory models. Important factors influencing bumps at bridge ends are also addressed in light of the model results.

Keywords: bridge approach slabs, bridge bump, hardening-soil, PLAXIS 2D, settlement

Procedia PDF Downloads 350
25274 Data Mining Approach for Commercial Data Classification and Migration in Hybrid Storage Systems

Authors: Mais Haj Qasem, Maen M. Al Assaf, Ali Rodan

Abstract:

Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in terms of data reading speed. As we ascend the hierarchy, data reading becomes faster. Thus, migrating the application's important data that will be accessed in the near future to the uppermost level will reduce the application's I/O waiting time and, hence, its execution elapsed time. In this research, we implement a trace-driven, two-level parallel hybrid storage system prototype that consists of HDDs and SSDs. The prototype uses data mining techniques to classify the application's data in order to determine its near-future data accesses, in parallel with its on-demand requests. The important data (i.e., the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach, integrated with data mining techniques, reduces the application execution elapsed time by at least 22% across a variety of traces.
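An illustrative sketch of the classification step, using a support vector machine as named in the keywords; the per-block features, labels and thresholds are assumptions made for the example, not the prototype's actual model.

```python
# Classify data blocks as "hot" (migrate to SSD) or "cold" from access features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 400
# Hypothetical per-block features: [recent access count, mean inter-access gap (s)]
access_count = rng.poisson(4, n)
gap = rng.exponential(30, n)
X = np.column_stack([access_count, gap])
hot = (access_count > 5) & (gap < 25)        # synthetic ground-truth labels

X_train, X_test, y_train, y_test = train_test_split(X, hot, test_size=0.25,
                                                    random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
print("hold-out accuracy:", round(clf.score(X_test, y_test), 3))
# Blocks predicted "hot" would be prefetched/migrated to the uppermost tier.
```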

Keywords: hybrid storage system, data mining, recurrent neural network, support vector machine

Procedia PDF Downloads 309
25273 High-Resolution Flood Hazard Mapping Using Two-Dimensional Hydrodynamic Model Anuga: Case Study of Jakarta, Indonesia

Authors: Hengki Eko Putra, Dennish Ari Putro, Tri Wahyu Hadi, Edi Riawan, Junnaedhi Dewa Gede, Aditia Rojali, Fariza Dian Prasetyo, Yudhistira Satya Pribadi, Dita Fatria Andarini, Mila Khaerunisa, Raditya Hanung Prakoswa

Abstract:

Catastrophe risk management is only possible if we are able to calculate the exposed risks. Jakarta is an important city economically, socially, and politically, and at the same time it is exposed to severe floods. On the other hand, flood risk calculation is still very limited in the area. This study calculates the risk of flooding for Jakarta using the two-dimensional model ANUGA. The two-dimensional model ANUGA and the one-dimensional model HEC-RAS are used to calculate the flood risk from 13 major rivers in Jakarta. ANUGA can simulate the physical and dynamical processes between the streamflow and the river geometry and land cover to produce a 1-meter-resolution inundation map. The streamflow values used as input for the model were obtained from a hydrological analysis of rainfall data using the hydrologic model HEC-HMS. The probabilistic streamflow was derived from probabilistic rainfall using the statistical distributions Log-Pearson III, Normal and Gumbel, with goodness of fit assessed using the Chi-Square and Kolmogorov-Smirnov tests. The 2007 flood event is used as a benchmark to evaluate the accuracy of the model output. Property damage estimates were calculated based on flood depth for the 1-, 5-, 10-, 25-, 50-, and 100-year return periods against housing value data from BPS-Statistics Indonesia and the Centre for Research and Development of Housing and Settlements, Ministry of Public Works Indonesia. The vulnerability factor was derived from flood insurance claims. Jakarta's flood loss estimates for the return periods of 1, 5, 10, 25, 50, and 100 years are, respectively, Rp 1.30 t; Rp 16.18 t; Rp 16.85 t; Rp 21.21 t; Rp 24.32 t; and Rp 24.67 t, out of a total building value of Rp 434.43 t.
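A hedged sketch of the frequency-analysis step described above: a Gumbel distribution is fitted to annual maximum rainfall, the fit is checked with a Kolmogorov-Smirnov test, and return-period depths are read off. The rainfall values are simulated, not the Jakarta record, and the paper also considers Log-Pearson III and Normal fits.

```python
# Fit Gumbel to annual maxima and compute return-period rainfall depths.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
annual_max_rain = stats.gumbel_r.rvs(loc=120, scale=35, size=30,
                                     random_state=rng)       # mm/day, synthetic

loc, scale = stats.gumbel_r.fit(annual_max_rain)
ks = stats.kstest(annual_max_rain, "gumbel_r", args=(loc, scale))
print(f"Gumbel fit: loc={loc:.1f} mm, scale={scale:.1f} mm, KS p-value={ks.pvalue:.2f}")

for T in (5, 10, 25, 50, 100):
    # Return level = quantile at non-exceedance probability 1 - 1/T.
    depth = stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
    print(f"{T:>3}-year rainfall: {depth:.0f} mm")
```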

Keywords: 2D hydrodynamic model, ANUGA, flood, flood modeling

Procedia PDF Downloads 277
25272 Social Enterprises in India: Conceptualization and Challenges

Authors: Prajakta Khare

Abstract:

A huge number of social enterprises operate in India, across all enterprise sizes and forms, addressing diverse social issues. Some cases, such as Aravind Eye Care, Narayana Hrudayalaya and SEWA, have been studied extensively in the management literature and are well-known cases in social entrepreneurship. But there are several smaller social enterprises in India that are not called so per se, due to the lack of understanding of the concept. There is a lack of academic research on social entrepreneurship in India, and the term 'social entrepreneurship' is not yet widely known in the country, even by people working in this field, as this study found. The present study aims to identify the most prominent forms of social enterprise in India, the profile of the entrepreneurs, the challenges faced, the lessons (theory and practice) emerging from their functioning and, finally, the factors contributing to the enterprises' success. This is a preliminary exploratory study using primary data from 30 social enterprises in India. The study used snowball sampling and a qualitative analysis. Data were collected from founders of social enterprises through written structured questionnaires, open-ended interviews and field visits to the enterprises. The sample covered enterprises across sectors such as the environment, affordable education, children's rights, rainwater harvesting, and women's empowerment. The interview questions focused on, among other things, the founder's background and motivation, qualifications, funding, challenges, understanding and perspectives on social entrepreneurship, government support, and linkages with other organizations. The interviews were conducted in three languages - Hindi, Marathi and English - and were then translated and transcribed. 50% of the founders were women, and 65% were highly qualified, with an MBA, PhD or MBBS. The most important challenge faced by these entrepreneurs is recruiting skilled people. When asked about their understanding of the term, founders had diverse perspectives; their understandings of the terms social enterprise and social entrepreneur were extremely varied. Some founders identified the terms with doing something good for society; some thought that every business can be called a social enterprise. 35% of the founders were not aware of the term social entrepreneur/social entrepreneurship and said that they could identify themselves as social entrepreneurs only after discussions with the researcher. The general perception in India is that 'NGOs are corrupt'; fighting against this perception to secure funds is another problem pointed out by some founders. There are unique challenges that social entrepreneurs in India face, as the political, social and economic environment around them is rapidly changing, and getting adequate support from the government is a problem. In its subsequent stages, the research aims to clarify existing, missing and new definitions of the term to provide deeper insights into the terminology and issues relating to social entrepreneurship in India.

Keywords: challenges, India, social entrepreneurship, social entrepreneurs

Procedia PDF Downloads 468
25271 Use of the Budyko Framework to Estimate the Virtual Water Content in Shijiazhuang Plain, North China

Authors: Enze Zhang

Abstract:

One of the most challenging steps in implementing a virtual water content (VWC) analysis of crops is to properly obtain the total volume of consumptive water use (CWU) and, therefore, to choose a reliable method for estimating crop CWU. In practice, many previous studies obtaining the CWU of crops follow a classical procedure for calculating crop evapotranspiration, in which reference evapotranspiration is multiplied by appropriate coefficients, such as the crop coefficient and water stress coefficients. However, this manner of calculation requires a lot of field experimental data at the point scale and, more seriously, when current growing conditions differ from the standard conditions, it may easily produce deviations between the calculated CWU and the actual CWU. Since the evapotranspiration caused by crop planting always plays a vital role in the surface water-energy balance of an agricultural region, this study instead estimates crop evapotranspiration with the Budyko framework. After briefly introducing the development of the Budyko framework, we choose a modified Budyko framework under unsteady-state conditions to better evaluate the actual CWU and apply it to an agricultural irrigation area in the North China Plain that relies on groundwater for irrigation. With the agricultural statistics, this calculated CWU was further converted into the VWC of crops and its subdivisions at the annual scale. The results show that the average values of VWC, VWC_blue and VWC_green all show a downward trend with increased agricultural production and improved acreage. In comparison with previous research, the VWC calculated by the Budyko framework agrees well with part of the previous research, while for some other studies the value is greater. Our research also suggests that this methodology and these findings may be reliable and convenient for the investigation of virtual water throughout various agricultural regions of the world.
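A small sketch of a Budyko-type calculation of the kind described above, using Fu's parametric form of the Budyko curve (our assumption; the paper's modified unsteady-state formulation is not reproduced here). The precipitation, evaporative demand and yield values are illustrative.

```python
# Budyko (Fu) estimate of actual ET -> CWU -> virtual water content.
def fu_actual_et(precip_mm, pet_mm, omega=2.6):
    """Fu's equation: E/P = 1 + PET/P - (1 + (PET/P)**omega)**(1/omega)."""
    phi = pet_mm / precip_mm                  # aridity index
    return precip_mm * (1 + phi - (1 + phi ** omega) ** (1 / omega))

precip = 480.0          # growing-season precipitation (mm)
pet = 650.0             # growing-season potential evapotranspiration (mm)
yield_t_per_ha = 6.0    # crop yield (t/ha)

et_mm = fu_actual_et(precip, pet)             # consumptive water use as a depth
cwu_m3_per_ha = et_mm * 10.0                  # 1 mm over 1 ha = 10 m^3
vwc = cwu_m3_per_ha / yield_t_per_ha          # m^3 of water per tonne of crop
print(f"ET = {et_mm:.0f} mm, CWU = {cwu_m3_per_ha:.0f} m3/ha, VWC = {vwc:.0f} m3/t")
```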

Keywords: virtual water content, Budyko framework, consumptive water use, crop evapotranspiration

Procedia PDF Downloads 334
25270 Discharge Estimation in a Two Flow Braided Channel Based on Energy Concept

Authors: Amiya Kumar Pati, Spandan Sahu, Kishanjit Kumar Khatua

Abstract:

Rivers are our main source of water; river flow is a form of open channel flow, and flow in open channels gives rise to many complex phenomena that need to be tackled, such as critical flow conditions, boundary shear stress, and depth-averaged velocity. The development of society depends, more or less solely, upon the flow of rivers. Rivers are major sources of sediments and specific ingredients which are essential for human beings. A river flow consisting of small and shallow channels sometimes divides and recombines numerous times because of slow water flow or built-up sediments. The pattern formed during this process resembles the strands of a braid. Braided streams form where the sediment load is so heavy that some of the sediments are deposited as shifting islands. Braided rivers often exist near mountainous regions and typically carry coarse-grained and heterogeneous sediments down a fairly steep gradient. In this paper, the apparent shear stress formulae are suitably modified, and the energy concept method (ECM) is applied for the prediction of discharges at the junction of a two-flow braided compound channel. The energy concept method has not previously been applied to estimating discharges in braided channels. The energy loss in the channels is analyzed based on a mechanical analysis. The cross-section of the channel is divided into two sub-areas, namely the main channel below the bank-full level and the region above the bank-full level, for estimating the total discharge. The experimental data are compared with a wide range of theoretical data available in the published literature to verify this model. The accuracy of this approach is also compared with the divided channel method (DCM). From the error analysis of this method, it is observed that the relative error is smaller for data sets with smooth floodplains than for those with rough floodplains. Comparisons with other models indicate that the present method has reasonable accuracy for engineering purposes.
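A small sketch of the divided channel method used above as a comparison baseline: the compound section is split at the bank-full level and Manning's equation is applied to each sub-area. The geometry, roughness and slope values are illustrative only, and the paper's own ECM formulation is not reproduced.

```python
# Divided channel method: Manning's equation per sub-area, then sum discharges.
import numpy as np

def manning_q(area, wetted_perimeter, n, slope):
    """Manning's equation: Q = (1/n) * A * R^(2/3) * S^(1/2)."""
    radius = area / wetted_perimeter
    return area * radius ** (2 / 3) * np.sqrt(slope) / n

slope = 1e-3
# Main channel below the bank-full level and the region above it (floodplain).
q_main = manning_q(area=6.0, wetted_perimeter=7.0, n=0.012, slope=slope)
q_flood = manning_q(area=4.0, wetted_perimeter=10.0, n=0.025, slope=slope)
print(f"DCM total discharge: {q_main + q_flood:.2f} m^3/s")
```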

Keywords: critical flow, energy concept, open channel flow, sediment, two-flow braided compound channel

Procedia PDF Downloads 127
25269 Towards a Secure Storage in Cloud Computing

Authors: Mohamed Elkholy, Ahmed Elfatatry

Abstract:

Cloud computing has emerged as a flexible computing paradigm that has reshaped the information technology landscape. However, cloud computing brought about a number of security challenges as a result of the physical distribution of computational resources and the limited control that users have over the physical storage. This situation raises many security challenges for data integrity and confidentiality as well as authentication and access control. This work proposes a security mechanism for data integrity that allows a data owner to be aware of any modification made to their data. The data integrity mechanism is integrated with an extended Kerberos authentication that ensures authorized access control. The proposed mechanism protects data confidentiality even if the data are stored on untrusted storage. The proposed mechanism has been evaluated against different types of attacks and proved its efficiency in protecting cloud data storage from different malicious attacks.
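A minimal sketch, not the authors' mechanism, of the core integrity idea: the data owner keeps a secret key, stores an authentication tag alongside the data, and recomputes the tag on retrieval to detect any modification by the untrusted storage.

```python
# HMAC-based integrity check for data held on untrusted storage (sketch).
import hmac, hashlib, os

owner_key = os.urandom(32)                 # known only to the data owner

def tag(data: bytes) -> bytes:
    return hmac.new(owner_key, data, hashlib.sha256).digest()

# Upload: store (data, tag) on the cloud.
data = b"cloud-resident record"
stored = (data, tag(data))

# Download: verify the tag before trusting the content.
retrieved, retrieved_tag = stored
print("intact:", hmac.compare_digest(tag(retrieved), retrieved_tag))

# A tampering attempt by the storage provider is detected.
tampered = b"cloud-resident record (modified)"
print("tampered intact:", hmac.compare_digest(tag(tampered), retrieved_tag))
```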

Keywords: access control, data integrity, data confidentiality, Kerberos authentication, cloud security

Procedia PDF Downloads 335
25268 Violations of Press Freedom

Authors: Khalid Achaat

Abstract:

It is difficult to speak about freedom of the press in Algeria without first recalling the fifty-seven journalists killed in the country between 1993 and 1997 and the five missing journalists. No serious investigation was conducted to find the culprits. When a state is not able to guarantee the rule of law, there is no justice, and violations of the law become 'systematic'. How can one claim freedom of the press in Algeria when death has become 'banal'? In these circumstances, can we talk of the rights of the Algerian press? It is impossible to understand the problems of the press in Algeria by focusing solely on legal issues; technical, financial and political factors must also be taken into account. Their respective roles vary depending on whether one focuses on the collection of information, the regime governing the newspaper company, or publication and dissemination. Can we say that the Algerian press is 'the freest in the Arab world' when it reflects only partially the real problems facing the country? Can it be free while any newspaper company is subject, de facto, to an authorization scheme, permanently under the threat of withdrawal of the authorization, suspension, prohibition or closure, without any right to a remedy? Can it be free when the majority of 'media owners', the heads of the largest daily newspapers, come from the single party that has been in power since independence? Does some of this press not serve the interests of the Algerian authorities?

Keywords: freedom, press, power, closure, suspension

Procedia PDF Downloads 350
25267 Repositioning Religion as a Catalyst for Conflict Resolution in Nigeria

Authors: Samuel A. Muyiwa

Abstract:

Religious chauvinism has attained an alarming status in contemporary Nigerian society. Arguably, Nigeria is the largest economy and most populous nation in Africa, with over 182 million people, yet the advantages offered by a vibrant economy and a large population have been sacrificed on the altar of religion. Tolerance, sacrifice, humility, compassion, love, justice, trustworthiness, dedication to the well-being of others, and unity are the universal spiritual principles that lie at the heart of any religion, whether Christianity, Islam or traditional religion. Whereas traditional religious practices foreground the beliefs, norms and rituals related to the sacred being, God, because of the quick and immediate consequences of their effects, the new-found religious sentiments have deviated from these norms, thus undermining cosmic harmony in Nigeria through the long-term consequences of their effects. Religion, which is expected to accelerate growth and motivate people to develop spiritual nuances for the betterment of their communities, has, however, occasioned conflict and violence in Nigeria's socio-political cosmos. Therefore, this study examines the content of religion in the promotion of peace and unity and its contextual missing link in the promotion of conflict and violence in Nigeria.

Keywords: religion chauvinism, Nigeria, conflict, conflict resolution

Procedia PDF Downloads 319
25266 Estimation of the Parameters of Muskingum Methods for the Prediction of the Flood Depth in the Moudjar River Catchment

Authors: Fares Laouacheria, Said Kechida, Moncef Chabi

Abstract:

The objective of the study was hydrological routing modelling for the continuous monitoring of the hydrological situation in the Moudjar river catchment, especially during floods, with the Hydrologic Engineering Center-Hydrologic Modelling System (HEC-HMS). HEC-GeoHMS was used to transfer data from a geographic information system (GIS) to HEC-HMS for delineating and modelling the catchment in order to estimate the runoff volume, which is used as input to the hydrological routing model. Two hydrological routing models, namely the Muskingum and Muskingum-Cunge routing models, were used for this study. A comparison between the parameters of the Muskingum and Muskingum-Cunge routing models in HEC-HMS was used for modelling flood routing in the Moudjar river catchment and for determining the relationship between these parameters and the physical characteristics of the river. The results indicate that the effects of input parameters such as the weighting factor X and the travel time K on the output results are significant, and that the Muskingum routing model was more sensitive to its input parameters than the Muskingum-Cunge routing model. This study can contribute to understanding and improving knowledge of the mechanisms of river floods, especially in ungauged river catchments.
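An illustrative implementation of the classical Muskingum routing scheme whose parameters K (travel time) and X (weighting factor) are discussed above; the inflow hydrograph and parameter values are made up for the example, and this is not a substitute for the HEC-HMS setup used in the study.

```python
# Classical Muskingum channel routing: O2 = C0*I2 + C1*I1 + C2*O1.
import numpy as np

def muskingum_route(inflow, K=2.0, X=0.2, dt=1.0, outflow0=None):
    denom = 2 * K * (1 - X) + dt
    c0 = (dt - 2 * K * X) / denom
    c1 = (dt + 2 * K * X) / denom
    c2 = (2 * K * (1 - X) - dt) / denom        # note c0 + c1 + c2 = 1
    outflow = np.empty_like(inflow, dtype=float)
    outflow[0] = inflow[0] if outflow0 is None else outflow0
    for i in range(1, len(inflow)):
        outflow[i] = c0 * inflow[i] + c1 * inflow[i - 1] + c2 * outflow[i - 1]
    return outflow

# A simple triangular flood wave (m^3/s) sampled hourly.
inflow = np.concatenate([np.linspace(10, 100, 6), np.linspace(100, 10, 10)])
print(muskingum_route(inflow, K=2.0, X=0.2, dt=1.0).round(1))
```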

Keywords: HEC-HMS, hydrological modelling, Muskingum routing model, Muskingum-Cunge routing model

Procedia PDF Downloads 278
25265 Empowering Rangatahi: Amplifying Youth Voices on Smartphone and Social Media Use in Aotearoa New Zealand

Authors: Melissa L Gould

Abstract:

The uptick in social media users during the COVID-19 lockdowns has accelerated concerns about cellphone addiction, cyberbullying, and exposure to harmful content, particularly mis- and disinformation and extremist content. The validity of these concerns is reinforced by media technologists who expose the strategies behind social media and search platform technology and explain why they restrict their own children from using it. Banning cellphones in schools, increasing age limits on social media accounts, and putting warning labels on social media are some of the solutions proposed to protect young people from smartphones and social media. Largely missing from these conversations are the voices of young people (rangatahi); instead, their lived experiences are being told and managed by adults. This presentation will outline my research, which amplified the voices and lived experiences of young people by positioning them as experts. Using The Social Dilemma as a discussion prompt, focus groups of rangatahi in Aotearoa New Zealand provide a space for young people to articulate their own lived experiences and respond to the dominant narratives about their generation's use of smartphones and social media.

Keywords: social media, smart phones, young people, social dilemma

Procedia PDF Downloads 34
25264 Model Based Fault Diagnostic Approach for Limit Switches

Authors: Zafar Mahmood, Surayya Naz, Nazir Shah Khattak

Abstract:

The degree of freedom relates to our capability to observe or model the energy paths within a system. The higher the number of energy paths being modeled, the higher the degree of freedom, but this increases the time and modeling complexity, rendering the approach impractical for today's need for minimum time to market. Since the number of residuals that can be uniquely isolated depends on the number of independent outputs of the system, the number of sensors required increases. Examples of discrete position sensors that may be used to form an array include limit switches, Hall effect sensors, optical sensors, magnetic sensors, etc. Their mechanical design can usually be tailored to fit in the transitional path of an STME in a variety of mechanical configurations. Case studies of a multi-sensor system were carried out, and actual data from sensors are used to test this generic framework. It is investigated how the proper modeling of limit switches as timing sensors could lead to a unified and neutral residual space while keeping the implementation cost reasonably low.

Keywords: low-cost limit sensors, fault diagnostics, Single Throw Mechanical Equipment (STME), parameter estimation, parity-space

Procedia PDF Downloads 618
25263 Digital Material Characterization Using the Quantum Fourier Transform

Authors: Felix Givois, Nicolas R. Gauger, Matthias Kabel

Abstract:

Efficient digital material characterization is of great interest in many fields of application. It consists of the following three steps. First, a 3D reconstruction of 2D scans must be performed. Then, the resulting gray-value image of the material sample is enhanced by image processing methods. Finally, partial differential equations (PDEs) are solved on the segmented image, and by averaging the resulting solution fields, effective properties like stiffness or conductivity can be computed. Due to the high resolution of current CT images, the latter step is typically performed with matrix-free solvers. Among them, a solver that uses the explicit formula of the Green-Eshelby operator in Fourier space has been proposed by Moulinec and Suquet. Its most complex algorithmic part is the fast Fourier transform (FFT). In our talk, we will discuss the potential quantum advantage that can be obtained by replacing the FFT with the quantum Fourier transform (QFT). We will especially show that the data transfer for noisy intermediate-scale quantum (NISQ) devices can be improved by using appropriate boundary conditions for the PDE, which also allows using semi-classical versions of the QFT. In the end, we will compare the results of the QFT-based algorithm for simple geometries with the results of the FFT-based homogenization method.
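A small numerical illustration of the relationship exploited above: the QFT acts on an amplitude vector exactly as a unitarily normalised discrete Fourier transform, so the FFT step of the Moulinec-Suquet solver is, in principle, replaceable by a QFT circuit. This only demonstrates the correspondence on a classical machine and makes no claim about the authors' quantum implementation.

```python
# The unitary QFT matrix agrees with numpy's (rescaled) inverse DFT convention.
import numpy as np

n_qubits = 3
N = 2 ** n_qubits

# Unitary QFT matrix: F[j, k] = exp(2*pi*i*j*k/N) / sqrt(N).
j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
qft = np.exp(2j * np.pi * j * k / N) / np.sqrt(N)
assert np.allclose(qft @ qft.conj().T, np.eye(N))      # unitarity check

# Encode a normalised field sample as amplitudes and compare the transforms.
x = np.random.default_rng(0).normal(size=N)
x = x / np.linalg.norm(x)
via_qft = qft @ x
via_fft = np.fft.ifft(x) * np.sqrt(N)   # same convention up to normalisation
print("max difference:", np.abs(via_qft - via_fft).max())
```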

Keywords: maximum likelihood quantum amplitude estimation (MLQAE), numerical homogenization, quantum Fourier transform (QFT), NISQ devices

Procedia PDF Downloads 79