Search results for: cointegration approach in panel data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 34619

33959 Sustainable Building Technologies for Post-Disaster Temporary Housing: Integrated Sustainability Assessment and Life Cycle Assessment

Authors: S. M. Amin Hosseini, Oriol Pons, Albert de la Fuente

Abstract:

After natural disasters, displaced people (DP) require large numbers of housing units, which have to be erected quickly under emergency pressures. These tight timeframes can multiply the environmental impacts of construction, worsening the already high energy consumption and pollution caused by the building sector. Indeed, post-disaster housing, which is often carried out without pre-planning, usually causes high negative environmental impacts, besides other economic and social impacts. Therefore, it is necessary to establish a suitable strategy to deal with this problem, one that also takes into account the instability of its causes, such as the changing ratio between rural and urban populations. To this end, this study presents a model that assists decision-makers in choosing the most suitable building technology for post-disaster housing units. The model focuses on the sustainability of the alternatives and on the fulfillment of stakeholders' requirements. Four building technologies have been analyzed to determine the most sustainable technology and to validate the presented model. DP from the 2003 Bam earthquake had their temporary housing units (THUs) built using these four technologies: autoclaved aerated concrete blocks (AAC), concrete masonry units (CMU), pressed reed panels (PR), and 3D sandwich panels (3D). The results of this analysis confirm that PR and CMU obtain the highest sustainability indexes. However, the second-life scenario of the THUs could have considerable impacts on the results.

Keywords: sustainability, post-disaster temporary housing, integrated value model for sustainability assessment, life cycle assessment

Procedia PDF Downloads 255
33958 Lean Models Classification: Towards a Holistic View

Authors: Y. Tiamaz, N. Souissi

Abstract:

The purpose of this paper is to present a classification of Lean models that aims to capture all the concepts related to this approach and thus facilitate its implementation. This classification allows the identification of the most relevant models according to several dimensions. From this perspective, we present a review and analysis of the Lean models literature and propose dimensions for classifying the current proposals, covering, among others, the axes of the Lean approach, the maturity of the models, and their application domains. This classification allowed us to conclude that researchers essentially treat the Lean approach as a toolbox and design their models to solve problems related to a specific environment. Since the Lean approach is no longer intended only for the automotive sector, where it was invented, but for all fields (IT, hospitals, etc.), we consider that it requires a generic model capable of being implemented in all areas.

Keywords: lean approach, lean models, classification, dimensions, holistic view

Procedia PDF Downloads 435
33957 Assessment of Climate Change Impact on Meteorological Droughts

Authors: Alireza Nikbakht Shahbazi

Abstract:

Drought is among the phenomena most affected by climate change, so efficient methods for estimating climate change impacts on drought need to be investigated. The aim of this paper is to investigate climate change impacts on drought in the Karoon3 watershed, located in south-western Iran, over future periods. Atmospheric general circulation model (GCM) data under Intergovernmental Panel on Climate Change (IPCC) scenarios were used for this purpose. In this study, watershed drought under climate change impacts is simulated for future periods (2011 to 2099). The standardized precipitation index (SPI) was selected as the drought index and calculated using mean monthly precipitation data in the Karoon3 watershed, for 6-, 12- and 24-month periods. Statistical analysis of daily precipitation and minimum and maximum daily temperature was performed. LARS-WG5 was used to determine the feasibility of producing meteorological data for future periods. Model calibration and verification were performed for the base period (1980-2007). Meteorological data were simulated for future periods under the general circulation models and IPCC climate change scenarios, and drought status was then analyzed using SPI. Results showed that differences between monthly maximum and minimum temperatures will decrease under climate change, and spring precipitation will increase while summer and autumn rainfall will decrease. Precipitation occurs mainly between January and May in future periods, and declining summer and autumn precipitation leads to short-term drought in the study region. Normal and wet SPI categories are more frequent under the B1 and A2 emission scenarios than under A1B.
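
As a minimal illustration of the SPI computation described above, the sketch below fits a gamma distribution to rolling precipitation sums and maps the fitted probabilities to standard-normal scores. It is a simplification (operational SPI fits each calendar month separately and treats zero-precipitation months explicitly), and the data are synthetic.

```python
import numpy as np
from scipy import stats

def spi(precip, window=6):
    """Standardized Precipitation Index over a rolling window (months).

    precip: 1-D array of monthly precipitation totals.
    Returns SPI values (NaN where the window is incomplete).
    Simplified: one gamma fit for all windowed sums; operational SPI
    fits each calendar month separately and handles zero months.
    """
    x = np.convolve(precip, np.ones(window), mode="valid")  # rolling sums
    shape, loc, scale = stats.gamma.fit(x[x > 0], floc=0)   # fit gamma to the sums
    cdf = stats.gamma.cdf(x, shape, loc=loc, scale=scale)   # cumulative probability
    z = stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))        # map to standard normal
    return np.concatenate([np.full(window - 1, np.nan), z])

rng = np.random.default_rng(0)
monthly_precip = rng.gamma(2.0, 30.0, size=360)  # 30 years of synthetic monthly data
spi6 = spi(monthly_precip, window=6)
print("fraction of months in drought (SPI < -1):", np.nanmean(spi6 < -1))
```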

Keywords: climate change impact, drought severity, drought frequency, Karoon3 watershed

Procedia PDF Downloads 243
33956 Geospatial Information for Smart City Development

Authors: Simangele Dlamini

Abstract:

Smart city development is seen as a way of facing the challenges brought about by the growing urban population the world over. Research indicates that cities have a role to play in combating urban challenges like crime, waste disposal, greenhouse gas emissions, and resource efficiency. Such solutions should not make city management less sustainable; they should be solutions-driven, cost- and resource-efficient, and smart. This study explores how the City of Johannesburg, South Africa, can use Geographic Information Systems, Big Data, and the Internet of Things (IoT) to identify opportune areas for smart city initiatives such as smart safety, smart utilities, smart mobility, and smart infrastructure in an integrated manner. The study combines Big Data with real-time data sources to identify hotspot areas that would benefit from ICT interventions. The GIS intervention will assist the city in avoiding a silo approach in its smart city development initiatives, an approach that has led to the failure of smart city development in other countries.

Keywords: smart cities, internet of things, geographic information systems, Johannesburg

Procedia PDF Downloads 150
33955 Anomaly Detection Based Fuzzy K-Mode Clustering for Categorical Data

Authors: Murat Yazici

Abstract:

Anomalies are irregularities in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies has been a subject of study within statistics since the 1800s, and over time a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis, the process of grouping data so that points within a cluster are as similar as possible while distinct clusters remain dissimilar, can be used to detect anomalies. Many traditional clustering algorithms have limitations in dealing with data sets containing categorical attributes. To detect anomalies in categorical data, a fuzzy clustering approach can be used to advantage. The fuzzy k-Modes (FKM) clustering algorithm, an extension of the k-means algorithm to categorical values, is a soft form of clustering: each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated datasets using the FKM clustering algorithm. As a significant feature of the study, the FKM algorithm determines anomalies together with their degree of abnormality, in contrast to numerous anomaly detection algorithms. According to the results, the FKM algorithm performed well in detecting anomalies in data containing both a single anomaly and multiple anomalies.
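
The sketch below is a minimal fuzzy k-Modes implementation on synthetic categorical data, following the standard membership and mode updates (Hamming dissimilarity, fuzziness exponent m). The abnormality score based on the most diffuse membership is an illustrative choice, not necessarily the paper's exact scoring rule.

```python
import numpy as np

def fuzzy_k_modes(X, k=2, m=1.5, n_iter=30, seed=0):
    """Minimal fuzzy k-Modes for categorical data (rows = points).

    Returns (modes, U), where U[i, j] is the membership of point i in
    cluster j. Sketch only: no empty-cluster handling or convergence test.
    """
    rng = np.random.default_rng(seed)
    modes = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(n_iter):
        d = (X[:, None, :] != modes[None, :, :]).sum(axis=2).astype(float)  # Hamming distances
        d = np.maximum(d, 1e-9)
        U = (d[:, :, None] / d[:, None, :]) ** (1.0 / (m - 1))  # ratios d_ij / d_il
        U = 1.0 / U.sum(axis=2)                                  # fuzzy membership update
        for j in range(k):                                       # update each mode attribute-wise
            w = U[:, j] ** m
            for a in range(X.shape[1]):
                cats, inv = np.unique(X[:, a], return_inverse=True)
                modes[j, a] = cats[np.argmax(np.bincount(inv, weights=w))]
    return modes, U

# Two tight categorical clusters plus one injected anomaly.
X = np.array([["a", "x"]] * 10 + [["b", "y"]] * 10 + [["c", "z"]])
modes, U = fuzzy_k_modes(X, k=2)
abnormality = 1.0 - U.max(axis=1)  # diffuse membership = high abnormality (illustrative score)
print("most abnormal point:", X[np.argmax(abnormality)])
```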

Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data

Procedia PDF Downloads 55
33954 Positioning a Southern Inclusive Framework Embedded in the Social Model of Disability Theory Contextualised for Guyana

Authors: Lidon Lashley

Abstract:

This paper presents how the social model of disability can be used to reshape inclusive education practices in Guyana. Inclusive education in Guyana is metamorphosing but remains firmly held in the tenets of the medical model of disability, which influences the experiences of children with special education needs and/or disabilities (SEN/D). An ethnographic approach to data gathering was employed in this study. Qualitative data were gathered from the voices of children with and without SEN/D, as well as their mainstream teachers, to present the interplay of discourses and subjectivities in the situation. The data were analyzed using Adele Clarke's postmodern approach to grounded theory analysis, called situational analysis. The data suggest that it is possible, but will be challenging, to fully contextualize and adopt Loreman's synthesis and Booth and Ainscow's Index in the two mainstream schools studied. In addition, the data paved the way for the presentation of a social model framework specific to Guyana, called the 'Southern Inclusive Education Framework for Guyana', and its support tool, 'The Inclusive Checker', created for Southern mainstream primary classrooms.

Keywords: social model of disability, medical model of disability, subjectivities, metamorphosis, special education needs, postcolonial Guyana, inclusion, culture, mainstream primary schools, Loreman's synthesis, Booth and Ainscow's index

Procedia PDF Downloads 162
33953 Short-Term Forecast of Wind Turbine Production with Machine Learning Methods: Direct Approach and Indirect Approach

Authors: Mamadou Dione, Eric Matzner-lober, Philippe Alexandre

Abstract:

The Energy Transition Act defined by the French State has precise implications for renewable energies, in particular for their remuneration mechanism. Until now, a purchase-obligation contract permitted the sale of wind-generated electricity at a fixed rate. In the future, it will be necessary to sell this electricity on the market (at variable rates) before obtaining additional compensation intended to reduce the risk. Selling on the market requires announcing in advance (about 48 hours before) the production that will be delivered to the network, and thus being able to predict this production in the short term. The fundamental problem remains the variability of the wind, accentuated by the geographical situation. The objective of the project is to provide, every day, short-term forecasts (48-hour horizon) of wind production using weather data. Predictions from the GFS model and the ECMWF model are used as explanatory variables, and the variable to be predicted is the production of a wind farm. We follow two approaches: a direct approach that predicts wind generation directly from weather data, and an indirect approach that estimates wind speed from weather data and converts it into wind power via power curves. We used machine learning techniques to predict this production. The models tested are random forests, CART + bagging, CART + boosting, and SVM (support vector machines). The application is made on a 22 MW wind farm (11 wind turbines) of the Compagnie du Vent (now Engie Green France). Our results are very conclusive compared to the literature.
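
A minimal sketch of the indirect approach on synthetic data: a random forest learns wind speed from stand-in NWP predictors, and an illustrative turbine power curve converts speed to power. The curve values and feature construction are assumptions, not the farm's actual data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Synthetic stand-in for NWP features (e.g., GFS/ECMWF wind components, temperature).
n = 2000
nwp = rng.normal(size=(n, 4))
true_speed = 8 + 2.5 * nwp[:, 0] + 1.0 * nwp[:, 1] + rng.normal(0, 1, n)  # m/s
true_speed = np.clip(true_speed, 0, None)

# Indirect approach, step 1: learn hub-height wind speed from NWP predictors.
speed_model = RandomForestRegressor(n_estimators=200, random_state=0)
speed_model.fit(nwp[:1500], true_speed[:1500])
speed_hat = speed_model.predict(nwp[1500:])

# Step 2: convert speed to power with a turbine power curve (illustrative 2 MW
# curve; cut-out behavior above 25 m/s is simplified by np.interp's clamping).
curve_speed = np.array([0, 3, 6, 9, 12, 15, 25])              # m/s
curve_power = np.array([0, 0, 400, 1300, 2000, 2000, 2000])   # kW
power_hat = np.interp(speed_hat, curve_speed, curve_power)

print("mean forecast output: %.0f kW per turbine" % power_hat.mean())
```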

Keywords: forecast aggregation, machine learning, spatio-temporal dynamics modeling, wind power forecast

Procedia PDF Downloads 219
33952 An Effective Approach to Knowledge Capture in Whole Life Costing in Constructions Project

Authors: Ndibarafinia Young Tobin, Simon Burnett

Abstract:

Whole life costing is a valuable technique for comparing alternative building designs, allowing operational cost benefits to be evaluated against any initial cost increases, and it also forms part of procurement in the construction industry. In spite of these benefits, its adoption has been relatively slow due to the lack of tangible evidence and of 'know-how' skills and knowledge of the practice, i.e., many establishments lack professionals with knowledge of, and training in, the use of the whole life costing technique. This situation is compounded by the absence of available data on whole life costing from relevant projects, the lack of data collection mechanisms, and so on. This has proved very challenging to those who have shown some willingness to employ the technique in a construction project. The knowledge generated from a project can be considered as best practices learned on how to carry out tasks more efficiently, or as negative lessons learned that have led to losses and slowed down the progress and performance of the project. Knowledge management in whole life costing practice can enhance the execution of whole life costing analysis in a construction project, as lessons learned from one project can be carried over to future projects, resulting in continuous improvement and providing knowledge that can be used in the operation and maintenance phases of an asset's life span. Purpose: The purpose of this paper is to report an effective approach which can be utilised in capturing knowledge in whole life costing practice in a construction project. Design/methodology/approach: An extensive literature review was first conducted on the concepts of knowledge management and whole life costing. This was followed by semi-structured interviews to explore existing good-practice knowledge management in whole life costing in construction projects. The interview data were analyzed using content analysis and used to structure an effective knowledge-capturing approach. Findings: The results show that project review is the common method used to capture knowledge; it should be undertaken in an organized and accurate manner, with results presented as instructions or in a checklist format, forming short and precise insights. The approach developed advises that, irrespective of how effective the approach to knowledge capture is, the absence of an environment for sharing knowledge would render the approach ineffective. An open culture and resources are critical for providing a knowledge-sharing setting, and leadership has to sustain whole life costing knowledge capture, giving full support for its implementation. The knowledge-capturing approach has been evaluated by practitioners who are experts in whole life costing practice, and the results indicate that it is suitable and efficient.

Keywords: whole life costing, knowledge capture, project review, construction industry, knowledge management

Procedia PDF Downloads 260
33951 A Visual Analytics Tool for the Structural Health Monitoring of an Aircraft Panel

Authors: F. M. Pisano, M. Ciminello

Abstract:

Aerospace, mechanical, and civil engineering infrastructures can take advantage of damage detection and identification strategies for maintenance cost reduction and operational life improvement, as well as for safety purposes. The challenge is to detect so-called 'barely visible impact damage' (BVID), due to low/medium-energy impacts, that can progressively compromise the structural integrity. The occurrence of any local change in material properties that can degrade the structure's performance is monitored using Structural Health Monitoring (SHM) systems, which compare the structure's states before and after damage occurs. SHM seeks any 'anomalous' response collected by means of sensor networks and then analyzed using appropriate algorithms. Independently of the specific analysis approach adopted for structural damage detection and localization, textual reports, tables, and graphs describing possible outlier coordinates and damage severity are usually provided as artifacts to be elaborated for extracting information about the current health condition of the structure under investigation. Visual Analytics can support the processing of monitored measurements by offering data navigation and exploration tools that leverage the native human capability of understanding images faster than texts and tables. Herein, the enrichment of an SHM system through the integration of a Visual Analytics component is investigated. Analytical dashboards have been created by combining worksheets, so that a useful Visual Analytics tool is provided to structural analysts for exploring the structural health conditions examined by a Principal Component Analysis based algorithm.
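
A minimal sketch of the kind of PCA-based detection such a dashboard could sit on top of: a PCA model is fitted to baseline sensor readings, and new frames are flagged when their reconstruction residual (the Q/SPE statistic) exceeds an empirical control limit. The data, the sensor count, and the threshold choice are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Baseline (healthy) readings from a network of 12 strain sensors, correlated
# through a random mixing matrix to mimic shared structural modes.
healthy = rng.normal(0, 1, size=(500, 12)) @ rng.normal(size=(12, 12)) * 0.1

pca = PCA(n_components=3).fit(healthy)  # retain dominant healthy-state modes

def q_statistic(X):
    """Squared residual (SPE/Q) after projecting onto the healthy subspace."""
    recon = pca.inverse_transform(pca.transform(X))
    return ((X - recon) ** 2).sum(axis=1)

threshold = np.percentile(q_statistic(healthy), 99)  # empirical control limit

# New measurements: a local stiffness change shifts sensor 5's response.
damaged = healthy[:50].copy()
damaged[:, 5] += 2.0
flags = q_statistic(damaged) > threshold
print("frames flagged as anomalous:", flags.sum(), "of", len(flags))
```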

Keywords: interactive dashboards, optical fibers, structural health monitoring, visual analytics

Procedia PDF Downloads 125
33950 Differences Choosing Closed Approach or Open Approach in Rhinoplasty Outcomes

Authors: Alessandro Marano

Abstract:

Aim: The author describes a strategy for choosing between two different rhinoplasty approaches for the treatment of rhinoplasty outcomes. Methods: Case study series. Both rhinoplasty approaches have advantages and disadvantages. With the open approach, the techniques for shaping and restoring nasal structures in rhinoplasty outcomes are easier to manage; on the other hand, the closed approach requires more practice and experience to achieve good results. Results: The author's choice for rhinoplasty outcomes is the closed approach, although the open approach is most commonly preferred due to superior management of, and better vision over, nasal structures. Conclusions: Both approaches are valid for the treatment of rhinoplasty outcomes; the author's preferred approach is the closed one, with minimally invasive modifications focused on restoring nasal function and aesthetics.

Keywords: rhinoplasty, aesthetic, face, outcomes

Procedia PDF Downloads 112
33949 Inappropriate Prescribing Defined by START and STOPP Criteria and Its Association with Adverse Drug Events among Older Hospitalized Patients

Authors: Mohd Taufiq bin Azmy, Yahaya Hassan, Shubashini Gnanasan, Loganathan Fahrni

Abstract:

Inappropriate prescribing in older patients has been associated with resource utilization and adverse drug events (ADE) such as hospitalization, morbidity, and mortality. Globally, there is a lack of published data on ADE induced by inappropriate prescribing. Our study is specific to an older population and is aimed at identifying risk factors for ADE and developing a model that links ADE to inappropriate prescribing. The design of the study was prospective: computerized medical records of 302 hospitalized elderly aged 65 years and above in 3 public hospitals in Malaysia (Hospital Serdang, Hospital Selayang and Hospital Sungai Buloh) were studied over a 7-month period from September 2013 until March 2014. Potentially inappropriate medications and potential prescribing omissions were determined using the published and validated START-STOPP criteria. Patients who had at least one inappropriate medication were included in Phase II of the study, where ADE were identified by a local expert consensus panel based on the published and validated Naranjo ADR probability scale. The panel also assessed whether ADE were causal or contributory to the current hospitalization. The association between inappropriate prescribing and ADE (hospitalization, mortality, and adverse drug reactions) was determined by identifying whether or not the former was causal or contributory to the latter. The rate of avoidable ADE was also determined. Our findings revealed that the prevalence of potentially inappropriate prescribing was 58.6%. ADEs were detected in 31 of 105 patients (29.5%) when STOPP criteria were used to identify potentially inappropriate medication; all 31 ADE (100%) were considered causal or contributory to admission. Of the 31 ADEs, 28 (90.3%) were considered avoidable or potentially avoidable. After adjusting for age, sex, comorbidity, dementia, baseline activities of daily living function, and number of medications, the likelihood of a serious avoidable ADE increased significantly when a potentially inappropriate medication was prescribed (odds ratio, 11.18; 95% confidence interval [CI], 5.014-24.93; p < .001). The medications identified by STOPP criteria are significantly associated with avoidable ADE in older people that cause or contribute to urgent hospitalization, but contributed less towards morbidity and mortality. The findings of the study underscore the importance of preventing inappropriate prescribing.
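
The adjusted odds ratio reported above is the kind of estimate a multivariable logistic regression produces. The sketch below reproduces the mechanics on simulated data; the covariates, effect sizes, and event rates are invented for illustration and are not the study's records.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 302  # cohort size matching the study

# Synthetic covariates standing in for the study's adjustment variables.
age = rng.normal(75, 6, n)
sex = rng.integers(0, 2, n)
n_meds = rng.poisson(6, n)
pim = rng.integers(0, 2, n)  # potentially inappropriate medication prescribed?

# Simulate ADE risk with a strong PIM effect, then fit the adjusted model.
logit = -6 + 0.03 * age + 0.1 * sex + 0.05 * n_meds + 2.4 * pim
ade = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([age, sex, n_meds, pim]))
fit = sm.Logit(ade, X).fit(disp=0)

or_pim = np.exp(fit.params[-1])          # adjusted odds ratio for PIM
ci = np.exp(fit.conf_int()[-1])          # 95% confidence interval
print("adjusted OR for PIM: %.2f (95%% CI %.2f-%.2f)" % (or_pim, ci[0], ci[1]))
```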

Keywords: adverse drug events, appropriate prescribing, health services research

Procedia PDF Downloads 401
33948 Advancing the Analysis of Physical Activity Behaviour in Diverse, Rapidly Evolving Populations: Using Unsupervised Machine Learning to Segment and Cluster Accelerometer Data

Authors: Christopher Thornton, Niina Kolehmainen, Kianoush Nazarpour

Abstract:

Background: Accelerometers are widely used to measure physical activity behavior, including in children. The traditional method for processing acceleration data uses cut points, relying on calibration studies that relate the quantity of acceleration to energy expenditure. As these relationships do not generalise across diverse populations, they must be parametrised for each subpopulation, including different age groups, which is costly and makes studies across diverse populations difficult. A data-driven approach that allows physical activity intensity states to emerge from the data under study, without relying on parameters derived from external populations, offers a new perspective on this problem and potentially improved results. We evaluated the data-driven approach in a diverse population with a range of rapidly evolving physical and mental capabilities, namely very young children (9-38 months old), for whom this new approach may be particularly appropriate. Methods: We applied an unsupervised machine learning approach (a hidden semi-Markov model, HSMM) to segment and cluster the accelerometer data recorded from 275 children with a diverse range of physical and cognitive abilities. The HSMM was configured to identify a maximum of six physical activity intensity states, and the output of the model was the time spent by each child in each of the states. For comparison, we also processed the accelerometer data using published cut points with available thresholds for the population. This provided us with time estimates for each child's sedentary (SED), light physical activity (LPA), and moderate-to-vigorous physical activity (MVPA). Data on the children's physical and cognitive abilities were collected using the Paediatric Evaluation of Disability Inventory (PEDI-CAT). Results: The HSMM identified two inactive states (INS, comparable to SED), two lightly active long-duration states (LAS, comparable to LPA), and two short-duration high-intensity states (HIS, comparable to MVPA). Overall, the children spent on average 237/392 minutes per day in INS/SED, 211/129 minutes per day in LAS/LPA, and 178/168 minutes per day in HIS/MVPA. We found that INS overlapped with 53% of SED, LAS overlapped with 37% of LPA, and HIS overlapped with 60% of MVPA. We also looked at the correlation between the time spent by a child in either HIS or MVPA and their physical and cognitive abilities. We found that HIS was more strongly correlated with physical mobility (R²(HIS) = 0.50, R²(MVPA) = 0.28), cognitive ability (R²(HIS) = 0.31, R²(MVPA) = 0.15), and age (R²(HIS) = 0.15, R²(MVPA) = 0.09), indicating increased sensitivity to key attributes associated with a child's mobility. Conclusion: An unsupervised machine learning technique can segment and cluster accelerometer data according to the intensity of movement at a given time. It provides a potentially more sensitive, appropriate, and cost-effective approach to analysing physical activity behavior in diverse populations, compared to the current cut-points approach. This, in turn, supports research that is more inclusive across diverse populations.
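
hmmlearn offers plain hidden Markov models but, to my knowledge, not hidden semi-Markov models, so the sketch below uses a GaussianHMM as a simplified stand-in for the paper's HSMM: intensity states still emerge from synthetic acceleration data without external cut points, but explicit state-duration modeling is missing.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # plain HMM stand-in; true HSMMs model state durations explicitly

rng = np.random.default_rng(4)

# Synthetic acceleration magnitude: alternating quiet and active bouts.
segments = [rng.normal(0.05, 0.02, 300), rng.normal(0.8, 0.3, 100),
            rng.normal(0.05, 0.02, 200), rng.normal(1.6, 0.5, 50)]
signal = np.abs(np.concatenate(segments)).reshape(-1, 1)

# Let up to three intensity states emerge from the data (no external cut points).
model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=100, random_state=0)
model.fit(signal)
states = model.predict(signal)

# Order states by mean acceleration so labels read low/medium/high intensity.
order = np.argsort(model.means_.ravel())
samples_per_state = [(states == s).sum() for s in order]
print("time spent per intensity state (low -> high):", samples_per_state)
```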

Keywords: physical activity, machine learning, under 5s, disability, accelerometer

Procedia PDF Downloads 212
33947 Assessing Available Power from a Renewable Energy Source in the Southern Hemisphere Using an Anisotropic Model

Authors: Asowata Osamede, Trudy Sutherland

Abstract:

The purpose of this paper is to assess the available power from a renewable energy source (an off-grid photovoltaic (PV) panel) in the Southern Hemisphere using an anisotropic sky model. Direct solar radiation is the driving force in photovoltaics: in a basic PV panel, power conversion is achieved by the PV cells converting solar energy into electrical energy. In this research, results were determined for a 6-month period from September 2022 through February 2023. Preliminary results, which include normal probability plots, data analysis (R² values), effective conversion time per week, and work time per day, indicate a favorable comparison between the empirical results and the simulation results.
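
A minimal sketch of one widely used anisotropic model, the Hay-Davies formulation, which splits diffuse irradiance into circumsolar and isotropic parts weighted by an anisotropy index. The abstract does not name its specific model, so this is an assumed representative, with illustrative input values and the day-of-year variation of extraterrestrial irradiance ignored.

```python
import numpy as np

def hay_davies_poa(dni, dhi, ghi, zenith, incidence, tilt, albedo=0.2):
    """Plane-of-array irradiance with the Hay-Davies anisotropic sky model.

    dni/dhi/ghi: direct-normal, diffuse-horizontal, global-horizontal (W/m^2).
    zenith/incidence: solar zenith and angle of incidence on the panel (degrees).
    tilt: panel tilt from horizontal (degrees).
    """
    zen, inc, beta = np.radians([zenith, incidence, tilt])
    i0 = 1367.0                       # solar constant, W/m^2 (seasonal variation ignored)
    ai = dni / i0                     # anisotropy index: weight of circumsolar diffuse
    rb = max(np.cos(inc), 0) / max(np.cos(zen), 0.01)  # beam projection ratio
    beam = dni * max(np.cos(inc), 0)
    sky = dhi * (ai * rb + (1 - ai) * (1 + np.cos(beta)) / 2)  # circumsolar + isotropic dome
    ground = ghi * albedo * (1 - np.cos(beta)) / 2             # ground-reflected
    return beam + sky + ground

# Example: clear mid-morning sun on a north-facing panel in the Southern Hemisphere.
poa = hay_davies_poa(dni=800, dhi=120, ghi=650, zenith=40, incidence=25, tilt=30)
print("plane-of-array irradiance: %.0f W/m^2" % poa)
```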

Keywords: power-conversion, mathematical model, PV panels, DC-DC converters, direct solar radiation

Procedia PDF Downloads 86
33946 Uncertainty and Volatility in Middle East and North Africa Stock Market during the Arab Spring

Authors: Ameen Alshugaa, Abul Mansur Masih

Abstract:

This paper sheds light on the economic impact of the political uncertainty caused by the civil uprisings that swept the Arab world, collectively known as the Arab Spring. Measuring documented effects of political uncertainty on regional stock market indices, we examine the impact of the Arab Spring on the volatility of stock markets in eight countries in the Middle East and North Africa (MENA) region: Egypt, Lebanon, Jordan, the United Arab Emirates, Qatar, Bahrain, Oman, and Kuwait. This analysis also permits testing for the existence of financial contagion among equity markets in the MENA region during the Arab Spring. To capture the time-varying and multi-horizon nature of the evidence of volatility and contagion in the eight MENA stock markets, we apply two robust methodologies to data from November 2008 to March 2014: MGARCH-DCC and continuous wavelet transforms (CWT). Our results indicate two key findings. First, the discrepancies between the volatile stock markets of countries directly impacted by the Arab Spring and those of countries that were not directly impacted indicate that international investors may still enjoy portfolio diversification and investment in MENA markets. Second, the lack of financial contagion during the Arab Spring suggests that there is little evidence of cointegration among MENA markets. Providing a general analysis of the economic situation and the investment climate in the MENA region during and after the Arab Spring, this study bears significant importance for policy makers, local and international investors, and market regulators.
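
A simplified stand-in for the MGARCH-DCC step, assuming the `arch` package is available: univariate GARCH(1,1) models are fitted to each (synthetic) return series, and a rolling correlation of the standardized residuals traces time-varying co-movement. The full DCC recursion replaces that rolling window with an estimated dynamic-correlation equation.

```python
import numpy as np
import pandas as pd
from arch import arch_model  # univariate GARCH; full MGARCH-DCC adds a second estimation step

rng = np.random.default_rng(5)

# Synthetic daily returns for two MENA indices sharing a common shock component.
common = rng.standard_t(df=5, size=1000)
returns = pd.DataFrame({
    "egypt": 0.7 * common + rng.standard_t(5, 1000),
    "qatar": 0.4 * common + rng.standard_t(5, 1000),
})

# Step 1 of DCC: fit a GARCH(1,1) to each series and standardize the residuals.
std_resid = {}
for col in returns:
    res = arch_model(returns[col], vol="Garch", p=1, q=1).fit(disp="off")
    std_resid[col] = res.std_resid
std_resid = pd.DataFrame(std_resid)

# Simplified stand-in for the DCC recursion: a rolling correlation of the
# standardized residuals as a rough picture of time-varying co-movement.
dyn_corr = std_resid["egypt"].rolling(120).corr(std_resid["qatar"])
print(dyn_corr.describe())
```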

Keywords: portfolio diversification, MENA region, stock market indices, MGARCH-DCC, wavelet analysis, CWT

Procedia PDF Downloads 292
33945 Supplier Selection by Considering Cost and Reliability

Authors: K. -H. Yang

Abstract:

The supplier selection problem is one of the important issues in supply chain management. Methodologies applicable to it fall into two categories: qualitative and quantitative approaches. However, due to the complexity of the problem and the lack of reliable quantitative data, qualitative approaches are used more often than quantitative ones. This study considers operational cost and the supplier's reliability factor and solves the problem using a quantitative approach. A mixed integer programming model is the primary analytic tool. Analyses of different scenarios with variable cost and reliability structures show the effectiveness of this approach to the supplier selection problem.
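
A minimal sketch of a cost-and-reliability mixed integer program in PuLP; the supplier data, the linearized quantity-weighted reliability constraint, and the capacity figures are invented for illustration and are not the paper's formulation.

```python
import pulp

# Candidate suppliers: (unit cost, reliability, capacity) - illustrative data.
suppliers = {"S1": (10.0, 0.95, 600), "S2": (8.5, 0.88, 500), "S3": (9.2, 0.92, 400)}
demand = 800
min_avg_reliability = 0.90

prob = pulp.LpProblem("supplier_selection", pulp.LpMinimize)
x = {s: pulp.LpVariable(f"qty_{s}", lowBound=0) for s in suppliers}    # order quantity
y = {s: pulp.LpVariable(f"use_{s}", cat="Binary") for s in suppliers}  # supplier selected?

prob += pulp.lpSum(cost * x[s] for s, (cost, _, _) in suppliers.items())  # minimize total cost
prob += pulp.lpSum(x.values()) == demand                                  # meet demand
for s, (_, _, cap) in suppliers.items():
    prob += x[s] <= cap * y[s]                                            # link quantity to selection
# Quantity-weighted reliability must reach the target (linearized form).
prob += pulp.lpSum(rel * x[s] for s, (_, rel, _) in suppliers.items()) >= min_avg_reliability * demand

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for s in suppliers:
    print(s, "qty =", x[s].value(), "selected =", y[s].value())
print("total cost =", pulp.value(prob.objective))
```

Varying the cost and reliability entries in `suppliers` reproduces the kind of scenario analysis the abstract describes.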

Keywords: mixed integer programming, quantitative approach, supplier’s reliability, supplier selection

Procedia PDF Downloads 384
33944 Factors of Non-Conformity Behavior and the Emergence of a Ponzi Game in the Riba-Free (Interest-Free) Banking System of Iran

Authors: Amir Hossein Ghaffari Nejad, Forouhar Ferdowsi, Reza Mashhadi

Abstract:

In the interest-free banking system of Iran, the savings of society take the form of bank deposits, and banks, using Islamic contracts, allocate these resources to applicants for facilities and credit. Meanwhile, the central bank, with the aim of conducting monetary policy, sets the maximum interest rate on bank deposits according to macroeconomic requirements. In recent years, however, the country's economic constraints, marked by stagflation, together with the institutional weaknesses of Iran's financial market, have produced massive disturbances in the balance sheet of the banking system, resulting in a maturity mismatch between banks' assets and liabilities and the implementation of a Ponzi game. Consequently, the interest rates set in long-term bank deposit contracts came to violate the maximum rate set by the central bank. The result was that newly mobilized resources were allocated to meet past commitments towards old depositors; a significant part of the supply of funds thus leaked out of the credit cycle, and a credit crunch emerged. The purpose of this study is to identify the most important factors affecting this non-conforming banking behavior, using data from 19 public and private banks of Iran. To this end, the causes of the non-conforming behavior of banks were investigated using the panel vector autoregression (PVAR) method for the period 2007-2015. Granger causality test results suggest that returns in markets parallel to bank deposits, non-performing loans, and a high ratio of facilities to banks' deposits are all causes of the formation of non-conforming behavior. Also, according to the results of impulse response functions and variance decomposition, non-performing loans and the ratio of facilities to deposits have the largest long-term effect and contribute most to explaining the changes in banks' non-conforming behavior in determining the interest rate on deposits.
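
statsmodels provides the bivariate Granger test used within such an analysis; the sketch below applies it to a synthetic pair of series in which an NPL-like variable drives the deposit-rate deviation with one lag. This illustrates the causality test only, not full PVAR estimation on panel data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(6)

# Synthetic series: an NPL ratio "causing" the deposit-rate deviation with a lag.
n = 200
npl = rng.normal(0, 1, n).cumsum() * 0.1
deviation = np.zeros(n)
for t in range(1, n):
    deviation[t] = 0.5 * deviation[t - 1] + 0.4 * npl[t - 1] + rng.normal(0, 0.2)

data = pd.DataFrame({"rate_deviation": deviation, "npl": npl})

# H0: 'npl' does NOT Granger-cause 'rate_deviation' (column order: effect, cause).
results = grangercausalitytests(data[["rate_deviation", "npl"]], maxlag=2)
```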

Keywords: non-conformity behavior, Ponzi Game, panel vector autoregression, nonperforming loans

Procedia PDF Downloads 219
33943 Improving Flash Flood Forecasting with a Bayesian Probabilistic Approach: A Case Study on the Posina Basin in Italy

Authors: Zviad Ghadua, Biswa Bhattacharya

Abstract:

The Flash Flood Guidance (FFG) provides the rainfall amount of a given duration necessary to cause flooding. The approach is based on the development of rainfall-runoff curves, which help to find the rainfall amount that would cause flooding. An alternative approach, mostly tested on Italian Alpine catchments, is based on determining threshold discharges from past events and on checking whether or not an oncoming flood exceeds critical discharge thresholds found beforehand. Both approaches suffer from large uncertainties in forecasting flash floods, as, due to the simplistic approach followed, the same rainfall amount may or may not cause flooding. This uncertainty leads to the question of whether a probabilistic model is preferable to a deterministic one in forecasting flash floods. We propose the use of a Bayesian probabilistic approach to flash flood forecasting. A prior probability of flooding is derived from historical data. Additional information, such as the antecedent moisture condition (AMC) and the rainfall amount over given rainfall thresholds, is used to compute the likelihood of observing these conditions given that a flash flood has occurred. Finally, the posterior probability of flooding is computed from the prior probability and the likelihood. The variation of the computed posterior probability with rainfall amount and AMC demonstrates the suitability of the approach for decision making in an uncertain environment. The methodology has been applied to the Posina basin in Italy. From the promising results obtained, we conclude that the Bayesian approach to flash flood forecasting provides more realistic forecasts than the FFG.
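
The posterior update itself is a one-line application of Bayes' rule; the sketch below walks through it with invented prior and likelihood values (the Posina-basin figures are not given in the abstract).

```python
# Posterior probability of flooding given rainfall above a threshold and wet AMC,
# following Bayes' rule with illustrative numbers (not the Posina basin values).
p_flood = 0.05                 # prior: historical fraction of events with flash flooding
p_obs_given_flood = 0.70       # P(rain > threshold AND wet AMC | flood occurred)
p_obs_given_no_flood = 0.10    # same observation when no flood occurred

# Total probability of the observation, then Bayes' rule.
p_obs = p_obs_given_flood * p_flood + p_obs_given_no_flood * (1 - p_flood)
posterior = p_obs_given_flood * p_flood / p_obs

print(f"P(flood | observation) = {posterior:.2f}")  # ~0.27, up from the 0.05 prior
```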

Keywords: flash flood, Bayesian, flash flood guidance, FFG, forecasting, Posina

Procedia PDF Downloads 137
33942 OILU Tag: A Projective Invariant Fiducial System

Authors: Youssef Chahir, Messaoud Mostefai, Salah Khodja

Abstract:

This paper presents the development of a 2D visual marker derived from recent patented work in the field of numbering systems. The proposed fiducial uses a group of projective-invariant straight-line patterns that are easily detectable and remotely recognizable. Based on an efficient data coding scheme, the developed marker enables the production of a large panel of unique real-time identifiers with highly distinguishable patterns. The proposed marker incorporates decimal and binary information simultaneously, making it readable by both humans and machines. This important feature opens up new opportunities for the development of efficient visual human-machine communication and monitoring protocols. Extensive experimental tests validate the robustness of the marker against acquisition and geometric distortions.
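
The classic projective invariant behind line-based fiducials is the cross-ratio of four collinear points, which survives any homography (camera viewpoint change). The sketch below verifies this numerically; the specific invariants used by the OILU tag are not disclosed in the abstract, so this is a representative example.

```python
import numpy as np

def cross_ratio(a, b, c, d):
    """Cross-ratio of four collinear 2D points: (AC/BC) / (AD/BD).

    The cross-ratio is preserved under any projective transformation,
    which is what lets line-based markers be recognized from odd viewpoints.
    """
    ac, bc = np.linalg.norm(c - a), np.linalg.norm(c - b)
    ad, bd = np.linalg.norm(d - a), np.linalg.norm(d - b)
    return (ac / bc) / (ad / bd)

# Four collinear points (e.g., intersections of a marker's line pattern).
pts = np.array([[0, 0], [1, 0], [3, 0], [6, 0]], dtype=float)

# Apply an arbitrary projective transform (homography) to mimic a camera view.
H = np.array([[1.2, 0.1, 5.0], [0.3, 0.9, -2.0], [0.001, 0.002, 1.0]])
homog = np.c_[pts, np.ones(4)] @ H.T
warped = homog[:, :2] / homog[:, 2:]

print("cross-ratio before:", cross_ratio(*pts))
print("cross-ratio after :", cross_ratio(*warped))  # identical up to rounding
```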

Keywords: visual markers, projective invariants, distance map, level sets

Procedia PDF Downloads 164
33941 High Resolution Satellite Imagery and Lidar Data for Object-Based Tree Species Classification in Quebec, Canada

Authors: Bilel Chalghaf, Mathieu Varin

Abstract:

Forest characterization in Quebec, Canada, is usually assessed based on photo-interpretation at the stand level. For species identification, this often results in a lack of precision. Very high spatial resolution imagery, such as DigitalGlobe, and Light Detection and Ranging (LiDAR) have the potential to overcome the limitations of aerial imagery. To date, few studies have used such data to map a large number of species at the tree level using machine learning techniques. The main objective of this study is to map 11 tall tree species (> 17 m) at the individual-tree level using an object-based approach in the broadleaf forest of Kenauk Nature, Quebec. For the individual tree crown segmentation, three canopy-height models (CHMs) from LiDAR data were assessed: 1) the original, 2) a filtered, and 3) a corrected model. The corrected CHM gave the best accuracy and was then coupled with imagery to refine tree species crown identification. When compared with photo-interpretation, 90% of the objects represented a single species. For modeling, 313 variables were derived from 16-band WorldView-3 imagery and LiDAR data, using radiance, reflectance, pixel, and object-based calculation techniques. Variable selection procedures were employed to reduce their number from 313 to 16, using only 11 bands to aid reproducibility. For classification, a global approach using all 11 species was compared to a semi-hierarchical hybrid classification approach at two levels: (1) tree type (broadleaf/conifer) and (2) individual broadleaf (five) and conifer (six) species. Five different model techniques were used: (1) support vector machine (SVM), (2) classification and regression tree (CART), (3) random forest (RF), (4) k-nearest neighbors (k-NN), and (5) linear discriminant analysis (LDA). Each model was tuned separately for all approaches and levels. For the global approach, the best model was the SVM using eight variables (overall accuracy (OA): 80%, Kappa: 0.77). With the semi-hierarchical hybrid approach, at the tree type level, the best model was the k-NN using six variables (OA: 100% and Kappa: 1.00). At the level of identifying broadleaf and conifer species, the best model was the SVM, with OA of 80% and 97% and Kappa values of 0.74 and 0.97, respectively, using seven variables for both models. This paper demonstrates that a hybrid classification approach gives better results and that using 16-band WorldView-3 with LiDAR data leads to more precise predictions for tree segmentation and classification, especially when the number of tree species is large.
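
A compact sketch of the semi-hierarchical idea on synthetic features: a level-1 classifier separates broadleaf from conifer crowns (k-NN, the study's best tree-type model), then a per-type SVM assigns the species. The feature construction and labels are synthetic stand-ins for the WorldView-3/LiDAR variables.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(7)

# Synthetic per-crown features standing in for spectral + LiDAR metrics.
n = 600
X = rng.normal(size=(n, 7))
is_conifer = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)           # level-1 label
species = np.where(is_conifer, (X[:, 2] > 0).astype(int) + 2,    # conifer species 2-3
                   (X[:, 3] > 0).astype(int))                    # broadleaf species 0-1

train = rng.random(n) < 0.7

# Level 1: broadleaf vs conifer.
lvl1 = KNeighborsClassifier(n_neighbors=5).fit(X[train], is_conifer[train])

# Level 2: one species classifier per tree type.
lvl2 = {t: SVC().fit(X[train & (is_conifer == t)], species[train & (is_conifer == t)])
        for t in (0, 1)}

# Hierarchical prediction: route each crown through level 1, then its level-2 model.
t_hat = lvl1.predict(X[~train])
s_hat = np.array([lvl2[t].predict(x.reshape(1, -1))[0] for t, x in zip(t_hat, X[~train])])
print("hierarchical OA: %.2f" % (s_hat == species[~train]).mean())
```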

Keywords: tree species, object-based, classification, multispectral, machine learning, WorldView-3, LiDAR

Procedia PDF Downloads 136
33940 Examining Statistical Monitoring Approach against Traditional Monitoring Techniques in Detecting Data Anomalies during Conduct of Clinical Trials

Authors: Sheikh Omar Sillah

Abstract:

Introduction: Monitoring is an important means of ensuring the smooth implementation and quality of clinical trials. For many years, traditional site monitoring approaches have been critical in detecting data errors, but they are not optimal for identifying fabricated and implanted data or non-random data distributions that may significantly invalidate study results. The objective of this paper was to provide recommendations, based on best statistical monitoring practices, for detecting data-integrity issues suggestive of fabrication and implantation early in the study conduct, to allow the implementation of meaningful corrective and preventive actions. Methodology: Electronic bibliographic databases (Medline, Embase, PubMed, Scopus, and Web of Science) were used for the literature search, and both qualitative and quantitative studies were sought. Search results were uploaded into the Eppi-Reviewer software, and only publications written in English from 2012 onwards were included in the review. Gray literature not considered to present reproducible methods was excluded. Results: A total of 18 peer-reviewed publications were included in the review. The publications demonstrated that traditional site monitoring techniques are not efficient in detecting data anomalies. By specifying project-specific parameters such as laboratory reference range values, visit schedules, etc., with appropriate interactive data monitoring, statistical monitoring can offer study teams early signals of data anomalies. The review further revealed that statistical monitoring is useful for identifying unusual data patterns that might reveal issues impacting data integrity or potentially impacting study participants' safety. However, subjective measures may not be good candidates for statistical monitoring. Conclusion: The statistical monitoring approach requires a combination of education, training, and experience sufficient to implement its principles in detecting data anomalies in the statistical aspects of a clinical trial.

Keywords: statistical monitoring, data anomalies, clinical trials, traditional monitoring

Procedia PDF Downloads 79
33939 GPS Refinement in Cities Using Statistical Approach

Authors: Ashwani Kumar

Abstract:

GPS plays an important role in everyday life for safe and convenient transportation. While pedestrians use hand-held devices to know their position in a city, vehicles in intelligent transport systems use relatively sophisticated GPS receivers to estimate their current position. However, in urban areas, where GPS satellites are occluded by tall buildings and trees and GPS signals are reflected off nearby vehicles, position estimation becomes poor. In this work, an exhaustive set of GPS data is collected at a single point in an urban area at different times of day and under dynamic environmental conditions. The data are analyzed, and statistical refinement methods are used to obtain an optimal position estimate from all the measured positions. The results are compared with publicly available datasets, and the position refinement results obtained are promising.
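
A minimal sketch of statistical refinement over repeated fixes at one point: a plain mean (the least-squares estimate) is compared with a median and an outlier-trimmed mean on synthetic fixes contaminated by multipath-like jumps. The noise model is an assumption for illustration, not the paper's collected data.

```python
import numpy as np

rng = np.random.default_rng(8)

# Repeated fixes at one surveyed point: Gaussian receiver noise plus occasional
# large multipath outliers (coordinates in metres from the true point).
fixes = rng.normal(0, 3, size=(500, 2))
outliers = rng.choice(500, 30, replace=False)
fixes[outliers] += rng.normal(40, 10, size=(30, 2))  # reflected-signal jumps

mean_est = fixes.mean(axis=0)          # least-squares estimate: hurt by outliers
median_est = np.median(fixes, axis=0)  # robust estimate

# Trimmed approach: discard fixes far from the median, then re-average.
dist = np.linalg.norm(fixes - median_est, axis=1)
kept = fixes[dist < 3 * np.median(dist)]
trimmed_est = kept.mean(axis=0)

for name, est in [("mean", mean_est), ("median", median_est), ("trimmed mean", trimmed_est)]:
    print(f"{name:>12}: error = {np.linalg.norm(est):.2f} m")
```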

Keywords: global positioning system, statistical approach, intelligent transport systems, least squares estimation

Procedia PDF Downloads 288
33938 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products

Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry

Abstract:

The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification. These obstacles hinder the diffusion of eco-design and LCA methods in the manufacturing sectors. This article addresses the research question: how can the LCA method be adapted so that it generalizes massively and its performance improves? This paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps. First, the literature was analyzed to identify existing automation methods; given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain methods and combining them with new ideas and improvements. Second, our development of automated construction (reconciliation and implementation of data) is presented. Finally, the LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating data mapping and hence product modeling. The method is also able to complete the LCA process on its own within minutes: the calculations and the LCA report are generated automatically. The tool developed has shown that automation by code is a viable solution for meeting LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and to meet regulatory requirements. Moreover, the case study also demonstrates the potential of the proposed method for a wide range of applications.

Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively

Procedia PDF Downloads 90
33937 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests

Authors: Julius Onyancha, Valentina Plekhanova

Abstract:

One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information related to their dynamic interests. Current research considers noise to be any data that does not form part of the main web page and proposes noise web data reduction tools that mainly focus on eliminating noise related to the content and layout of web data. This paper argues that not all data forming part of the main web page are of interest to a user, and not all noise data are actually noise to a given user. Therefore, learning the noise web data associated with user requests ensures not only a reduction of the noise level in a web user profile but also a decrease in the loss of useful information, hence improving the quality of the profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate its performance, an experimental design setup is presented, and the results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.

Keywords: web log data, web user profile, user interest, noise web data learning, machine learning

Procedia PDF Downloads 265
33936 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images

Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann

Abstract:

FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique to determine the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data processing part is still under development. In this paper, we formulate and solve the problem of data selection, which enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., data which barely reduce the confidence interval of the estimated parameters and thus could be neglected. Based on sensitivity analysis, we both solve the problem of optimal data space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem establishing the lower precision of the integrated data approach compared to the full data case is proven; i.e., we claim that the data set represented by the FRAP recovery curve leads to a larger confidence interval than the spatio-temporal (full) data.
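
The link between sensitivities and confidence intervals can be made concrete with the Fisher information matrix: for least-squares estimation, the parameter covariance is approximately sigma^2 (S^T S)^(-1), where S holds the model sensitivities. The sketch below uses an assumed single-exponential recovery model (not the paper's reaction-diffusion model) to show how a poorly chosen data subset barely constrains one parameter.

```python
import numpy as np

# Confidence-interval width from the sensitivity matrix (Fisher information),
# illustrating why some data barely shrink the interval. Model (a common
# simplification): single-exponential FRAP recovery f(t) = A * (1 - exp(-k*t)).
A, k, sigma = 1.0, 0.5, 0.02
t = np.linspace(0.1, 20, 100)

# Sensitivities of the model output with respect to each parameter (A, k).
S = np.column_stack([1 - np.exp(-k * t),           # df/dA
                     A * t * np.exp(-k * t)])       # df/dk

def ci_halfwidths(S):
    """1.96-sigma half-widths from the inverse Fisher information matrix."""
    cov = sigma**2 * np.linalg.inv(S.T @ S)
    return 1.96 * np.sqrt(np.diag(cov))

full = ci_halfwidths(S)
late_only = ci_halfwidths(S[t > 10])  # late plateau points: nearly irrelevant for k
print("CI half-widths (A, k), full data :", full)
print("CI half-widths (A, k), t > 10 s  :", late_only)
```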

Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design

Procedia PDF Downloads 278
33935 A Neural Network Approach to Evaluate Supplier Efficiency in a Supply Chain

Authors: Kishore K. Pochampally

Abstract:

The success of a supply chain heavily relies on the efficiency of the suppliers involved. In this paper, we propose a neural network approach to evaluate the efficiency of a supplier, which is being considered for inclusion in a supply chain, using the available linguistic (fuzzy) data of suppliers that already exist in the supply chain. The approach is carried out in three phases, as follows: In phase one, we identify criteria for evaluation of the supplier of interest. Then, in phase two, we use performance measures of already existing suppliers to construct a neural network that gives weights (importance values) of criteria identified in phase one. Finally, in phase three, we calculate the overall rating of the supplier of interest. The following are the major findings of the research conducted for this paper: (i) linguistic (fuzzy) ratings of suppliers such as 'good', 'bad', etc., can be converted (defuzzified) to numerical ratings (1 – 10 scale) using fuzzy logic so that those ratings can be used for further quantitative analysis; (ii) it is possible to construct and train a multi-level neural network in order to determine the weights of the criteria that are used to evaluate a supplier; and (iii) Borda’s rule can be used to group the weighted ratings and calculate the overall efficiency of the supplier.

Keywords: fuzzy data, neural network, supplier, supply chain

Procedia PDF Downloads 114
33934 A Time-Reducible Approach to Compute Determinant |I-X|

Authors: Wang Xingbo

Abstract:

Computation of determinants of the form |I-X| is primary and fundamental because it can help to compute many other determinants. This article puts forward a time-reducible approach to computing the determinant |I-X|. The approach is derived from Newton's identities, and its time complexity is no more than that of computing the eigenvalues of the square matrix X. Mathematical deductions and a numerical example are presented in detail for the approach. By comparison with classical approaches, the new approach is shown to be superior to the classical ones, and it naturally reduces the computational time as the efficiency of computing eigenvalues of the square matrix improves.
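
A didactic sketch of the Newton's-identities route: power sums p_k = tr(X^k) yield the elementary symmetric polynomials e_k of the eigenvalues, and det(I - X) = prod_i (1 - lambda_i) = sum_k (-1)^k e_k. Computing each matrix power explicitly, as below, is for clarity only; the paper's point is that no more than eigenvalue-level work is needed.

```python
import numpy as np

def det_I_minus_X(X):
    """det(I - X) from traces of powers of X via Newton's identities.

    Power sums p_k = tr(X^k) determine the elementary symmetric polynomials
    e_k of the eigenvalues through k*e_k = sum_{i=1..k} (-1)^(i-1) e_{k-i} p_i,
    and det(I - X) = sum_k (-1)^k e_k (with e_0 = 1).
    """
    n = X.shape[0]
    p = [np.trace(np.linalg.matrix_power(X, k)) for k in range(1, n + 1)]
    e = [1.0]  # e_0
    for k in range(1, n + 1):
        s = sum((-1) ** (i - 1) * e[k - i] * p[i - 1] for i in range(1, k + 1))
        e.append(s / k)
    return sum((-1) ** k * e[k] for k in range(n + 1))

X = np.array([[0.2, 0.5, 0.1],
              [0.0, 0.3, 0.4],
              [0.1, 0.0, 0.1]])
print(det_I_minus_X(X))                # Newton's-identities route
print(np.linalg.det(np.eye(3) - X))    # direct check: both print ~0.477
```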

Keywords: algorithm, determinant, computation, eigenvalue, time complexity

Procedia PDF Downloads 415
33933 Modified Naive Bayes-Based Prediction Modeling for Crop Yield Prediction

Authors: Kefaya Qaddoum

Abstract:

Most greenhouse growers desire a predetermined amount of yield in order to accurately meet market requirements. The purpose of this paper is to model a simple but often satisfactory supervised classification method. The original naive Bayes has a serious weakness: it retains redundant predictors. In this paper, a regularization technique is used to obtain a computationally efficient classifier based on naive Bayes. The suggested construction uses an L1 penalty and is capable of removing redundant predictors; a modification of the LARS algorithm is devised to solve the resulting problem, making the method applicable to a wide range of data. In the experimental section, a study is conducted to examine the effect of redundant and irrelevant predictors and to test the method on a WSG dataset of tomato yields, where there are many more predictors than observations and the goal is to predict weekly yield. Finally, the modified approach is compared with several naive Bayes variants and other classification algorithms (SVM and kNN) and is shown to be fairly good.

Keywords: tomato yield prediction, naive Bayes, redundancy, WSG

Procedia PDF Downloads 237
33932 Compartmental Model Approach for Dosimetric Calculations of ¹⁷⁷Lu-DOTATOC in Adenocarcinoma Breast Cancer Based on Animal Data

Authors: M. S. Mousavi-Daramoroudi, H. Yousefnia, S. Zolghadri, F. Abbasi-Davani

Abstract:

Dosimetry is an indispensable and precious factor in patient treatment planning, serving to minimize the absorbed dose in vital tissues. In this study, in accordance with the favorable characteristics of DOTATOC and ¹⁷⁷Lu, ¹⁷⁷Lu-DOTATOC was prepared under optimal conditions for the first time in Iran, and the radionuclidic and radiochemical purity of the solution was investigated using an HPGe spectrometer and the ITLC method, respectively. The biodistribution of the compound was assayed for the treatment of adenocarcinoma breast cancer in tumor-bearing BALB/c mice. The results demonstrated that ¹⁷⁷Lu-DOTATOC is a promising candidate for therapy of these tumors. Because of the vital role of internal dosimetry before and during therapy, efforts to improve the accuracy and rapidity of dosimetric calculations are necessary. For this reason, a new method was developed to calculate the absorbed dose by combining a compartmental model, animal dosimetry, and animal-to-human extrapolation within the MIRD method. Since the compartmental model is based on the experimental data, it seems this approach may confidently increase the accuracy of the dosimetric results.
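
A minimal sketch of the compartmental part of such a workflow: a two-compartment (blood/tumor) kinetic model is integrated over time, and the time-integrated activity, the "cumulated activity" that MIRD dose factors multiply, is read off. The rate constants are illustrative assumptions, not the paper's fitted values; only the ¹⁷⁷Lu half-life is a physical constant.

```python
import numpy as np
from scipy.integrate import odeint

# Two-compartment model (blood <-> tumor) for activity kinetics; illustrative rates.
k_bt, k_tb, k_el = 0.8, 0.2, 0.1      # blood->tumor, tumor->blood, elimination (1/h)
lam = np.log(2) / (6.647 * 24)        # physical decay of Lu-177 (half-life 6.647 d), 1/h

def kinetics(y, t):
    blood, tumor = y
    d_blood = -(k_bt + k_el + lam) * blood + k_tb * tumor
    d_tumor = k_bt * blood - (k_tb + lam) * tumor
    return [d_blood, d_tumor]

t = np.linspace(0, 168, 1000)               # one week, in hours
activity = odeint(kinetics, [1.0, 0.0], t)  # injected activity normalized to 1

# Time-integrated activity (MIRD "cumulated activity") in the tumor compartment,
# the quantity on which absorbed-dose calculations are built.
tia_tumor = np.trapz(activity[:, 1], t)
print("cumulated activity in tumor: %.2f (normalized units * h)" % tia_tumor)
```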

Keywords: ¹⁷⁷Lu-DOTATOC, biodistribution modeling, compartmental model, internal dosimetry

Procedia PDF Downloads 221
33931 Synthesis and Anti-Cancer Evaluation of Uranyle Complexes

Authors: Abdol-Hassan Doulah

Abstract:

In this research, some inorganic complexes of uranyl with N-donor ligands were synthesized. The complexes were characterized by FT-IR and UV spectra, ¹H NMR, ¹³C NMR, and some physical properties. The uranyl unit (UO2) is composed of a central uranium atom with charge +6 and two oxygen atoms forming two U=O double bonds. The structure is linear (O=U=O, 180°) and usually stable, so other ligands often coordinate to the U atom in the plane perpendicular to the O=U=O axis. The antitumor activity of some of the ligands and their complexes against a panel of human tumor cell lines (HT29: human colon adenocarcinoma cell line; T47D: human breast adenocarcinoma cell line) was determined by the MTT (3-[4,5-dimethylthiazol-2-yl]-2,5-diphenyl-tetrazolium bromide) assay. These data suggest that some of these compounds provide good models for the further design of potent antitumor compounds.

Keywords: inorganic, uranyl complex-donor ligands, Schiff bases, anticancer activity

Procedia PDF Downloads 454
33930 Thick Data Techniques for Identifying Abnormality in Video Frames for Wireless Capsule Endoscopy

Authors: Jinan Fiaidhi, Sabah Mohammed, Petros Zezos

Abstract:

Capsule endoscopy (CE) is an established noninvasive diagnostic modality for investigating small bowel disease. CE has a pivotal role in assessing patients with suspected bleeding or in identifying evidence of active Crohn's disease in the small bowel. However, CE produces lengthy videos with at least eighty thousand frames, at a rate of 2 frames per second. Gastroenterologists cannot dedicate 8 to 15 hours to reading the CE video frames to arrive at a diagnosis, which is why analyzing CE videos with modern artificial intelligence techniques becomes a necessity. However, machine learning, including deep learning, has failed to report robust results because of the lack of large samples to train its neural nets. In this paper, we describe a thick data approach that learns from a few anchor images. We use sound datasets such as KVASIR and CrohnIPI to filter candidate frames that include interesting anomalies in any CE video. Candidate frames are identified through feature extraction that provides representative measures of the anomaly, such as its size and its color contrast against the image background; these features are then fed to a decision tree that classifies the candidate frames as showing a condition such as Crohn's disease. Our thick data approach detected Crohn's disease, based on the presence of ulcer areas in the candidate frames, with an accuracy of 89.9% on KVASIR and 83.3% on CrohnIPI. We are continuing our research to fine-tune the approach by adding more thick data methods for enhancing diagnosis accuracy.
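
A minimal sketch of the feature-plus-decision-tree pipeline on synthetic frames: two hand-crafted features (anomaly size and color contrast against the background, as named above) feed a shallow decision tree. The redness heuristic and the synthetic frames are assumptions for illustration, not the paper's feature set.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(9)

def frame_features(frame):
    """Hand-crafted features per frame, in the spirit of the thick data approach:
    size of the anomalous (reddish) region and its color contrast to the
    background. `frame` is an H x W x 3 RGB array in [0, 1]."""
    redness = frame[:, :, 0] - frame[:, :, 1:].mean(axis=2)
    mask = redness > 0.15                      # crude candidate-anomaly mask
    size = mask.mean()                         # fraction of the frame covered
    contrast = redness[mask].mean() - redness[~mask].mean() if mask.any() else 0.0
    return [size, contrast]

def synthetic_frame(has_ulcer):
    frame = rng.uniform(0.3, 0.5, size=(64, 64, 3))
    if has_ulcer:                              # paint a small high-contrast red patch
        frame[20:32, 20:32, 0] += 0.4
    return np.clip(frame, 0, 1)

labels = rng.integers(0, 2, 200)
X = np.array([frame_features(synthetic_frame(y)) for y in labels])

clf = DecisionTreeClassifier(max_depth=3).fit(X[:150], labels[:150])
print("held-out accuracy:", (clf.predict(X[150:]) == labels[150:]).mean())
```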

Keywords: thick data analytics, capsule endoscopy, Crohn’s disease, siamese neural network, decision tree

Procedia PDF Downloads 156