Search results for: validation indexes
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1580

1580 Automatic Moment-Based Texture Segmentation

Authors: Tudor Barbu

Abstract:

An automatic moment-based texture segmentation approach is proposed in this paper. First, we review related work in this computer vision domain. Our texture feature extraction, the first stage of the texture recognition process, produces a set of moment-based feature vectors: for each image pixel, a texture feature vector is computed as a sequence of area moments. Second, an automatic pixel classification approach is proposed. The feature vectors are clustered with an unsupervised classification algorithm, and the optimal number of clusters is determined using a measure based on validation indexes. From the resulting pixel classes, the desired texture regions of the image are easily determined.
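As an illustration of the clustering step (not taken from the paper), the sketch below selects the number of clusters by maximizing a cluster validation index; the random feature matrix, the cluster range, and the choice of k-means with the silhouette index are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def cluster_with_validation_index(features, k_range=range(2, 11), seed=0):
    """Cluster pixel feature vectors and pick the cluster count that
    maximizes a validation index (here the silhouette score)."""
    best_k, best_score, best_labels = None, -1.0, None
    for k in k_range:
        labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(features)
        score = silhouette_score(features, labels)
        if score > best_score:
            best_k, best_score, best_labels = k, score, labels
    return best_k, best_labels

# toy example: 500 pixels, each described by 6 moment-based features
features = np.random.rand(500, 6)
k, labels = cluster_with_validation_index(features)
print(f"optimal number of clusters: {k}")
```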

Keywords: image segmentation, moment-based, texture analysis, automatic classification, validation indexes

Procedia PDF Downloads 392
1579 Morphotectonic Analysis of Burkh Anticline, North of Bastak, Zagros

Authors: A. Afroogh, R. Ramazani omali, N. Hafezi Moghaddas, A. Nohegar

Abstract:

The Burkh anticline, 50 km long and 9 km wide, is located 40 km north of Bastak in the internal Fars zone of the Zagros folded-thrust belt. In order to assess active tectonics in the study area, morphometric indexes such as the V index (V), the ratio of valley floor width to valley height (Vf), the stream length-gradient index (Sl), channel sinuosity (S), mountain front faceting (F%), and mountain front sinuosity (Smf) have been studied. These investigations show that tectonic activity is not uniform along the length of the Burkh anticline; the central part of the anticline is the most active.
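The sketch below shows how two of the listed indexes might be computed from field measurements; the formulas for Vf and Smf follow the definitions commonly cited in morphotectonic studies, and the input values are purely illustrative, not data from the Burkh anticline.

```python
def valley_floor_width_height_ratio(vfw, eld, erd, esc):
    """Vf = 2*Vfw / ((Eld - Esc) + (Erd - Esc)), where Vfw is the valley floor
    width, Eld/Erd are the elevations of the left/right divides, and Esc is the
    elevation of the valley floor."""
    return 2.0 * vfw / ((eld - esc) + (erd - esc))

def mountain_front_sinuosity(lmf, ls):
    """Smf = Lmf / Ls: length of the mountain front along its sinuous trace
    divided by the straight-line length of the front."""
    return lmf / ls

# illustrative values only
print(valley_floor_width_height_ratio(vfw=120.0, eld=910.0, erd=930.0, esc=700.0))
print(mountain_front_sinuosity(lmf=18.4, ls=14.2))
```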

Keywords: anticline, internal Fars zone, tectonics, morphometric indexes, folded-thrust belt

Procedia PDF Downloads 228
1578 Correlation between Seismic Risk Insurance Indexes and Uninhabitability Indexes of Buildings in Morocco

Authors: Nabil Mekaoui, Nacer Jabour, Abdelhamid Allaoui, Abderahim Oulidi

Abstract:

The reliability of several seismic risk insurance indexes is evaluated and compared for efficient seismic risk coverage of buildings in Morocco, thus reducing the basis risk. A large database of earthquake ground motions is established from recent seismic events in Morocco, together with synthetic ground motions compatible with the design spectrum, in order to conduct nonlinear time history analyses on three building models representative of the building stock in Morocco. The uninhabitability index is evaluated based on the simulated damage index and then correlated with preselected insurance indexes. Interestingly, the commonly used peak ground acceleration index showed poor correlation compared with other indexes, such as spectral accelerations at low periods. Recommendations on the choice of suitable insurance indexes are formulated for efficient seismic risk coverage in Morocco.

Keywords: catastrophe modeling, damage, earthquake, reinsurance, seismic hazard, trigger index, vulnerability

Procedia PDF Downloads 55
1577 Does the Level of Countries' Corruption Affect Firms' Working Capital Management?

Authors: Ebrahim Mansoori, Datin Joriah Muhammad

Abstract:

Recent studies in finance have focused on the effect of external variables on working capital management. This study investigates the effect of corruption indexes on firms' working capital management. A large data set covering 2005 to 2013 from five ASEAN countries, namely Malaysia, Indonesia, Singapore, Thailand, and the Philippines, was selected to investigate how the level of corruption in these countries affects working capital management. The results of the panel data analysis, including fixed-effect estimations, showed that a high level of country corruption indexes encourages managers to shorten the length of the cash conversion cycle (CCC). Meanwhile, managers reduce the level of investment in cash and cash equivalents when the levels of the corruption indexes increase. Therefore, an increase in a country's corruption indexes encourages managers to select conservative working capital strategies by reducing the level of the net liquidity balance (NLB).
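A minimal sketch of the kind of fixed-effects panel estimation described above, assuming the linearmodels package and hypothetical column and file names; it is not the authors' actual specification.

```python
import pandas as pd
from linearmodels.panel import PanelOLS

# panel: one row per firm-year, indexed by (firm, year)
df = pd.read_csv("asean_firms.csv")                 # hypothetical file
df = df.set_index(["firm_id", "year"])

# cash conversion cycle regressed on the country corruption index plus controls
y = df["ccc"]
X = df[["corruption_index", "firm_size", "leverage", "sales_growth"]]

model = PanelOLS(y, X, entity_effects=True, time_effects=True)
results = model.fit(cov_type="clustered", cluster_entity=True)
print(results.summary)
```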

Keywords: ASEAN, corruption indexes, panel data analysis, working capital management

Procedia PDF Downloads 412
1576 Determination of Relationship among Shape Indexes Used for Land Consolidation

Authors: Firat Arslan, Hasan Degirmenci, Serife Tulin Akkaya Aslan

Abstract:

The aim of the current study was to determine the relationship among the shape indexes that researchers in many fields use to evaluate parcel shape, which is very important for farming even though these indexes are controversial. The land consolidation project of Halitaga village in Mersin province, Turkey, which comprises 278 parcels and covers 894.4 ha, was taken as the study material. Indicators commonly used in land consolidation, such as fractal dimension (FD), shape index (SI), form factor (FORM), areal form factor (AFF), and two distinct area-perimeter ratios (APR-1 and APR-2), were used to measure the shape of the agricultural plots. FD was positively correlated with SI, APR-1, and APR-2, whereas it was negatively correlated with FORM and AFF. SI was positively correlated with APR-1 and APR-2, whereas it was negatively correlated with FORM and AFF. In conclusion, these indexes may be used interchangeably because of the high correlations among them.
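To make the correlation analysis concrete, the sketch below computes a Pearson correlation matrix over hypothetical per-parcel index columns; the file and column names are assumptions, not the Halitaga data.

```python
import pandas as pd

# one row per parcel; columns hold the shape indexes computed for that parcel
parcels = pd.read_csv("halitaga_parcels.csv")        # hypothetical file
index_cols = ["FD", "SI", "FORM", "AFF", "APR1", "APR2"]

corr = parcels[index_cols].corr(method="pearson")
print(corr.round(3))

# pairs with |r| above 0.9 are candidates for interchangeable use
strong = corr.where(corr.abs() > 0.9).stack()
print(strong[strong.index.get_level_values(0) != strong.index.get_level_values(1)])
```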

Keywords: GIS, land consolidation, parcel shape, shape index

Procedia PDF Downloads 163
1575 A Validation Technique for Integrated Ontologies

Authors: Neli P. Zlatareva

Abstract:

Ontology validation is an important part of web applications’ development, where knowledge integration and ontological reasoning play a fundamental role. It aims to ensure the consistency and correctness of ontological knowledge and to guarantee that ontological reasoning is carried out in a meaningful way. Existing approaches to ontology validation address more or less specific validation issues, but the overall process of validating web ontologies has not been formally established yet. As the size and the number of web ontologies continue to grow, the necessity to validate and ensure their consistency and interoperability is becoming increasingly important. This paper presents a validation technique intended to test the consistency of independent ontologies utilized by a common application.

Keywords: knowledge engineering, ontological reasoning, ontology validation, semantic web

Procedia PDF Downloads 301
1574 Development of a Predictive Model to Prevent Financial Crisis

Authors: Tengqin Han

Abstract:

Delinquency has been a crucial factor in economics throughout the years. Commonly seen in credit cards and mortgages, it played one of the crucial roles in causing the most recent financial crisis, in 2008. In each case, a delinquency is a sign that the borrower is unable to pay off the debt, and it may thus lead to a loss of property in the end. Individually, one case of delinquency seems unimportant compared to the entire credit system. In China, an emerging economic entity, national and economic strength have grown rapidly, and the gross domestic product (GDP) growth rate has remained as high as 8% over the past decades. However, potential risks exist behind the appearance of prosperity, and among them, the credit system is the most significant. Because of the long term and large balance of mortgages, it is critical to monitor the risk during the performance period. In this project, about 300,000 mortgage account records are analyzed in order to develop a predictive model for the probability of delinquency. Through univariate analysis the data are cleaned, and through bivariate analysis the variables with strong predictive power are detected. The project is divided into two parts. In the first part, the 2005 analysis data are split into two parts, 60% for model development and 40% for in-time model validation. The KS of model development is 31, and the KS of in-time validation is 31, indicating that the model is stable. In addition, the model is further validated by out-of-time validation, which uses 40% of the 2006 data and gives a KS of 33. This indicates that the model is still stable and robust. In the second part, the model is improved by the addition of macroeconomic indexes, including GDP, consumer price index, unemployment rate, inflation rate, etc. The data from 2005 to 2010 are used for model development and validation. Compared with the base model (without macroeconomic variables), KS is increased from 41 to 44, indicating that the macroeconomic variables can be used to improve the separation power of the model and make the prediction more accurate.
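For readers unfamiliar with the KS measure used above, the sketch below computes the Kolmogorov-Smirnov separation between the score distributions of delinquent and non-delinquent accounts on a 60/40 development/validation split; the scores, flags, and column layout are hypothetical, not the project's data.

```python
import numpy as np
from scipy.stats import ks_2samp
from sklearn.model_selection import train_test_split

def ks_statistic(scores, is_delinquent):
    """KS = maximum distance between the score CDFs of delinquent and
    non-delinquent accounts, reported on the usual 0-100 scale."""
    bad = scores[is_delinquent == 1]
    good = scores[is_delinquent == 0]
    return 100 * ks_2samp(bad, good).statistic

# hypothetical model scores and delinquency flags for 300,000 accounts
rng = np.random.default_rng(0)
flags = rng.integers(0, 2, size=300_000)
scores = rng.normal(loc=flags * 0.4, scale=1.0, size=300_000)

s_dev, s_val, f_dev, f_val = train_test_split(scores, flags, test_size=0.4, random_state=1)
print(f"KS (development): {ks_statistic(s_dev, f_dev):.1f}")
print(f"KS (in-time validation): {ks_statistic(s_val, f_val):.1f}")
```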

Keywords: delinquency, mortgage, model development, model validation

Procedia PDF Downloads 206
1573 Invention of Novel Technique of Process Scale Up by Using Solid Dosage Form

Authors: Shashank Tiwari, S. P. Mahapatra

Abstract:

The aim of this technique is to reduce the steps of process scale-up and to save industries time and cost. The new steps are: Novel Lab Scale, Novel Lab Scale Trials, Novel Trial Batches, Novel Exhibit Batches, and Novel Validation Batches. In these steps, validation is not divided into three separate batches; instead, the data of the trial, exhibit, and validation batches are used and compiled for production and for validation. The technique also increases the batch sizes of the trial and exhibit batches: the new size of the trial batches is not less than fifty thousand units, the exhibit batches increase up to two lakh (200,000), and the validation batches up to five lakh (500,000). After preparing the batches, their data and drug product are used for stability studies, the validation record is maintained, and the data are compiled for technology transfer to the production department for preparing marketed-size batches.

Keywords: batches, technique, preparation, scale up, validation

Procedia PDF Downloads 333
1572 The Assessment of the Comparative Efficiency of Reforms through the Integral Index of Transformation

Authors: Samson Davoyan, Ashot Davoyan, Ani Khachatryan

Abstract:

The indexes developed by different international and non-governmental organizations (the Global Competitiveness Index, the Economic Freedom Index, the Human Development Index, etc.) express, across time and space, the quantitative and qualitative features of the various reforms implemented in different countries. The main objective of our research is to develop a new methodology for creating an integral index that is based on many indexes and covers many areas of reform. To achieve this aim, we have used econometric methods (a regression model for panel data). The basis of our methodology is the construction of a new integral index from the quantitative assessment of the change of two main parameters: the scores of countries on the different indexes and the change in country ranks over two consecutive periods. As a result of the analysis, we have defined the indexes used to create the new integral index and the scales for each of them. By analyzing the integral index quantitatively and qualitatively for more than 100 countries for 2009-2014, we have defined comparative efficiency, which helps to conclude in which directions countries have implemented reforms more effectively than others and in which directions reforms have been implemented less efficiently.

Keywords: development, rank, reforms, comparative, index, economic, corruption, social, program

Procedia PDF Downloads 306
1571 Efficient Model Selection in Linear and Non-Linear Quantile Regression by Cross-Validation

Authors: Yoonsuh Jung, Steven N. MacEachern

Abstract:

The check loss function is used to define quantile regression. In cross-validation, it is also employed as a validation function when the underlying truth is unknown. However, our empirical study indicates that validation with the check loss often leads to choosing overfitted models. In this work, we suggest a modified, L2-adjusted check loss, which rounds the sharp corner in the middle of the check loss. To some extent, it has a large effect in guarding against overfitted models. Through various simulation settings of linear and non-linear regression, the improvement of the check loss by the L2 adjustment is empirically examined. The adjustment is devised to shrink to zero as the sample size grows.
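The sketch below gives the standard check loss and one plausible form of an L2-adjusted check loss that replaces the sharp corner at zero with a quadratic on a small window; the exact adjustment used by the authors may differ, and the window width delta here is an assumption.

```python
import numpy as np

def check_loss(u, tau):
    """Standard quantile-regression check loss: rho_tau(u) = u * (tau - 1{u < 0})."""
    u = np.asarray(u, dtype=float)
    return u * (tau - (u < 0))

def l2_adjusted_check_loss(u, tau, delta=0.1):
    """Check loss with the corner at zero replaced by a quadratic on |u| <= delta.
    The quadratic u^2/(4*delta) + (tau - 0.5)*u + delta/4 matches the linear
    pieces in value and slope at u = +/- delta; delta is meant to shrink to
    zero as the sample size grows."""
    u = np.asarray(u, dtype=float)
    quad = u**2 / (4 * delta) + (tau - 0.5) * u + delta / 4
    return np.where(np.abs(u) <= delta, quad, check_loss(u, tau))

# cross-validation would average this loss over held-out residuals
residuals = np.array([-0.5, -0.05, 0.0, 0.02, 0.3])
print(l2_adjusted_check_loss(residuals, tau=0.75))
```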

Keywords: cross-validation, model selection, quantile regression, tuning parameter selection

Procedia PDF Downloads 413
1570 Modeling of Sediment Yield and Streamflow of Watershed Basin in the Philippines Using the Soil Water Assessment Tool Model for Watershed Sustainability

Authors: Warda L. Panondi, Norihiro Izumi

Abstract:

Sedimentation is a significant threat to the sustainability of reservoirs and their watersheds. In the Philippines, the Pulangi watershed experienced high sediment loss, mainly due to land conversions and plantations, with critical erosion rates beyond the tolerable limit of 10 ton/ha/yr in all of its sub-basins. Given this, predicting runoff volume and sediment yield is essential for realistically examining the country's soil conservation techniques. In this research, the Pulangi watershed was modeled using the Soil and Water Assessment Tool (SWAT) to predict the basin's annual runoff and sediment yield. SWAT-CUP was utilized for the calibration and validation of the model. The model was calibrated with monthly discharge data for 1990-1993 and validated for 1994-1997, while the sediment yield was calibrated for 2014 and validated for 2015 because of the limited observed datasets. Uncertainty analysis and calculation of efficiency indexes were accomplished through the SUFI-2 algorithm. According to the coefficient of determination (R2), Nash-Sutcliffe efficiency (NSE), Kling-Gupta efficiency (KGE), and PBIAS, the streamflow simulation shows good performance for both the calibration and validation periods, while the sediment yield shows satisfactory performance for both calibration and validation. Therefore, this study was able to identify the most critical sub-basins and the most severe needs for soil conservation. Furthermore, this study provides baseline information to help prevent floods and landslides and serves as a useful reference for land-use policies and watershed management and sustainability in the Pulangi watershed.
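The efficiency indexes named above have standard definitions; the sketch below implements them under those standard formulas, which are assumptions insofar as the paper may use variants, and the discharge values are purely illustrative.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias: positive values indicate underestimation by the model."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100 * np.sum(obs - sim) / np.sum(obs)

def kge(obs, sim):
    """Kling-Gupta efficiency from correlation, variability, and bias ratios."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

# monthly observed vs. simulated discharge (illustrative values)
obs = np.array([12.0, 30.5, 44.1, 25.0, 18.3, 9.7])
sim = np.array([10.8, 28.0, 47.2, 23.5, 20.1, 8.9])
print(nse(obs, sim), pbias(obs, sim), kge(obs, sim))
```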

Keywords: Pulangi watershed, sediment yield, streamflow, SWAT model

Procedia PDF Downloads 180
1569 Consolidation Behavior of Lebanese Soil and Its Correlation with the Soil Parameters

Authors: Robert G. Nini

Abstract:

Soil consolidation is one of the biggest problems facing engineers. The consolidation process plays an important role in settlement analysis for embankments and footings resting on clayey soils, and the settlement amount is related to the compression and swelling indexes of the soil. Because the predominant upper soil layer in Lebanon consists mainly of clay, this layer is a real challenge for structural and highway engineering. To determine the effect of load and drainage on the engineering consolidation characteristics of Lebanese soil, a full experimental and synthesis study was conducted on soil samples collected from many locations. This study consists of two parts. In the first, experimental part, the Proctor test and the consolidation test were performed on the collected soil samples. Afterwards, identification soil tests such as hydrometer analysis, specific gravity, and Atterberg limits were carried out. The consolidation test, the main test in this research, is performed by loading the soil for several days and then applying an unloading cycle; it takes two weeks to complete a typical consolidation test. For these reasons, in the second part of the research, which is based on the analysis of the experimental results, correlations were found between the main consolidation parameters, the compression and swelling indexes, and other soil parameters that are easy to calculate. The results show that the compression and swelling indexes of Lebanese clays may be roughly estimated using a model involving one or two variables in the form of the natural void ratio and the Atterberg limits. These correlations are of increasing importance for site engineers, and the proposed model also seems to be applicable to a wide range of clays worldwide.
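As a sketch of the kind of one- or two-variable correlation described above, the code below fits the compression index against the natural void ratio and liquid limit by ordinary least squares; the file, column names, and fitted form are assumptions, not the paper's model.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

samples = pd.read_csv("lebanese_clays.csv")          # hypothetical file
X = samples[["void_ratio", "liquid_limit"]].to_numpy()
y = samples["compression_index"].to_numpy()

model = LinearRegression().fit(X, y)
print("Cc ~ {:.3f} * e0 + {:.4f} * LL + {:.3f}".format(*model.coef_, model.intercept_))
print("R^2 =", round(model.score(X, y), 3))
```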

Keywords: atterberg limits, clay, compression and swelling indexes, settlement, soil consolidation

Procedia PDF Downloads 113
1568 Moving towards a General Definition of Public Happiness: A Grounded Theory Approach to the Recent Academic Research on Well-Being

Authors: Cristina Sanchez-Sanchez

Abstract:

Although there seems to be a growing interest in the study of citizens' happiness as an alternative to GDP for measuring a country's progress, happiness as a public concern is still an ambiguous concept that is hard to define. Moreover, different notions are used indiscriminately to talk about the same thing. This investigation aims to determine the conceptions of happiness, well-being, and quality of life that originate from the indexes that different governments and public institutions around the world have created to study them. Through the Scoping Review method, this study identifies the recent academic research in this field (a total of 267 documents between 2006 and 2016) from some of the most popular social science databases around the world, Web of Science, Scopus, JSTOR, Sage, EBSCO, IBSS, and Google Scholar, and, in Spain, ISOC and Dialnet. These 267 documents referenced 53 different indexes and studies. The Grounded Theory method has been applied to a sample of 13 indexes in order to identify the main categories they use to determine these three concepts. The results show that these are multidimensional concepts and that similar indicators are used indistinctly to measure happiness, well-being, and quality of life.

Keywords: common good, grounded theory, happiness economics, happiness index, quality of life, scoping review, well-being

Procedia PDF Downloads 257
1567 Preliminary Study of Standardization and Validation of the Micronuclei Technique to Assess DNA Damage Caused by X-Rays

Authors: L. J. Díaz, M. A. Hernández, A. K. Molina, A. Bermúdez, C. Crane, V. M. Pabón

Abstract:

One of the most important biological indicators of exposure to radiation is the micronucleus (MN). The technique is used to determine radiation effects in blood cultures as a biological control and as a complement to physical dosimetry. In Colombia, the need to apply this analysis has emerged because the biological indicator currently most used is chromosomal aberrations (CA); that is why standardization and validation of the MN technique are essential to provide enough tools to improve radioprotection in the country. In addition, the technique will be applied to the construction of a dose-response curve that allows an approximate dose to be estimated for irradiated people according to the MN frequency found. Among the steps carried out to accomplish the standardization and validation is the statistical analysis of readings of in vitro peripheral blood cultures by different analysts; the best culture medium and conditions for easy MN detection were also determined.

Keywords: micronuclei, radioprotection, standardization, validation

Procedia PDF Downloads 474
1566 Experimental Verification of the Relationship between Physiological Indexes and the Presence or Absence of an Operation during E-learning

Authors: Masaki Omata, Shumma Hosokawa

Abstract:

An experiment to verify the relationships between physiological indexes of an e-learner and the presence or absence of an operation during e-learning is described. Electroencephalogram (EEG), hemoencephalography (HEG), skin conductance (SC), and blood volume pulse (BVP) values were measured while participants performed experimental learning tasks. The results show that there are significant differences between the SC values when reading with clicking on learning materials and the SC values when reading without clicking, and between the HEG ratio when reading (with and without clicking) and the HEG ratio when resting for four of five participants. We conclude that the SC signals can be used to estimate whether or not a learner is performing an active task and that the HEG ratios can be used to estimate whether a learner is learning.
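A minimal sketch of the kind of within-subject comparison reported above, assuming per-participant mean skin conductance values in the two reading conditions; the paper's exact statistical test is not specified here, so a paired t-test and made-up values are used purely as an illustration.

```python
import numpy as np
from scipy.stats import ttest_rel

# mean SC per participant while reading with clicking vs. without clicking
sc_with_click = np.array([4.1, 3.8, 5.2, 4.6, 4.9])      # illustrative values
sc_without_click = np.array([3.6, 3.5, 4.7, 4.2, 4.4])

stat, p_value = ttest_rel(sc_with_click, sc_without_click)
print(f"t = {stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("SC differs significantly between the two reading conditions")
```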

Keywords: e-learning, physiological index, physiological signal, state of learning

Procedia PDF Downloads 362
1565 Image Processing and Calculation of NGRDI on a Raspberry Pi Embedded System

Authors: Efren Lopez Jimenez, Maria Isabel Cajero, J. Irving-Vasqueza

Abstract:

The use and processing of digital images have opened up new opportunities for solving problems of various kinds, such as the calculation of different vegetation indexes, which, among other things, differentiate healthy vegetation from humid vegetation. However, obtaining the images from which these indexes are calculated is still the subject of active research. In the present work, we propose to acquire these images using a low-cost embedded system (Raspberry Pi) and to process them with the open-source library OpenCV in order to obtain the Normalized Red-Green Difference Index (NGRDI).
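The sketch below computes NGRDI = (G - R) / (G + R) per pixel with OpenCV, as could be run on a Raspberry Pi; the input file name is an assumption, and the output scaling is one possible choice.

```python
import cv2
import numpy as np

image = cv2.imread("field.jpg")                    # hypothetical capture; OpenCV loads BGR
blue, green, red = cv2.split(image.astype(np.float32))

# NGRDI = (G - R) / (G + R); a small epsilon avoids division by zero
ngrdi = (green - red) / (green + red + 1e-6)

print("NGRDI range:", float(ngrdi.min()), "to", float(ngrdi.max()))
# rescale from [-1, 1] to [0, 255] for saving as an 8-bit image
cv2.imwrite("ngrdi.png", ((ngrdi + 1.0) * 127.5).astype(np.uint8))
```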

Keywords: Raspberry Pi, vegetation index, Normalized Red-Green Difference Index (NGRDI), OpenCV

Procedia PDF Downloads 262
1564 Characterization of 3D-MRP for Analyzing of Brain Balancing Index (BBI) Pattern

Authors: N. Fuad, M. N. Taib, R. Jailani, M. E. Marwan

Abstract:

This paper discusses power spectral density (PSD) characteristics extracted from three-dimensional (3D) electroencephalogram (EEG) models. The EEG signal recording was conducted on 150 healthy subjects. Development of the 3D EEG models involves pre-processing of the raw EEG signals and construction of spectrogram images. Then, the maximum PSD values were extracted as features from the model. These features are analysed using the mean relative power (MRP) and different mean relative power (DMRP) techniques to observe the pattern among different brain balancing indexes. The results showed that by implementing these techniques, the pattern of brain balancing indexes can be clearly observed. Some patterns are indicated between index 1 and index 5 for the left frontal (LF) and right frontal (RF) regions.
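The sketch below illustrates extracting a PSD with Welch's method and computing relative power in conventional EEG bands for left and right frontal channels; the sampling rate, band limits, and random signals are assumptions, not the authors' pipeline.

```python
import numpy as np
from scipy.signal import welch

fs = 256                                           # sampling rate in Hz (assumed)
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def relative_band_power(signal, fs, bands):
    """Welch PSD, then power in each band as a fraction of total power."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    total = psd.sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in bands.items()}

# two hypothetical 60-second frontal channels
lf = np.random.randn(fs * 60)
rf = np.random.randn(fs * 60)
print("LF:", relative_band_power(lf, fs, bands))
print("RF:", relative_band_power(rf, fs, bands))
```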

Keywords: power spectral density, 3D EEG model, brain balancing, mean relative power, different mean relative power

Procedia PDF Downloads 451
1563 Analysis of Expression Data Using Unsupervised Techniques

Authors: M. A. I Perera, C. R. Wijesinghe, A. R. Weerasinghe

Abstract:

This study was conducted to review and identify the unsupervised techniques that can be employed to analyze gene expression data in order to identify better subtypes of tumors. Identifying subtypes of cancer helps in improving the efficacy and reducing the toxicity of treatments by providing clues for finding targeted therapeutics. The process of gene expression data analysis is described in three steps: preprocessing, clustering, and cluster validation. Feature selection is important since genomic data are high dimensional, with a large number of features compared to samples. Hierarchical clustering and k-means are often used in the analysis of gene expression data. Several cluster validation techniques are used for validating the clusters. Heatmaps are an effective external validation method that allows the identified classes to be compared with clinical variables and supports visual analysis of the classes.
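As an illustration of heatmap-based inspection of clustered expression data, the sketch below uses seaborn's clustermap on standardized data; the random matrix merely stands in for a real expression matrix, and the linkage method and distance metric are assumptions.

```python
import numpy as np
import pandas as pd
import seaborn as sns
from sklearn.preprocessing import StandardScaler

# expr: genes x samples matrix; random data stands in for real expression values
rng = np.random.default_rng(0)
expr = pd.DataFrame(rng.normal(size=(200, 30)),
                    index=[f"gene_{i}" for i in range(200)],
                    columns=[f"sample_{j}" for j in range(30)])

# standardize each gene across samples, then cluster rows and columns hierarchically
scaled = pd.DataFrame(StandardScaler().fit_transform(expr.T).T,
                      index=expr.index, columns=expr.columns)
g = sns.clustermap(scaled, method="average", metric="euclidean", cmap="coolwarm")
g.savefig("expression_clustermap.png")
```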

Keywords: cancer subtypes, gene expression data analysis, clustering, cluster validation

Procedia PDF Downloads 127
1562 Comparative Study of Urban Structure between an Island-Type and a General-Type City

Authors: Tomoya Oshiro, Hiroko Ono

Abstract:

Japan's population is aging due to the decrease in the birthrate, which causes various problems such as a decrease in the country's gross domestic product. This is why local governments in Japan have recently been moving toward sustainable cities, and controlling the urban structure is essential to make the compact city successful. There are many papers about the compact city; however, papers about compact cities of the island type are few. The purpose of this study is to clarify the differences in urban structure between island-type and general-type cities. The method used in this research has two steps. First, using the evaluation indexes in the handbook, we evaluated the urban structures of cities in the same population class, from 50,000 to 100,000 people. Next, to clarify the differences in urban structure and features between island-type and general-type cities, we compared radar charts composed of the evaluation indexes of urban structure. Moreover, in order to clarify the relationship between the evaluation indexes and the place of residence, GIS software was used to display population density on a map. As a result, the evaluation indexes for local government management and the local economy prove to be weak points of island-type cities in comparison with general cities, whereas the indexes for safety/security and low-carbon/energy prove to be strong points. In addition, the public transportation coverage of Miyako Island, Sado Island, and Amakusa Island shows low values compared with other islands and with the average. The relationship between the evaluation indexes of urban structure and the place of residence shows that the place of residence is related to public transportation coverage: if the place of residence is spread out, public transportation coverage decreases. The results of this research reveal that the finances of island-type cities are a weak point compared to general cities, a problem caused by the declining population. Furthermore, increasing public transportation coverage requires considerable money, which may cause other problems, since the financial aspect is affected as well. The conclusion of this research is that, to create compact cities in island-type cities, the problems of local government management and the local economy need to be addressed first.
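A minimal matplotlib sketch of the radar-chart comparison described above; the index names and scores are invented for illustration and are not the handbook values.

```python
import numpy as np
import matplotlib.pyplot as plt

indexes = ["local economy", "local government", "safety/security",
           "low-carbon/energy", "public transport coverage"]
island = [0.42, 0.38, 0.71, 0.66, 0.35]   # hypothetical normalized scores
general = [0.58, 0.55, 0.60, 0.52, 0.57]

angles = np.linspace(0, 2 * np.pi, len(indexes), endpoint=False).tolist()
angles += angles[:1]  # close the polygon

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for label, scores in [("island-type", island), ("general-type", general)]:
    vals = scores + scores[:1]
    ax.plot(angles, vals, label=label)
    ax.fill(angles, vals, alpha=0.15)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(indexes)
ax.legend(loc="upper right")
plt.show()
```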

Keywords: sustainable city, comparative analysis, geographic information system, urban structure

Procedia PDF Downloads 126
1561 Determination of Design Variables and Target Reliability Indexes of Underground Structures

Authors: Yo-Seph Byun, Gyu-Phil Lee, Young-Bin Park, Gye-Chun Cho, Seong-Won Lee

Abstract:

In Korea, a study on Limit State Design (LSD) for underground structures is being conducted in order to perform more effective design. In this study, failure probabilities of the structure under normal and earthquake conditions are estimated by reliability analysis using the Monte Carlo simulation (MCS) technique. Target reliability indexes are determined depending on the load combinations for the underground structure, and then design variables such as the load and material factors in LSD are decided. As a result of this research on determining more reliable design variables, an LSD specification for underground structures can be developed.
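The sketch below shows the generic MCS estimate of a failure probability and the corresponding reliability index beta = -Phi^{-1}(pf) for a simple resistance-minus-load limit state; the distributions and parameters are illustrative assumptions, not the study's structural model.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 1_000_000

# illustrative limit state g = R - S with lognormal resistance and normal load
resistance = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)
load = rng.normal(loc=200.0, scale=30.0, size=n)
g = resistance - load

pf = np.mean(g <= 0)                       # Monte Carlo failure probability
beta = -norm.ppf(pf)                       # corresponding reliability index
print(f"pf = {pf:.2e}, beta = {beta:.2f}")
```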

Keywords: design variable, limit state design, target reliability index, underground structure

Procedia PDF Downloads 260
1560 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)

Authors: Gule Teri

Abstract:

The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variations within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach distinctly measures variability with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. This method provides a systematic, statistically grounded validation technique that improves the truthfulness of results. It offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this innovative method, pharmaceutical manufacturers can substantially advance their validation processes, subsequently improving the overall quality and safety of their products. This paper delves deeper into the development, application, and advantages of this tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. This paper also discusses the potential implications and future applications of this method in enhancing pharmaceutical manufacturing practices and outcomes.

Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing

Procedia PDF Downloads 51
1559 Effect of Sustainability Accounting Disclosure on Financial Performance of Listed Brewery Firms in Nigeria

Authors: Patricia Chinyere Oranefo

Abstract:

This study examined the effect of sustainability accounting disclosure on the financial performance of listed brewery firms in Nigeria. The dearth of empirical evidence and literature on "governance disclosure" as one of the explanatory variables of sustainability accounting reporting was the major motivation for this study. The main objective was to ascertain the effect of sustainability accounting disclosure on the financial performance of listed brewery firms in Nigeria. An ex-post facto research design was adopted for the study. The population of this study comprises five (5) brewery firms quoted on the floor of the Nigerian exchange group (NSX), and a sample of four (4) listed firms was drawn using the purposive sampling method. Secondary data were carefully sourced from the financial statements/annual reports and sustainability reports of the brewery firms quoted on the Nigerian exchange group (NSX) from 2012 to 2021. Panel regression analysis with the aid of E-Views 10.0 software was used to test the statistical significance of the effect of sustainability accounting disclosure on the financial performance of listed brewery firms in Nigeria. The results showed that economic sustainability disclosure indexes do not significantly affect the return on assets of listed brewery firms in Nigeria. The findings further revealed that environmental sustainability disclosure indexes do not significantly affect the return on equity of listed brewery firms in Nigeria. Moreover, the results showed that social sustainability disclosure indexes significantly affect the net profit margin of listed brewery firms in Nigeria. Finally, the results also established that governance sustainability disclosure indexes do not significantly affect the earnings per share of listed brewery firms in Nigeria. Consequent upon the findings, this study recommended, among others, that managers of brewery firms in Nigeria should improve and sustain full disclosure practices on economic, environmental, social, and governance disclosures, following the guidelines of the Global Reporting Initiative (GRI), as these are capable of exerting a significant effect on the financial performance of firms in Nigeria.

Keywords: sustainability, accounting, disclosure, financial performance

Procedia PDF Downloads 36
1558 The Detection of Implanted Radioactive Seeds on Ultrasound Images Using Convolutional Neural Networks

Authors: Edward Holupka, John Rossman, Tye Morancy, Joseph Aronovitz, Irving Kaplan

Abstract:

A common modality for the treatment of early stage prostate cancer is the implantation of radioactive seeds directly into the prostate. The radioactive seeds are positioned inside the prostate to achieve optimal radiation dose coverage of the prostate, and they are placed under transrectal ultrasound imaging. Once all of the planned seeds have been implanted, two-dimensional transaxial transrectal ultrasound images separated by 2 mm are obtained throughout the prostate, beginning at the base of the prostate up to and including the apex. A common deep neural network, called DetectNet, was trained to automatically determine the position of the implanted radioactive seeds within the prostate under ultrasound imaging. The network was trained using 950 training ultrasound images and 90 validation ultrasound images. The commonly used metrics for successful training were used to evaluate the efficacy and accuracy of the trained deep neural network and resulted in loss_bbox (train) = 0.00, loss_coverage (train) = 1.89e-8, loss_bbox (validation) = 11.84, loss_coverage (validation) = 9.70, mAP (validation) = 66.87%, precision (validation) = 81.07%, and recall (validation) = 82.29%, where train refers to the training image set and validation refers to the validation image set. On the hardware platform used, the training expended 12.8 seconds per epoch, and the network was trained for over 10,000 epochs. In addition, the seed locations determined by the deep neural network were compared to the seed locations determined by commercial software based on a CT taken one to three months after the implant. The deep learning approach was within 2.29 mm of the seed locations determined by the commercial software. The deep learning approach to the determination of radioactive seed locations is robust, accurate, and fast, and it is well within spatial agreement with the gold standard of CT-determined seed coordinates.
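The sketch below shows one way to compute precision and recall for detected seed positions by greedily matching detections to ground-truth seeds within a distance tolerance; the 2.29 mm tolerance is borrowed from the reported agreement, and the matching rule and toy coordinates are assumptions, not DetectNet's internal metric.

```python
import numpy as np

def match_detections(detected, truth, tol_mm=2.29):
    """Greedy nearest-neighbour matching of detected seed centroids to
    ground-truth seeds; each truth seed can be matched at most once."""
    detected, truth = np.asarray(detected, float), np.asarray(truth, float)
    unmatched = list(range(len(truth)))
    tp = 0
    for d in detected:
        if not unmatched:
            break
        dists = np.linalg.norm(truth[unmatched] - d, axis=1)
        j = int(np.argmin(dists))
        if dists[j] <= tol_mm:
            tp += 1
            unmatched.pop(j)
    fp = len(detected) - tp
    fn = len(truth) - tp
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# toy 3D seed coordinates in millimetres
truth = [[0, 0, 0], [10, 0, 2], [5, 5, 4]]
detected = [[0.5, 0.2, 0.1], [10.3, -0.4, 2.2], [20, 20, 20]]
print(match_detections(detected, truth))
```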

Keywords: prostate, deep neural network, seed implant, ultrasound

Procedia PDF Downloads 172
1557 Compressed Suffix Arrays to Self-Indexes Based on Partitioned Elias-Fano

Authors: Guo Wenyu, Qu Youli

Abstract:

A practical and simple self-indexing data structure, the Partitioned Elias-Fano (PEF) Compressed Suffix Array (CSA), is built in linear time for the CSA based on PEF indexes. The PEF-CSA is compared with two classical compressed indexing methods, the Ferragina and Manzini implementation (FMI) and Sad-CSA, on files of different types and sizes from the Pizza & Chili corpus. The PEF-CSA performs better on the existing data in terms of compression ratio and count and locate times, except for evenly distributed data such as the proteins data. The observations from the experiments are that the distribution of φ matters more for the compression ratio than the alphabet size: unevenly distributed φ gives a better compression effect, and the larger the number of hits, the longer the count and locate times.
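To give a feel for the underlying representation, the sketch below encodes and accesses a monotone integer sequence with plain (non-partitioned) Elias-Fano; it is a didactic toy, not the PEF-CSA implementation, and it stores the high-part bitvector one byte per bit for clarity rather than compactness.

```python
import math

def elias_fano_encode(values, universe):
    """Encode a sorted list of non-negative integers < universe."""
    n = len(values)
    l = max(0, int(math.floor(math.log2(universe / n)))) if n else 0
    low_mask = (1 << l) - 1
    lows = [v & low_mask for v in values]
    # high parts stored as a unary-coded bitvector: bit (v >> l) + i is set
    highs = bytearray(n + (universe >> l) + 1)
    for i, v in enumerate(values):
        highs[(v >> l) + i] = 1
    return l, lows, highs

def elias_fano_access(i, l, lows, highs):
    """Return the i-th encoded value (0-based) by scanning for the i-th set bit."""
    seen = -1
    for pos, bit in enumerate(highs):
        if bit:
            seen += 1
            if seen == i:
                return ((pos - i) << l) | lows[i]
    raise IndexError(i)

vals = [2, 3, 5, 7, 11, 13, 24]
l, lows, highs = elias_fano_encode(vals, universe=32)
print([elias_fano_access(i, l, lows, highs) for i in range(len(vals))])  # recovers vals
```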

Keywords: compressed suffix array, self-indexing, partitioned Elias-Fano, PEF-CSA

Procedia PDF Downloads 232
1556 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator

Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty

Abstract:

Verification and validation of simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on verification and validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady-state and transient conditions. The process of verification and validation helps in qualifying the process simulator for the intended purpose, whether it is providing comprehensive training or design verification. In general, model verification is carried out by comparing the simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A full-scope replica operator training simulator named KALBR-SIM (Kalpakkam Breeder Reactor Simulator) has been developed at IGCAR, Kalpakkam, India, for the Prototype Fast Breeder Reactor (PFBR), with the main participants being engineers/experts from the modeling team and the process design and instrumentation and control design teams. This paper discusses the verification and validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on experts' comments, final qualification of the simulator for the intended purpose, and the difficulties faced while coordinating the various activities.

Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), steady state, transient state

Procedia PDF Downloads 235
1555 A Proposal for Systematic Mapping Study of Software Security Testing, Verification and Validation

Authors: Adriano Bessa Albuquerque, Francisco Jose Barreto Nunes

Abstract:

Software vulnerabilities are increasing and not only impact the availability of services and processes, as well as the confidentiality, integrity, and privacy of information, but also cause changes that interfere with the development process. Security testing could be a solution to reduce vulnerabilities; however, the variety of test techniques, together with the lack of real case studies applying tests across the software development life cycle, compromises its effective use. This paper offers an overview of how a systematic mapping study (MS) of security verification, validation, and testing (VVT) was performed and presents the general results of this study.

Keywords: software test, software security verification validation and test, security test institutionalization, systematic mapping study

Procedia PDF Downloads 378
1554 Modeling Sediment Yield Using the SWAT Model: A Case Study of Upper Ankara River Basin, Turkey

Authors: Umit Duru

Abstract:

The Soil and Water Assessment Tool (SWAT) was tested for prediction of water balance and sediment yield in the gauged Ankara basin, Turkey. The overall objective of this study was to evaluate the performance and applicability of SWAT in this region of Turkey. Thirteen years of monthly streamflow and suspended sediment data were used for calibration and validation. This research assessed model performance based on differences between observed and predicted suspended sediment yields during the calibration (1987-1996) and validation (1982-1984) periods. Statistical comparisons of suspended sediment produced values for NSE (Nash-Sutcliffe efficiency), RE (relative error), and R² (coefficient of determination) of 0.81, -1.55, and 0.93, respectively, during the calibration period, and NSE, RE (%), and R² of 0.77, -2.61, and 0.87, respectively, during the validation period. Based on the analyses, SWAT satisfactorily simulated the observed hydrology and sediment yields and can be used as a tool for decision-making in water resources planning and management in the basin.

Keywords: calibration, GIS, sediment yield, SWAT, validation

Procedia PDF Downloads 256
1553 Semi-Automatic Method to Assist Expert for Association Rules Validation

Authors: Amdouni Hamida, Gammoudi Mohamed Mohsen

Abstract:

In order to help the expert validate association rules extracted from data, several quality measures have been proposed in the literature. We distinguish two categories: objective and subjective measures. The first depends on a fixed threshold and on the quality of the data from which the rules are extracted. The second consists of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high, so examining the rules manually is very hard. To solve this problem, we propose, in this paper, a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers); (ii) we use the generated classifiers for data classification; (iii) we visualize the association rules together with their classification quality to inform the expert and assist during the validation process.
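A minimal sketch of steps (i) and (ii), assuming rules are given as (antecedent item set, predicted class, confidence) triples; ordering by confidence is one simple matching strategy, not necessarily the authors' choice.

```python
from typing import FrozenSet, List, Optional, Tuple

# a classification rule derived from an association rule:
# antecedent items -> class label, with the rule's confidence
Rule = Tuple[FrozenSet[str], str, float]

def classify(transaction: FrozenSet[str], rules: List[Rule],
             default: str = "unknown") -> Tuple[str, Optional[Rule]]:
    """Apply the highest-confidence rule whose antecedent is contained in
    the transaction; fall back to a default class if none matches."""
    for rule in sorted(rules, key=lambda r: r[2], reverse=True):
        antecedent, label, _ = rule
        if antecedent <= transaction:
            return label, rule
    return default, None

rules: List[Rule] = [
    (frozenset({"bread", "butter"}), "breakfast", 0.92),
    (frozenset({"beer"}), "evening", 0.71),
]
print(classify(frozenset({"bread", "butter", "jam"}), rules))
print(classify(frozenset({"milk"}), rules))
```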

Keywords: association rules, rule-based classification, classification quality, validation

Procedia PDF Downloads 415
1552 In situ Real-Time Multivariate Analysis of Methanolysis Monitoring of Sunflower Oil Using FTIR

Authors: Pascal Mwenge, Tumisang Seodigeng

Abstract:

The combination of world population growth and the third industrial revolution has led to a high demand for fuels. On the other hand, the decrease of global fossil fuel deposits and the air pollution caused by these fuels have compounded the challenges the world faces due to its need for energy. Therefore, new forms of environmentally friendly and renewable fuels such as biodiesel are needed. The primary analytical techniques for monitoring methanolysis yield have been chromatography and spectroscopy; these methods have proven reliable but are demanding and costly and do not provide real-time monitoring. In this work, the in situ monitoring of biodiesel production from sunflower oil using Fourier transform infrared (FTIR) spectroscopy has been studied; the study was performed using an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. The quantitative monitoring of methanolysis was performed by building a quantitative model with multivariate calibration using the iC Quant module from the iC IR 7.0 software. Fifteen samples of known concentration, taken in duplicate, were used for model calibration and cross-validation; the data were pre-processed using mean centering, variance scaling, a square-root spectrum transform, and solvent subtraction. These pre-processing methods improved the performance indexes from 7.98 to 0.0096, 11.2 to 3.41, 6.32 to 2.72, and 0.9416 to 0.9999 for RMSEC, RMSECV, RMSEP, and R2Cum, respectively. The R2 values of 1 (training), 0.9918 (test), and 0.9946 (cross-validation) indicated the fitness of the model. The model was tested against a univariate model; small discrepancies were observed at low concentrations due to unmodelled intermediates, but the two were quite close at concentrations above 18%. The software eliminated the complexity of the partial least squares (PLS) chemometrics. It was concluded that the model obtained could be used to monitor the methanolysis of sunflower oil at industrial and lab scales.
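A hedged sketch of the multivariate calibration step using scikit-learn's PLS regression with cross-validated error (an RMSECV analogue); the stand-in spectra, number of latent variables, and pre-processing are assumptions replacing the iC Quant workflow described above.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# X: pre-processed FTIR spectra (samples x wavenumbers), y: known biodiesel yield (%)
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 400))                 # stand-in spectra
y = rng.uniform(0, 100, size=30)               # stand-in concentrations

X = X - X.mean(axis=0)                         # mean centering, as in the text

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))

pls.fit(X, y)
rmsec = np.sqrt(np.mean((y - pls.predict(X).ravel()) ** 2))
print(f"RMSEC = {rmsec:.2f}, RMSECV = {rmsecv:.2f}")
```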

Keywords: biodiesel, calibration, chemometrics, methanolysis, multivariate analysis, transesterification, FTIR

Procedia PDF Downloads 129
1551 Validation of the Formal Model of Web Services Applications for Digital Reference Service of Library Information System

Authors: Zainab Magaji Musa, Nordin M. A. Rahman, Julaily Aida Jusoh

Abstract:

The web services applications for digital reference service (WSDRS) model of the library information system (LIS) is an informal model that claims to reduce the problems of digital reference services in libraries. It uses web services technology to provide an efficient way of satisfying users' needs in the reference section of libraries. The formal WSDRS model consists of the Z specifications of all the informal specifications of the model. This paper discusses the formal validation of the Z specifications of the WSDRS model. The authors formally verify and thus validate the properties of the model using the Z/EVES theorem prover.

Keywords: validation, verification, formal, theorem prover

Procedia PDF Downloads 488