Search results for: data mining technique
28442 Studies on Plasma Spray Deposited La2O3 - YSZ (Yttria-Stabilized Zirconia) Composite Thermal Barrier Coating
Authors: Prashant Sharma, Jyotsna Dutta Majumdar
Abstract:
The present study concerns the development of a composite thermal barrier coating consisting of a mixture of La2O3 and YSZ (with 8 wt.%, 32 wt.% and 50 wt.% La2O3) by the plasma spray deposition technique on a CoNiCrAlY-based bond coat deposited on an Inconel 718 substrate by the high velocity oxy-fuel (HVOF) deposition technique. The addition of La2O3 to YSZ causes the formation of a pyrochlore (La2Zr2O7) phase at the inter-splat boundaries along with the presence of a LaYO3 phase. The coefficient of thermal expansion is significantly reduced due to the evolution of different phases and structural defects in the sprayed coating. The activation energy for TGO growth under isothermal and cyclic oxidation was higher in the composite coating than in the YSZ coating.
Keywords: plasma spraying, oxidation resistance, thermal barrier coating, microstructure, X-ray method
Procedia PDF Downloads 353
28441 Extent of Derivative Usage, Firm Value and Risk: An Empirical Study on Pakistan Non-Financial Firms
Authors: Atia Alam
Abstract:
Growing liberalisation and intense market competition increase firms' risk exposure and induce corporations to use derivatives extensively as a risk management instrument, which reduces firm risk and increases firm value. The present study contributes to the existing literature by providing an in-depth analysis of the effect of the extent of derivative usage on firm risk and value, using panel data models and the seemingly unrelated regression technique. New evidence is added to the current literature by dividing the sample data based on firms' Exchange Rate (ER) and Interest Rate (IR) exposure. The analysis examines the effect of the extent of derivative usage on firm risk and value and how this effect varies with ER and IR exposure. The sample consists of 166 Pakistani firms listed on the Pakistan Stock Exchange for the period 2004-2010. Results show that extensive usage of derivative instruments significantly increases firm value and reduces firm risk. Furthermore, the analysis shows that Pakistani corporations with higher exchange rate exposure (with respect to foreign sales) and higher interest rate exposure (on the basis of industry-adjusted leverage) have higher firm value and lower risk. Findings from seemingly unrelated regression also confirm the robustness of the results obtained through panel data analysis. The study further highlights the role of derivative usage as a risk management instrument under high and low ER and IR risk and helps practitioners understand how the value-increasing effect of derivative usage varies with the intensity of a firm's risk exposure.
Keywords: extent of derivative usage, firm value, risk, Pakistan, non-financial firms
Procedia PDF Downloads 357
28440 The Application of Lesson Study Model in Writing Review Text in Junior High School
Authors: Sulastriningsih Djumingin
Abstract:
This study has three objectives. First, it describes the ability of second-grade students to write review text without applying the Lesson Study model at SMPN 18 Makassar. Second, it describes the ability of second-grade students to write review text by applying the Lesson Study model at SMPN 18 Makassar. Third, it tests the effectiveness of the Lesson Study model in writing review text at SMPN 18 Makassar. This research used a true experimental design with a posttest-only group design involving two groups: one control class and one experimental class. The research population was all second-grade students at SMPN 18 Makassar, amounting to 250 students in 8 classes. The sampling technique was purposive sampling. The control class was VIII2, consisting of 30 students, while the experimental class was VIII8, consisting of 30 students. The research instruments were observation and tests. The collected data were analyzed using descriptive and inferential statistical techniques with a t-test, processed using SPSS 21 for Windows. The results show that: (1) of the 30 students in the control class, only 14 (47%) obtained a score higher than 7.5, categorized as inadequate; (2) in the experimental class, 26 (87%) students obtained a score of 7.5 or higher, categorized as adequate; (3) the Lesson Study model is effective when applied to writing review text. Comparing the control and experimental classes, the value of t-count is greater than the value of t-table (2.411 > 1.667), which means the alternative hypothesis (H1) proposed by the researcher is accepted.
Keywords: application, lesson study, review text, writing
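The group comparison above rests on a pooled-variance independent-samples t-test (t-count compared against a t-table value). A minimal sketch of that computation follows; the sample scores are invented for illustration, not the study's data.

```python
import math

def independent_t_test(sample_a, sample_b):
    """Pooled-variance two-sample t statistic and degrees of freedom,
    as used to compare posttest scores of two classes."""
    n1, n2 = len(sample_a), len(sample_b)
    m1, m2 = sum(sample_a) / n1, sum(sample_b) / n2
    v1 = sum((x - m1) ** 2 for x in sample_a) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in sample_b) / (n2 - 1)
    # pooled variance across both groups
    sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2
```

The computed t is then compared against the critical t-table value at the chosen significance level, exactly as in the abstract's 2.411 vs. 1.667 comparison.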
Procedia PDF Downloads 203
28439 Awareness, Use and Searching Behavior of 'Virtua' Online Public Access Catalog Users
Authors: Saira Soroya, Khalid Mahmood
Abstract:
Library catalogs open the door to the library collection. The OPAC (Online Public Access Catalog) is one of the services offered by automated libraries. The present study explores users' awareness, level of use and searching behavior of the OPAC, with the purpose of suggesting ways to improve its user-friendly features. The population consisted of OPAC users at Lahore University of Management Sciences (LUMS). A convenience sampling technique was used, with a total sample size of 100 OPAC users. A quantitative research design based on the survey method was used to carry out the study. The data collection instrument was adopted. Data were analyzed using SPSS. Results revealed that a considerable number of users (30%) were not aware of the OPAC; those who were aware were using only its basic features. Lack of knowledge was found to be the most frequent reason for not using all features of the OPAC. In this regard, it is strongly recommended that a compulsory information literacy programme be established.
Keywords: catalog, OPAC, library automation, usability study, university library
Procedia PDF Downloads 338
28438 Graphitic Carbon Nitride-CeO₂ Nanocomposite for Photocatalytic Degradation of Methyl Red
Authors: Khansaa Al-Essa
Abstract:
Nanosized ceria (CeO₂) and a graphitic carbon nitride-loaded ceria (CeO₂/GCN) nanocomposite were synthesized by the coprecipitation method, and their photocatalytic activity for methyl red degradation under visible light irradiation was studied. A phase formation study was carried out using the X-ray diffraction technique, which revealed that ceria (CeO₂) is properly supported on the surface of GCN. The ceria nanoparticles and the CeO₂/GCN nanocomposite were confirmed by transmission electron microscopy. The particle size of the CeO₂ and CeO₂/GCN nanocomposite is in the range of 10-15 nm. The photocatalytic activity of the CeO₂/GCN composite was improved compared to CeO₂ alone. The enhanced photocatalytic activity is attributed to increased visible light absorption and improved adsorption of the dye on the surface of the composite catalyst.
Keywords: photodegradation, dye, nanocomposite, graphitic carbon nitride-CeO₂
Procedia PDF Downloads 21
28437 Aeromagnetic Data Interpretation and Source Body Evaluation Using Standard Euler Deconvolution Technique in Obudu Area, Southeastern Nigeria
Authors: Chidiebere C. Agoha, Chukwuebuka N. Onwubuariri, Collins U. Amasike, Tochukwu I. Mgbeojedo, Joy O. Njoku, Lawson J. Osaki, Ifeyinwa J. Ofoh, Francis B. Akiang, Dominic N. Anuforo
Abstract:
In order to interpret the airborne magnetic data and evaluate the approximate location, depth, and geometry of the magnetic sources within the Obudu area using the standard Euler deconvolution method, very high-resolution aeromagnetic data over the area were acquired, processed digitally and analyzed using Oasis Montaj 8.5 software. Data analysis and enhancement techniques, including reduction to the equator, horizontal derivative, first and second vertical derivatives, upward continuation and regional-residual separation, were carried out for the purpose of detailed data interpretation. Standard Euler deconvolution for structural indices of 0, 1, 2, and 3 was also carried out, and the respective maps were obtained using the Euler deconvolution algorithm. Results show that the total magnetic intensity ranges from -122.9 nT to 147.0 nT, the regional intensity varies between -106.9 nT and 137.0 nT, while the residual intensity ranges between -51.5 nT and 44.9 nT, clearly indicating the masking effect of deep-seated structures over surface and shallow subsurface magnetic materials. Results also indicate that the positive residual anomalies have an NE-SW orientation, which coincides with the trend of major geologic structures in the area. Euler deconvolution for all the considered structural indices yields depths to magnetic sources ranging from the surface to more than 2000 m. Interpretation of the various structural indices revealed the locations and depths of the source bodies and the existence of geologic models including sills, dykes, pipes, and spherical structures. The area is characterized by intrusive and very shallow basement materials and represents an excellent prospect for solid mineral exploration and development.
Keywords: Euler deconvolution, horizontal derivative, Obudu, structural indices
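The standard Euler deconvolution step described above amounts to solving the Euler homogeneity equation by least squares over the points of a data window. The sketch below is a minimal illustration of that solve, not the Oasis Montaj implementation; the function name and synthetic source in the usage are invented for illustration.

```python
import numpy as np

def euler_deconvolution(x, y, z, T, Tx, Ty, Tz, n):
    """Least-squares solution of the standard Euler homogeneity equation
        (x - x0)*Tx + (y - y0)*Ty + (z - z0)*Tz = n*(B - T)
    for the source position (x0, y0, z0) and regional background B, given
    field values T and gradients (Tx, Ty, Tz) at the window points and a
    structural index n."""
    # rearranged as: x0*Tx + y0*Ty + z0*Tz + n*B = x*Tx + y*Ty + z*Tz + n*T
    A = np.column_stack([Tx, Ty, Tz, np.full_like(T, float(n))])
    b = x * Tx + y * Ty + z * Tz + n * T
    (x0, y0, z0, B), *_ = np.linalg.lstsq(A, b, rcond=None)
    return x0, y0, z0, B
```

In practice the solve is repeated over sliding windows of the gridded anomaly, once per candidate structural index (0 to 3 above), and poorly conditioned solutions are discarded.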
Procedia PDF Downloads 82
28436 Concept of Transforaminal Lumbar Interbody Fusion Cage Insertion Device
Authors: Sangram A. Sathe, Neha A. Madgulkar, Shruti S. Raut, S. P. Wadkar
Abstract:
Transforaminal lumbar interbody fusion (TLIF) surgeries have nowadays become popular for the treatment of degenerative spinal disorders. An interbody fusion technique like TLIF maintains the load-bearing capacity of the spine and a suitable disc height. Currently, many techniques have been introduced to cure spondylolisthesis, and this surgery provides greater rehabilitation of degenerative spines. Existing methods for performing TLIF surgery use a guideway, which is a troublesome technique because two separate instruments are required to perform the surgery. This paper presents a concept which eliminates the use of the guideway. The concept also eliminates problems that occur, such as reverting of the cage, and gives high accuracy while performing surgery.
Keywords: TLIF, spondylolisthesis, spine, instruments
Procedia PDF Downloads 332
28435 Effect of Surfactant Concentration on Dissolution of Hydrodynamically Trapped Sparingly Soluble Oil Micro Droplets
Authors: Adil Mustafa, Ahmet Erten, Alper Kiraz, Melikhan Tanyeri
Abstract:
The work presented here is based on a novel experimental technique used to hydrodynamically trap oil microdroplets inside a microfluidic chip at the junction of microchannels known as the stagnation point. Hydrodynamic trapping has recently been used to trap and manipulate a number of particles, from microbeads to DNA and single cells. Benzyl benzoate (BB) is used as the droplet material. The microdroplets are trapped individually at the stagnation point and their dissolution is observed. Experiments are performed for two concentrations (10 mM or 10 µM) of the surfactant AOT (docusate sodium salt) and two flow rates for each case. Moreover, the experimental data are compared with the Zhang-Yang-Mao (ZYM) model, which studies the dissolution of liquid microdroplets in the presence of a host fluid experiencing extensional creeping flow. Industrial processes such as polymer blending systems, in which heat or mass transport occurs, experience extensional flow, and an insight into these phenomena is of significant importance to many industrial processes. The experimental technique exploited here gives an insight into the dissolution of liquid microdroplets under the extensional flow regime. The comparison of our experimental results with the ZYM model reveals that the dissolution of microdroplets at the lower surfactant concentration (10 µM) fits the ZYM model at the saturation concentration value reported in the literature (Cs = 15×10⁻³ kg/m³), while for the higher surfactant concentration (10 mM), which is above the critical micelle concentration (CMC) of the surfactant (5 mM), the data fit the ZYM model at Cs = 45×10⁻³ kg/m³, three times the value reported in the literature. The difference in the Cs value from the literature shows an enhancement in the dissolution rate of sparingly soluble BB microdroplets at surfactant concentrations higher than the CMC. Enhancement in the dissolution of sparingly soluble materials is of great importance in the pharmaceutical industry.
Enhancement in the dissolution of sparingly soluble drugs is a key research area for the drug design industry. The experimental method is also advantageous because it is robust and involves no mechanical contact with the droplets under study, which are freely suspended in the fluid, in contrast to existing methods used for testing the dissolution of drugs. The experiments also give an insight into CMC measurement for surfactants.
Keywords: extensional flow, hydrodynamic trapping, Zhang-Yang-Mao, CMC
Procedia PDF Downloads 346
28434 Implementation of Algorithm K-Means for Grouping District/City in Central Java Based on Macro Economic Indicators
Authors: Nur Aziza Luxfiati
Abstract:
Clustering is partitioning data sets into sub-sets (groups) such that elements within one group share a high level of similarity, while similarity between groups is low. The K-Means algorithm is one of the most widely used clustering algorithms in scientific and industrial applications because its basic idea is very simple. This research applies clustering with the K-Means algorithm as a method of examining national development imbalances between regions in Central Java Province based on macroeconomic indicators. The data sample is secondary data obtained from the Central Java Provincial Statistics Agency regarding macroeconomic indicators, part of the publication of the 2019 National Socio-Economic Survey (Susenas) data. Outliers were detected using the z-score, and the number of clusters (k) was determined using the elbow method. After the clustering process was carried out, validation was tested using the Between-Class Variation (BCV) and Within-Class Variation (WCV) methods. The results showed that outlier detection using z-score normalization found no outliers. In addition, the clustering test obtained a ratio value that was not high, namely 0.011%. There are two district/city clusters in Central Java Province which have economic similarities based on the variables used: the first cluster, with a high economic level, consists of 13 districts/cities, and the second cluster, with a low economic level, consists of 22 districts/cities. Within the second, low-economy cluster, the authors further grouped districts/cities based on similarities in macroeconomic indicators: 20 districts by Gross Regional Domestic Product, 19 districts by Poverty Depth Index, 5 districts by Human Development, and 10 districts by Open Unemployment Rate.
Keywords: clustering, K-Means algorithm, macroeconomic indicators, inequality, national development
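The pipeline described above (z-score normalization, K-Means, and a within-cluster sum of squares for the elbow method) can be sketched in a few lines. This is a minimal illustration with deterministic initialization, not the authors' implementation, and the data fed to it would be the indicator matrix rather than the toy points used here.

```python
import numpy as np

def zscore(X):
    # z-score normalization, applied before clustering and outlier checks
    return (X - X.mean(axis=0)) / X.std(axis=0)

def kmeans(X, k, iters=100):
    # farthest-point initialization keeps this sketch deterministic
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centers (empty-cluster handling omitted in this sketch)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    # within-cluster sum of squares: plot against k to locate the elbow
    wcss = float(((X - centers[labels]) ** 2).sum())
    return labels, centers, wcss
```

Running `kmeans` for a range of k values and plotting the returned WCSS gives the elbow curve used to choose k.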
Procedia PDF Downloads 159
28433 A Method for Compression of Short Unicode Strings
Authors: Masoud Abedi, Abbas Malekpour, Peter Luksch, Mohammad Reza Mojtabaei
Abstract:
The use of short texts in communication has been increasing greatly in recent years. Applying different languages in short texts has led to the compulsory use of Unicode strings. These strings need twice the space of common strings; hence, applying compression algorithms to accelerate transmission and reduce cost is worthwhile. Nevertheless, other compression methods such as gzip, bzip2 or PAQ are not appropriate due to their high overhead data size. The Huffman algorithm is one of the rare algorithms effective in reducing the size of short Unicode strings. In this paper, an algorithm is proposed for the compression of very short Unicode strings. At first, every new character to be sent to a destination is inserted in the proposed mapping table. At the beginning, every character is new. In case the character is repeated for the same destination, it is not considered as a new character. Next, the new characters together with the mapping values of repeated characters are arranged through a specific technique and specially formatted to be transmitted. The results obtained from an assessment made on a set of short Persian and Arabic strings indicate that this proposed algorithm outperforms the Huffman algorithm in size reduction.
Keywords: algorithms, data compression, decoding, encoding, Huffman codes, text communication
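The mapping-table idea described above can be sketched as a token-level codec: each destination keeps a persistent table, a character is sent raw on its first use and as a small table index afterwards. This is only an illustration of the scheme's logic, not the authors' bit-level transmission format.

```python
def encode(text, table):
    """Encode `text` for one destination. `table` maps characters already
    sent to that destination onto small integer indices and persists
    across messages."""
    out = []
    for ch in text:
        if ch in table:
            out.append(("ref", table[ch]))   # repeat: send index only
        else:
            table[ch] = len(table)           # new: register and send raw
            out.append(("raw", ch))
    return out

def decode(tokens, table):
    """Mirror of encode; `table` maps indices back to characters."""
    chars = []
    for kind, val in tokens:
        if kind == "raw":
            table[len(table)] = val
            chars.append(val)
        else:
            chars.append(table[val])
    return "".join(chars)
```

Because the table persists per destination, a second message reusing the same alphabet is carried almost entirely by small indices instead of full Unicode code points.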
Procedia PDF Downloads 349
28432 Comparison Between Tension Band Wiring Using K-Wires and Cannulated Screws in Transverse Patella Fracture Fixation
Authors: Daniel Francis, Mo Yassin
Abstract:
Transverse patella fractures are routinely fixed using tension band wiring (TBW) with Kirschner wires and a wire in a figure-of-8 configuration. The idea of the study was to compare the outcomes of the traditional technique against the more recently used cannulated screws with fiber tape in a figure-of-8 configuration. We performed a retrospective cohort study of all surgically fixed patella fractures from 2019 to 2022. The patients were divided into two groups: a TBW group and a cannulated screws group. The primary outcome measures were failure of fixation and the need for removal of metalwork. Twenty-six patellar fractures were studied. TBW was used in 14 (53.8%), and cannulated screws were used for fixation in 12 (46.2%). There was one incident of metalwork failure in the TBW group and one in the cannulated screws group. Five (35.7%) patients in the TBW group needed symptomatic metalwork removed, compared with one (8.3%) in the cannulated screws group. In both groups, the rate of fixation failure was low. Symptomatic implants, the most common complication observed, were more frequent in the TBW group in our practice. Despite the small numbers in both groups, the hope of this study is to shine a light on the use of cannulated screws for patella fractures, as this would reduce the need for a second operation, reduce the load on already stretched services, and improve the patient experience by not requiring further surgery. Although this is not a brand-new technique, it is not commonly used, as there have not yet been studies demonstrating the lower rates of second surgery needed.
Keywords: patella, tension band wiring, randomised, new technique
Procedia PDF Downloads 75
28431 Applying the Underwriting Technique to Analyze and Mitigate the Credit Risks in Construction Project Management
Authors: Hai Chien Pham, Thi Phuong Anh Vo, Chansik Park
Abstract:
Risk management in construction projects is important to ensure the positive feasibility of the projects. Financial risks are the most concerning, since construction projects always run on a credit basis. Credit risks, therefore, require unique and technical tools to be well managed. The underwriting technique in credit risk, in its most basic sense, refers to the process of evaluating the risks and the potential exposure to losses. Risk analysis and underwriting are routinely applied in banks and financial institutions, which support construction projects when required. Recently, construction organizations, especially contractors, have recognized a significant increase in credit risks, which cause negative impacts on project performance and the profit of construction firms. Despite the successful application of underwriting in banks and financial institutions for many years, few contractors apply this technique to analyze and mitigate the credit risks of their potential owners before signing contracts with them for delivering their services. Thus, contractors have taken on credit risks during project implementation, where payment might not materialize due to the bankruptcy and/or protracted default of their owners. With this regard, this study proposes a model using the underwriting technique for contractors to analyze and assess the credit risks of their owners before making final decisions on potential construction contracts. The contractor's underwriters analyze and evaluate subjects such as the owner, country, sector, payment terms, financial figures and related concerns of the credit limit requests in detail, based on reliable information sources, and then input them into the proposed model to obtain an Overall Assessment Score (OAS). The OAS serves as a benchmark for decision makers to grant the proper limits for the project.
The proposed underwriting model was validated on 30 subjects in the Asia Pacific region over 5 years: their OAS was computed and then compared with their actual performance in order to evaluate the potential of the underwriting model for analyzing and assessing credit risks. The results revealed that underwriting can be a powerful method to assist contractors in making precise decisions. The contribution of this research is to allow contractors to develop their own credit risk management model for proactively preventing the credit risks of construction projects and to continuously improve and enhance the performance of this function during project implementation.
Keywords: underwriting technique, credit risk, risk management, construction project
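A minimal sketch of the scoring step: each underwriting factor named above (owner, country, sector, payment terms, financial figures) receives a rating that is combined into an Overall Assessment Score. The weights, 0-100 scale, threshold and decision rule below are invented placeholders, not the authors' validated calibration.

```python
def overall_assessment_score(scores, weights):
    """Weighted-average OAS over the underwriting factors. `scores` maps
    each factor to a 0-100 rating; `weights` maps the same factors to
    weights that sum to 1. Names and weights are illustrative."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[f] * weights[f] for f in weights)

def grant_credit_limit(oas, requested, threshold=60.0):
    # illustrative decision rule: grant the full requested limit above
    # the threshold, scale it down proportionally below it
    return requested if oas >= threshold else requested * oas / 100.0
```

In the paper's workflow the OAS acts as the benchmark; a rule like `grant_credit_limit` stands in for whatever limit-setting policy the decision makers attach to it.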
Procedia PDF Downloads 209
28430 Parallel Fuzzy Rough Support Vector Machine for Data Classification in Cloud Environment
Authors: Arindam Chaudhuri
Abstract:
Classification of data has been actively used as one of the most effective and efficient means of conveying knowledge and information to users. The primary focus has always been on techniques for extracting useful knowledge from data such that returns are maximized. With the emergence of huge datasets, existing classification techniques often fail to produce desirable results. The challenge lies in analyzing and understanding the characteristics of massive data sets by retrieving useful geometric and statistical patterns. We propose a supervised parallel fuzzy rough support vector machine (PFRSVM) for data classification in a cloud environment. The classification is performed by PFRSVM using a hyperbolic tangent kernel. The fuzzy rough set model takes care of the sensitiveness of noisy samples and handles impreciseness in training samples, bringing robustness to the results. The membership function is a function of the center and radius of each class in feature space and is represented with a kernel; it plays an important role in sampling the decision surface. The success of PFRSVM is governed by choosing appropriate parameter values. The training samples are either linearly or nonlinearly separable, and the different input points make unique contributions to the decision surface. The algorithm is parallelized with a view to reducing training times. The system is built on a support vector machine library using the Hadoop implementation of MapReduce. The algorithm is tested on large data sets to check its feasibility and convergence, and the performance of the classifier is also assessed in terms of the number of support vectors. The challenges encountered in implementing big data classification in machine learning frameworks are also discussed. The experiments were done in the cloud environment available at the University of Technology and Management, India, and the results are illustrated for Gaussian RBF and Bayesian kernels.
The effect of variability in prediction and generalization of PFRSVM is examined with respect to values of the parameter C. The method effectively resolves outlier effects, imbalance and overlapping class problems, generalizes to unseen data and relaxes the dependency between features and labels. The average classification accuracy of PFRSVM is better than that of other classifiers for both Gaussian RBF and Bayesian kernels. The experimental results on both synthetic and real data sets clearly demonstrate the superiority of the proposed technique.
Keywords: FRSVM, Hadoop, MapReduce, PFRSVM
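One way to realize the kernel-based membership function described above, a function of each class's center and radius in feature space, is via kernel-space distances, using the identity ||φ(x) − φ(c)||² = K(x,x) − 2K(x,c) + K(c,c). The construction below is a common form assumed for illustration; the paper's exact membership definition may differ, and the input-space mean is used as an approximate class center.

```python
import numpy as np

def tanh_kernel(a, b, gamma=0.1, c0=0.0):
    # hyperbolic tangent kernel, as used by PFRSVM
    return np.tanh(gamma * np.dot(a, b) + c0)

def fuzzy_membership(x, class_points, kernel=tanh_kernel, delta=1e-6):
    """Membership of x in a class from its kernel-space distance to the
    class center, scaled by the class radius: near-center points get
    membership close to 1, outliers get lower membership."""
    c = class_points.mean(axis=0)  # input-space center as an approximation
    d2 = kernel(x, x) - 2 * kernel(x, c) + kernel(c, c)
    r2 = max(kernel(p, p) - 2 * kernel(p, c) + kernel(c, c)
             for p in class_points)  # squared class radius
    m = 1.0 - np.sqrt(max(d2, 0.0)) / (np.sqrt(max(r2, 0.0)) + delta)
    return float(min(1.0, max(0.0, m)))
```

Memberships computed this way down-weight noisy or outlying training samples in the SVM objective, which is the robustness mechanism the abstract attributes to the fuzzy rough model.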
Procedia PDF Downloads 491
28429 Clinical Feature Analysis and Prediction on Recurrence in Cervical Cancer
Authors: Ravinder Bahl, Jamini Sharma
Abstract:
The paper demonstrates an analysis of cervical cancer based on a probabilistic model. It involves a technique for classification and prediction that recognizes the typical and diagnostically most important test features relating to cervical cancer. The main contribution of the research is predicting the probability of recurrence in no-recurrence (first-time detection) cases. A combination of conventional statistical and machine learning tools is applied for the analysis. An experimental study with real data demonstrates the feasibility and potential of the proposed approach for the said cause.
Keywords: cervical cancer, recurrence, no recurrence, probabilistic, classification, prediction, machine learning
Procedia PDF Downloads 360
28428 The Design of Intelligent Classroom Management System with Raspberry PI
Authors: Sathapath Kilaso
Abstract:
Classroom attendance checking records students' attendance in order to support learning activities in the classroom. Although the teaching trend in the 21st century is student-centered learning, with the lecturer's duty being to mentor and give advice, classroom learning is still important for letting students interact with their classmates and the lecturer, and for specific subjects where in-class learning is needed. The development of the system prototype applies microcontroller technology and embedded systems, following the Internet of Things trend, and the WebSocket technique allows the lecturer to be alerted immediately whenever the data is updated.
Keywords: arduino, embedded system, classroom, raspberry PI
Procedia PDF Downloads 376
28427 Layout Optimization of a Start-up COVID-19 Testing Kit Manufacturing Facility
Authors: Poojan Vora, Hardik Pancholi, Sanket Tajane, Harsh Shah, Elias Keedy
Abstract:
The global COVID-19 pandemic has affected industry drastically in many ways. Even though the vaccine is being distributed quickly and despite the decreasing number of positive cases, testing is projected to remain a key aspect of the 'new normal'. Improving an existing plant layout and improving safety within the facility are of great importance in today's industries because of the need to ensure productivity optimization and reduce safety risks. In practice, it is essential for any manufacturing plant to reduce non-value-adding steps such as the movement of materials and to rearrange similar processes. In the current pandemic situation, optimized layouts will not only increase safety measures but also decrease the fixed cost per unit manufactured. In our case study, we carefully studied the existing layout and the manufacturing steps of a new Texas start-up company that manufactures COVID-19 testing kits. The effects of production rate are incorporated with the computerized relative allocation of facilities technique (CRAFT) algorithm to improve the plant layout and estimate the optimization parameters. Our work reduces the company's material handling time and increases its daily production. Real data from the company are used in the case study to highlight the importance of colleges in fostering small business needs and improving the collaboration between college researchers and industry by using existing models to advance best practices.
Keywords: computerized relative allocation of facilities technique, facilities planning, optimization, start-up business
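The CRAFT-style improvement step can be sketched as a pairwise-exchange search over department-to-location assignments that lowers material-handling cost (flow × distance). The matrices in the usage below are illustrative toys; the actual study would use measured flows, distances and production-rate effects, and full CRAFT also handles unequal department areas, which this sketch omits.

```python
import numpy as np

def layout_cost(assign, flow, dist):
    """Material-handling cost: for every department pair (i, j),
    flow[i, j] times the distance between their assigned locations."""
    idx = np.asarray(assign)
    return float((flow * dist[np.ix_(idx, idx)]).sum())

def craft_improve(assign, flow, dist):
    """Pairwise-exchange improvement in the spirit of CRAFT: keep any
    department swap that lowers handling cost, until no swap helps."""
    assign = list(assign)
    best = layout_cost(assign, flow, dist)
    improved = True
    while improved:
        improved = False
        for i in range(len(assign)):
            for j in range(i + 1, len(assign)):
                assign[i], assign[j] = assign[j], assign[i]
                c = layout_cost(assign, flow, dist)
                if c < best - 1e-12:
                    best, improved = c, True
                else:
                    # swap did not help; revert it
                    assign[i], assign[j] = assign[j], assign[i]
    return assign, best
```

Starting from the current layout, the loop converges to a locally optimal assignment; rerunning from several starting layouts guards against poor local optima.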
Procedia PDF Downloads 139
28426 Defect Classification of Hydrogen Fuel Pressure Vessels using Deep Learning
Authors: Dongju Kim, Youngjoo Suh, Hyojin Kim, Gyeongyeong Kim
Abstract:
Acoustic emission testing (AET) is widely used to test the structural integrity of operational hydrogen storage containers, and clustering algorithms are frequently used in pattern recognition methods to interpret AET results. However, the interpretation of AET results can vary from user to user, as the tuning of the relevant parameters relies on the user's experience and knowledge of AET. Therefore, it is desirable to use a deep learning model to identify patterns in acoustic emission (AE) signal data that can be used to classify defects instead. In this paper, a deep learning-based model for classifying the types of defects in hydrogen storage tanks, using AE sensor waveforms, is proposed. As hydrogen storage tanks are commonly constructed using carbon fiber reinforced polymer (CFRP) composite, a defect classification dataset was collected through a tensile test on a CFRP specimen with an AE sensor attached. The classification model, using a one-dimensional convolutional neural network (1-D CNN) and synthetic minority oversampling technique (SMOTE) data augmentation, achieved 91.09% accuracy for each defect. It is expected that the deep learning classification model in this paper, used with AET, will help in evaluating the operational safety of hydrogen storage containers.
Keywords: acoustic emission testing, carbon fiber reinforced polymer composite, one-dimensional convolutional neural network, SMOTE data augmentation
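The SMOTE augmentation step applied before training the 1-D CNN can be sketched as follows: basic SMOTE interpolates each minority-class sample toward one of its k nearest minority-class neighbours. This is a minimal illustration; the feature vectors in the usage are placeholders, not AE waveforms.

```python
import numpy as np

def smote(minority, n_synthetic, k=3, seed=0):
    """Basic SMOTE: generate synthetic minority samples by random
    interpolation between a minority sample and one of its k nearest
    minority-class neighbours."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_synthetic):
        i = rng.integers(len(minority))
        d = np.linalg.norm(minority - minority[i], axis=1)
        nn = np.argsort(d)[1:k + 1]      # k nearest neighbours, excluding self
        j = rng.choice(nn)
        gap = rng.random()               # interpolation fraction in [0, 1)
        out.append(minority[i] + gap * (minority[j] - minority[i]))
    return np.array(out)
```

Because every synthetic point lies on a segment between two real minority samples, the augmented set stays inside the minority class's region of feature space while rebalancing the classes.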
Procedia PDF Downloads 95
28425 Assessment of Forest Above Ground Biomass Through Linear Modeling Technique Using SAR Data
Authors: Arjun G. Koppad
Abstract:
The study was conducted in Joida taluk of Uttara Kannada district, Karnataka, India, to assess land use land cover (LULC) and forest aboveground biomass (AGB) using L-band SAR data. The study area contains dense, moderately dense, and sparse forests. The sampled area was 0.01 percent of the forest area, with 30 randomly selected sampling plots. The point center quadrate (PCQ) method was used to select trees, and tree growth parameters were collected, viz., tree height, diameter at breast height (DBH), and diameter at the tree base. Tree crown density was measured with a densitometer. The biomass of each sample plot was estimated using the standard formula. In this study, LULC classification was done using the Freeman-Durden, Yamaguchi and Pauli polarimetric decompositions. The Freeman-Durden decomposition showed the best LULC classification, with an accuracy of 88 percent. An attempt was made to estimate aboveground biomass using SAR backscatter, with fully polarimetric quad-pol ALOS-2 PALSAR-2 L-band data (HH, HV, VV and VH). A SAR backscatter-based regression model was implemented to retrieve the forest aboveground biomass of the study area. Cross-polarization (HV) showed a good correlation with forest aboveground biomass. Multiple linear regression analysis was done to estimate the aboveground biomass of the natural forest areas of Joida taluk. Among the polarization combinations (HH & HV, VV & HH, HV & VH, VV & VH), the combinations including HH and HV polarization show a good correlation between field and predicted biomass. The RMSE and R² values for HH & HV and HH & VV were 78 t/ha and 0.861, and 81 t/ha and 0.853, respectively. Hence the model can be recommended for estimating AGB for dense, moderately dense, and sparse forests.
Keywords: forest, biomass, LULC, backscatter, SAR, regression
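The multiple-linear-regression step relating polarimetric backscatter to plot biomass can be sketched with ordinary least squares, reporting the same R² and RMSE metrics the abstract cites. The function name and the synthetic data in the usage are illustrative, not the ALOS-2 values.

```python
import numpy as np

def fit_biomass_model(backscatter, agb):
    """Fit AGB = b0 + b1*P1 + b2*P2 + ... over polarization columns
    (e.g. HH and HV) by ordinary least squares; return the coefficients
    together with R^2 and RMSE of the fit."""
    X = np.column_stack([np.ones(len(agb)), backscatter])
    beta, *_ = np.linalg.lstsq(X, agb, rcond=None)
    pred = X @ beta
    ss_res = float(((agb - pred) ** 2).sum())
    ss_tot = float(((agb - agb.mean()) ** 2).sum())
    r2 = 1.0 - ss_res / ss_tot
    rmse = float(np.sqrt(ss_res / len(agb)))
    return beta, r2, rmse
```

Fitting the model once per polarization pair (HH & HV, VV & HH, and so on) and comparing the resulting R²/RMSE values reproduces the comparison reported above.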
Procedia PDF Downloads 28
28424 The Impact of System and Data Quality on Organizational Success in the Kingdom of Bahrain
Authors: Amal M. Alrayes
Abstract:
Data and system quality play a central role in organizational success, and the quality of any existing information system has a major influence on the effectiveness of overall system performance. Given the importance of system and data quality to an organization, it is relevant to highlight their effect on organizational performance in the Kingdom of Bahrain. This research aims to discover whether system quality and data quality are related, and to study the impact of system and data quality on organizational success. A theoretical model based on previous research is used to show the relationship between data quality, system quality, and organizational impact. We hypothesize, first, that system quality is positively associated with organizational impact; secondly, that system quality is positively associated with data quality; and finally, that data quality is positively associated with organizational impact. A questionnaire was conducted among public and private organizations in the Kingdom of Bahrain. The results show that there is a strong association between data and system quality, which affects organizational success.
Keywords: data quality, performance, system quality, Kingdom of Bahrain
Procedia PDF Downloads 498
28423 Portfolio Management for Construction Company during Covid-19 Using AHP Technique
Authors: Sareh Rajabi, Salwa Bheiry
Abstract:
In general, Covid-19 caused extensive financial and non-financial damage to the economy and community. The level and severity of Covid-19 as a pandemic vary across regions and across different types of projects. The Covid-19 virus has recently emerged as one of the most important risk management factors worldwide. Therefore, as part of portfolio management assessment, it is essential to evaluate the severity of such a risk at the project and program level within portfolio management in order to avoid a risky portfolio. Covid-19 struck particularly hard in South America, parts of Europe, and the Middle East. The pandemic affected the entire world through lockdowns, interruptions in supply chains, health and safety requirements, transportation restrictions, and commercial impacts. This research therefore proposes the Analytical Hierarchy Process (AHP) to analyze and assess a pandemic event such as Covid-19 and its impacts on construction projects. The AHP technique uses four sub-criteria — health and safety risk, commercial risk, completion risk, and contractual risk — to evaluate each project and program. The result provides decision makers with information on which projects carry higher or lower risk under a Covid-19 or similar pandemic scenario. Decision makers can then select the most feasible solution, based on effectively weighted criteria, for project selection within their portfolio to match the organization's strategies.
Keywords: portfolio management, risk management, COVID-19, analytical hierarchy process technique
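The AHP weighting over the four sub-criteria can be sketched as below. The pairwise comparison judgments are hypothetical placeholders, not the study's elicited values; the mechanics (principal-eigenvector weights and the consistency check) are the standard AHP steps:

```python
import numpy as np

# Illustrative pairwise comparison matrix over the four sub-criteria
# (health & safety, commercial, completion, contractual risk).
# A[i, j] = how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])

# Priority weights: principal eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights = weights / weights.sum()

# Consistency ratio (CR): judgments are usually accepted if CR < 0.1.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
ri = 0.90                              # Saaty's random index for n = 4
cr = ci / ri
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```

Each candidate project then receives a composite risk score as the weight-sum of its ratings on the four sub-criteria, which is the basis for the higher/lower-risk ranking mentioned above.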
Procedia PDF Downloads 110
28422 Power Iteration Clustering Based on Deflation Technique on Large Scale Graphs
Authors: Taysir Soliman
Abstract:
Spectral Clustering (SC) is currently one of the most popular clustering techniques because of its advantages over conventional approaches such as hierarchical clustering and k-means. However, one disadvantage of SC is that it is time-consuming, since it requires computing eigenvectors. A number of attempts have been made to overcome this disadvantage, such as the Power Iteration Clustering (PIC) technique, a variant of SC. Among the advantages of PIC are: 1) scalability and efficiency, 2) finding a single pseudo-eigenvector instead of computing the full set of eigenvectors, and 3) obtaining a linear combination of the eigenvectors in linear time. Its main disadvantage, however, is an inter-class collision problem, because a single pseudo-eigenvector is not always enough to separate the classes. Previous researchers developed Deflation-based Power Iteration Clustering (DPIC) to overcome PIC's inter-class collision problem while preserving its efficiency. In this paper, we develop Parallel DPIC (PDPIC) to improve time and memory complexity; it runs on the Apache Spark framework using sparse matrices. To test the performance of PDPIC, we compared it to the SC, ESCG, and ESCALG algorithms on four small and nine large graph benchmark datasets, where PDPIC achieved higher accuracy and lower running time than the compared algorithms.
Keywords: spectral clustering, power iteration clustering, deflation-based power iteration clustering, Apache Spark, large graph
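The basic PIC idea — a truncated power iteration on the row-normalized affinity matrix, followed by k-means on the resulting one-dimensional pseudo-eigenvector — can be sketched on a toy graph as follows. This is a single-machine illustration of plain PIC, not the paper's deflated or Spark-parallel variant:

```python
import numpy as np
from sklearn.cluster import KMeans

def pic(affinity, k, iters=50, seed=0):
    """Power Iteration Clustering sketch: truncated power iteration
    on the row-normalized affinity matrix, then k-means on the
    one-dimensional pseudo-eigenvector."""
    W = affinity / affinity.sum(axis=1, keepdims=True)  # row-normalize
    rng = np.random.default_rng(seed)
    v = rng.random(W.shape[0])
    v /= np.abs(v).sum()
    for _ in range(iters):        # truncated power iteration
        v = W @ v
        v /= np.abs(v).sum()      # keep the vector at unit L1 norm
    km = KMeans(n_clusters=k, n_init=10, random_state=seed)
    return km.fit_predict(v.reshape(-1, 1))

# Toy graph: two dense blocks of three nodes, weakly inter-connected.
A = np.full((6, 6), 0.01)
A[:3, :3] = 1.0
A[3:, 3:] = 1.0
labels = pic(A, 2)
```

The inter-class collision problem the abstract mentions arises when distinct clusters land on (nearly) the same value in this single vector; DPIC addresses it by deflating the matrix and extracting further pseudo-eigenvectors.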
Procedia PDF Downloads 191
28421 Cross-border Data Transfers to and from South Africa
Authors: Amy Gooden, Meshandren Naidoo
Abstract:
Genetic research and transfers of big data are not confined to a particular jurisdiction, but there is a lack of clarity regarding the legal requirements for importing and exporting such data. Using direct-to-consumer genetic testing (DTC-GT) as an example, this research assesses the status of data sharing into and out of South Africa (SA). While SA law covers the sending of genetic data out of SA, prohibiting such transfer unless a legal ground exists, the position where genetic data comes into the country depends on the laws of the country from which it is sent, making the legal position less clear.
Keywords: cross-border, data, genetic testing, law, regulation, research, sharing, South Africa
Procedia PDF Downloads 127
28420 An Ensemble System of Classifiers for Computer-Aided Volcano Monitoring
Authors: Flavio Cannavo
Abstract:
Continuous evaluation of the status of potentially hazardous volcanoes plays a key role in civil protection. Monitoring volcanic activity, especially energetic paroxysms that usually come with tephra emissions, is crucial not only for the exposure of the local population but also for airline traffic. At present, real-time surveillance of most volcanoes worldwide is essentially delegated to one or more human experts in volcanology, who interpret data coming from different kinds of monitoring networks. Unfortunately, the high nonlinearity of the complex and coupled volcanic dynamics leads to a large variety of volcanic behaviors. Moreover, the continuously measured parameters (e.g., seismic, deformation, infrasonic, and geochemical signals) are often unable to fully explain the ongoing phenomenon, making rapid volcano state assessment a very puzzling task for the personnel on duty in the control rooms. With the aim of aiding the personnel on duty in volcano surveillance, we introduce here a system based on an ensemble of data-driven classifiers that automatically infers the ongoing volcano status from all the available kinds of measurements. The system consists of a heterogeneous set of independent classifiers, each built with its own data and algorithm. Each classifier outputs an assessment of the volcanic status. The ensemble technique weights the individual classifier outputs to combine all the classifications into a single status that maximizes performance. We tested the model on the Mt. Etna (Italy) case study, considering a long record of multivariate data from 2011 to 2015, and cross-validated it. Results indicate that the proposed model is effective and of great value for decision-making purposes.
Keywords: Bayesian networks, expert system, Mount Etna, volcano monitoring
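A minimal sketch of the weighted-ensemble idea follows, using synthetic stand-in data and generic scikit-learn classifiers; the actual system's members, weights, and monitoring signals are not reproduced here. Each independently trained member votes with its class probabilities, weighted by its own measured accuracy:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for multivariate monitoring data; the class
# label plays the role of a discrete volcano status (3 levels).
X, y = make_classification(n_samples=600, n_features=8, n_informative=5,
                           n_classes=3, n_clusters_per_class=1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)

# Heterogeneous, independently trained classifiers.
members = [LogisticRegression(max_iter=1000),
           DecisionTreeClassifier(random_state=0),
           GaussianNB()]
for m in members:
    m.fit(X_tr, y_tr)

# Weight each member by its own training accuracy, then combine the
# class probabilities into one status (weighted soft voting).
weights = np.array([accuracy_score(y_tr, m.predict(X_tr))
                    for m in members])
proba = sum(w * m.predict_proba(X_te) for w, m in zip(weights, members))
status = proba.argmax(axis=1)
```

In practice the weights would be tuned by cross-validation on held-out data rather than training accuracy, but the combination step is the same.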
Procedia PDF Downloads 247
28419 Colloidal Gas Aphron Generated by a Cationic Surfactant as an Alternative Technique to Recover Natural Colorants from Fermented Broth
Authors: V. C. Santos-Ebinuma, J. F. B. Pereira, M. F. S. Teixeira, A. Pessoa Jr., P. Jauregi
Abstract:
There is worldwide interest in process development for the production of colorants from natural sources. Microorganisms provide an alternative source of natural colorants, which can be produced by cultivation technology and extracted from fermented broth. The aim of the present work was to study the recovery of red colorants from the fermented broth of Penicillium purpurogenum DPUA 1275 using the technique of Colloidal Gas Aphrons (CGA); CGA are surfactant-stabilized microbubbles generated by intense stirring of a surfactant solution. Here, CGA were generated with the cationic surfactant hexadecyl trimethyl ammonium bromide (CTAB). Experiments were first carried out at different surfactant/fermented broth volumetric ratios (VCGA/VFB, VRATIO) between 3 and 18 at pH 6.9, and then at VRATIO of 6 and 12 at different pH values, namely 6.9, 8.0, 9.0, and 10.0. The first set of results showed that increasing VRATIO from 3 to 6 and 12 improved both the recovery and the partition coefficient, whereas a VRATIO of 18 gave the lowest partition coefficient. The best results were achieved at VRATIO of 6 and 12: recovery, Re, around 60%, with partition coefficients, K, of 2.5 and 3.0, respectively. The second set of experiments showed that pH 9.0 gave the best results at a VRATIO of 12: Re = 70%, K = 5.39, with protein and sugar selectivities of SePROT = 3.75 and SeSUGAR = 7.20, respectively. These results indicate that with CTAB the recovery is mainly driven by electrostatic interactions. In conclusion, CGA generated with a cationic surfactant is a promising technique that can be used as a first purification step to recover red colorants from fermented broth.
Keywords: liquid-liquid extraction, colloidal gas aphrons, recovery, natural colorants
Procedia PDF Downloads 354
28418 The Study of Security Techniques on Information System for Decision Making
Authors: Tejinder Singh
Abstract:
An information system (IS) carries the flow of data across levels and in different directions for decision making and data operations. Data can be compromised in various ways, such as manual or technical errors, data tampering, or loss of integrity. The security layer of an IS, such as a firewall, is affected by these kinds of violations. The flow of data among the various levels of an information system is handled by the networking system, where data travels in the form of packets or frames. To protect these packets from unauthorized access and virus attacks, and to maintain their integrity, network security is an important factor. Various security techniques are used to protect data from piracy. This paper presents these security techniques and describes different harmful attacks with the help of detailed data analysis. It should help organizations make their systems more secure and effective, and support future decision making.
Keywords: information systems, data integrity, TCP/IP network, vulnerability, decision, data
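One concrete integrity technique of the kind surveyed above is to authenticate each packet payload with a message authentication code, so tampering in transit is detectable on arrival. A minimal sketch, assuming a pre-shared key (the key, payload, and 32-byte SHA-256 tag layout are illustrative, not a specific protocol):

```python
import hmac
import hashlib

KEY = b"shared-secret-key"  # hypothetical pre-shared key

def seal(payload: bytes) -> bytes:
    """Append an HMAC-SHA256 tag (32 bytes) to the payload before sending."""
    tag = hmac.new(KEY, payload, hashlib.sha256).digest()
    return payload + tag

def verify(frame: bytes) -> bool:
    """Recompute the tag on arrival; any payload change makes it mismatch."""
    payload, tag = frame[:-32], frame[-32:]
    expected = hmac.new(KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

frame = seal(b"decision-data")
```

`hmac.compare_digest` is used instead of `==` to avoid timing side channels when comparing tags.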
Procedia PDF Downloads 308
28417 Data Integration with Geographic Information System Tools for Rural Environmental Monitoring
Authors: Tamas Jancso, Andrea Podor, Eva Nagyne Hajnal, Peter Udvardy, Gabor Nagy, Attila Varga, Meng Qingyan
Abstract:
The paper deals with the conditions and circumstances of integrating remotely sensed data for rural environmental monitoring. The main task is to make decisions during the integration process when the data sources differ in resolution, location, spectral channels, and dimension. For exact knowledge of the integration and data-fusion possibilities, it is necessary to know the properties (metadata) that characterize the data. The paper explains the joining of these data sources through their attribute data using a sample project. The resulting product will be used for rural environmental analysis.
Keywords: remote sensing, GIS, metadata, integration, environmental analysis
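The attribute-based joining step can be sketched in tabular form as below. The field names (`parcel_id`, `landcover`, `soil_ph`) and values are invented for illustration; in a GIS workflow the same attribute join would run on feature tables:

```python
import pandas as pd

# Two hypothetical monitoring data sources sharing a parcel identifier.
landcover = pd.DataFrame({
    "parcel_id": [101, 102, 103],
    "landcover": ["arable", "forest", "pasture"],
    "resolution_m": [10, 10, 10],
})
soil = pd.DataFrame({
    "parcel_id": [101, 103, 104],
    "soil_ph": [6.8, 5.9, 7.2],
})

# Attribute join: keep every land-cover record, and attach soil data
# wherever the parcel identifier matches (missing matches become NaN).
merged = landcover.merge(soil, on="parcel_id", how="left")
```

A left join is chosen so that gaps in one source (here, parcel 102 has no soil record) remain visible rather than silently dropping records.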
Procedia PDF Downloads 122
28416 Multi-Stage Classification for Lung Lesion Detection on CT Scan Images Applying Medical Image Processing Technique
Authors: Behnaz Sohani, Sahand Shahalinezhad, Amir Rahmani, Aliyu Aliyu
Abstract:
Medical imaging, and specifically medical image processing, has recently become one of the most dynamically developing areas of medical science, leading to new approaches to the prevention, diagnosis, and treatment of various diseases. In diagnosing lung cancer, medical professionals rely on computed tomography (CT) scans, in which failure to correctly identify masses can lead to misdiagnosis or unnecessary sampling of lung tissue. Identifying and demarcating masses when detecting cancer within lung tissue is a critical challenge in diagnosis. In this work, a segmentation system based on image processing techniques has been applied for detection purposes. In particular, a novel lung cancer detection algorithm is presented and validated through simulation, performed on CT images using multilevel thresholding. The proposed technique consists of segmentation, feature extraction, feature selection, and classification. In more detail, the features carrying useful information are selected after feature extraction. Eventually, the output image of lung cancer is obtained with 96.3% accuracy and 87.25%. The purpose of feature extraction in the proposed approach is to transform the raw data into a more usable form for subsequent statistical processing. Future steps will involve employing the current feature extraction method to achieve more accurate resulting images, with further details made available to machine vision systems to recognise objects in lung CT scan images.
Keywords: lung cancer detection, image segmentation, lung computed tomography (CT) images, medical image processing
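The multilevel-thresholding segmentation step can be sketched in pure NumPy as a two-threshold Otsu search: choose the pair (t1, t2) maximizing the between-class variance of the gray-level histogram, splitting the image into three regions (e.g., background, tissue, candidate mass). The synthetic "image" below stands in for real CT intensities:

```python
import numpy as np

# Synthetic gray levels: three well-separated intensity populations.
rng = np.random.default_rng(1)
img = np.concatenate([
    rng.normal(40, 5, 500),    # dark background
    rng.normal(120, 5, 500),   # lung tissue
    rng.normal(210, 5, 500),   # bright lesion candidate
]).clip(0, 255).astype(np.uint8)

hist = np.bincount(img, minlength=256).astype(float)
p = hist / hist.sum()
levels = np.arange(256, dtype=float)
cw = np.concatenate([[0.0], np.cumsum(p)])           # cumulative weight
cm = np.concatenate([[0.0], np.cumsum(levels * p)])  # cumulative moment

best, best_t = -1.0, (0, 0)
for t1 in range(1, 255):
    for t2 in range(t1 + 1, 256):
        var = 0.0
        for lo, hi in ((0, t1), (t1, t2), (t2, 256)):
            w = cw[hi] - cw[lo]
            if w > 0:
                mu = (cm[hi] - cm[lo]) / w
                var += w * mu * mu  # maximizing sum(w * mu^2) is
                                    # equivalent to between-class variance
        if var > best:
            best, best_t = var, (t1, t2)

labels = np.digitize(img, best_t)  # region index 0, 1, or 2 per pixel
```

The cumulative sums make each candidate threshold pair O(1) to evaluate, so the exhaustive search over all pairs stays cheap for 256 gray levels.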
Procedia PDF Downloads 101
28415 The Study of the Absorption and Translocation of Chromium by Lygeum spartum in the Mining Region of Djebel Hamimat and Soil-Plant Interaction
Authors: H. Khomri, A. Bentellis
Abstract:
For more than a century, extraction activities and dispersed mineral processing have spread toxic metals, contaminating areas far larger than those occupied by natural outcrops. New types of metalliferous habitats have thus appeared. One species, Lygeum spartum, attracted our attention because, apart from its valuable role against desertification, it is apparently able to exclude antimony and possibly other metals. This species, whose green leaf blades are provided as cattle feed, would be a good subject for the phytoremediation of mineral soils. The study of the absorption and translocation of chromium by Lygeum spartum in the mining region of Djebel Hamimat, and of the soil-plant interaction, revealed that the soils supporting this species in this region are alkaline, mostly calcareous, of fine to medium texture, and in a minority of cases saline. They have normal levels of organic matter and are moderately rich in nitrogen. Their total chromium content reaches a maximum of 66.80 mg kg⁻¹, with a total absence of soluble chromium. Analysis of variance of the differences between bare soils and soils carrying Lygeum spartum showed a significant difference only for silt and organic matter; for the other variables analyzed the difference was not significant. Thus, the plant's only effect on the soil is on its silt and organic matter levels. Multiple regression of the root chromium content on all the soil variables studied showed that, among the variables included in the model, only electrical conductivity and clay contribute to explaining the chromium content of the roots.
The chromium content of the aerial parts, analyzed by regression on all the studied soil variables, showed that only electrical conductivity and the chromium content of the root portion are involved in explaining the chromium content of the aerial parts.
Keywords: absorption, translocation, analysis of variance, chromium, Lygeum spartum, multiple regression, soil variables
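The bare-versus-vegetated comparison reported above is a one-way analysis of variance; a minimal sketch with invented silt percentages (the study's actual measurements are not reproduced here) is:

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical silt contents (%) for two soil groups: bare soils
# versus soils carrying Lygeum spartum. Values are invented.
rng = np.random.default_rng(7)
silt_bare = rng.normal(18.0, 3.0, 15)
silt_vegetated = rng.normal(24.0, 3.0, 15)

# One-way ANOVA: does mean silt content differ between the groups?
f_stat, p_value = f_oneway(silt_bare, silt_vegetated)
significant = p_value < 0.05
```

The same call extends to any measured soil variable; the abstract's finding corresponds to `significant` being True only for silt and organic matter.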
Procedia PDF Downloads 271
28414 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic
Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi
Abstract:
In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it is postulated that more than one billion bases will be produced per year in 2020. The growth rate of the produced data is much faster than Moore's law in computer technology. This makes it increasingly difficult to deal with genomics data: storing it, searching it, and finding the hidden information within it. An analysis platform for genomics big data is therefore required. Newly developed cloud computing enables us to deal with big data more efficiently. Hadoop is one such distributed computing framework and forms the core of Big Data as a Service (BDaaS). Although many services, e.g., Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data such as sequencing data. Our algorithm consists of two parts: first, BDaaS is applied to handle the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential for the computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We discuss our algorithm and its feasibility.
Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing
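The MapReduce half of the hybrid method can be sketched on a single machine as k-mer counting over sequencing reads: a map step emits per-read k-mer counts and a reduce step merges the partial tables. This illustrates only the MapReduce pattern (the fuzzy-logic part and Hadoop distribution are omitted); the reads below are invented:

```python
from collections import Counter
from functools import reduce

def map_kmers(read: str, k: int = 3) -> Counter:
    """Map: emit (k-mer, count) pairs for one read."""
    return Counter(read[i:i + k] for i in range(len(read) - k + 1))

def reduce_counts(a: Counter, b: Counter) -> Counter:
    """Reduce: merge two partial count tables."""
    a.update(b)
    return a

reads = ["ACGTAC", "GTACGT", "ACGT"]
partials = map(map_kmers, reads)  # map phase; parallelizable per read
kmer_counts = reduce(reduce_counts, partials, Counter())
```

On Hadoop the same two functions would run distributed, with the framework handling the shuffle between the map and reduce phases.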
Procedia PDF Downloads 300
28413 Forthcoming Big Data on Smart Buildings and Cities: An Experimental Study on Correlations among Urban Data
Authors: Yu-Mi Song, Sung-Ah Kim, Dongyoun Shin
Abstract:
Cities are complex systems of diverse and intertwined activities. These activities and their complex interrelationships create diverse urban phenomena, which in turn considerably influence the lives of citizens. This research aimed to develop a method to reveal the causes and effects among diverse urban elements, to enable a better understanding of urban activities and, from that, better urban planning strategies. Specifically, this study set out to solve a data-recommendation problem found on a Korean public data portal. First, a correlation analysis was conducted to find the correlations among randomly chosen urban datasets. Then, based on the results of that analysis, a weighted data network for each urban dataset was provided to users. It is expected that the data weights thereby obtained will provide insights into cities and show how diverse urban activities influence each other and induce feedback.
Keywords: big data, machine learning, ontology model, urban data model
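The two steps described above — a correlation analysis over urban datasets, then a weighted network used for recommendation — can be sketched as follows. The indicator names and the dependencies among them are invented for illustration, not the study's actual Korean public data:

```python
import numpy as np
import pandas as pd

# Hypothetical urban indicators with some built-in dependencies.
rng = np.random.default_rng(3)
n = 120  # e.g., monthly observations per district
traffic = rng.normal(size=n)
air_quality = -0.8 * traffic + rng.normal(scale=0.4, size=n)
retail_sales = 0.5 * traffic + rng.normal(scale=0.8, size=n)
rainfall = rng.normal(size=n)

urban = pd.DataFrame({"traffic": traffic, "air_quality": air_quality,
                      "retail_sales": retail_sales, "rainfall": rainfall})

corr = urban.corr()  # Pearson correlation matrix across datasets

def recommend(dataset: str, top: int = 2) -> list:
    """Edges of the weighted data network for one dataset: the other
    datasets ranked by absolute correlation strength."""
    w = corr[dataset].drop(dataset).abs().sort_values(ascending=False)
    return list(w.index[:top])
```

Calling `recommend("traffic")` returns the datasets most strongly tied to traffic, which is the kind of weighted recommendation the abstract describes providing to users.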
Procedia PDF Downloads 419