Search results for: survival data analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 41926

41296 The World in the 21st Century and Beyond: Convergence or Invariance

Authors: Saleh Maina

Abstract:

There is an ongoing debate among intellectuals and scholars of international relations and world politics over the direction in which the world is heading, particularly in the current era of globalization. On one hand are adherents of the convergence thesis, which is premised on the assumption that the global social order is tending toward universalism, which could translate into the possible end of the classical state system and the unification of world societies under a single and common ideological dispensation. The convergence thesis hinges on the globalization process, which is gradually reducing world societies to a 'global village'. On the other hand are intellectuals who hold the view that, despite advances in communication technology that appear to threaten the survival of the classical state system, invariance, as expressed in the survival of the existing state system and the diverse social traditions in world societies, remains a realistic possibility, contrary to the conclusions of the convergence thesis. The invariance thesis has been advanced by scholars like Samuel P. Huntington, whose work on the clash of civilizations suggests that world peace can only be sustained through the co-habitation of diverse civilizations across the world. The purpose of this paper is to examine both sides of the debate with the aim of making a realistic assessment of where world societies are headed, between convergence and invariance. Using the realist theory of international relations as its theoretical premise, the paper argues that while there is sufficient ground to predict that world societies are headed towards some form of convergence, invariance, as expressed in the co-existence of diverse civilizations, will for a long time remain a major feature of the international system.

Keywords: convergence, invariance, clash of civilization, classical state system, universalism

Procedia PDF Downloads 300
41295 Fuzzy Approach for Fault Tree Analysis of Water Tube Boiler

Authors: Syed Ahzam Tariq, Atharva Modi

Abstract:

This paper presents a probabilistic analysis of the safety of water tube boilers using fault tree analysis (FTA). A fault tree has been constructed by considering all possible areas where a malfunction could lead to a boiler accident. Boiler accidents are relatively rare, causing a scarcity of data. The fuzzy approach is therefore employed to perform a quantitative analysis, wherein fuzzy logic is combined with expert elicitation to calculate failure probabilities. The Fuzzy Fault Tree Analysis (FFTA) provides a scientific and contingent method to forecast and prevent accidents.
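The gate arithmetic behind FFTA can be sketched in a few lines. In this illustrative Python sketch (not the paper's model), expert-elicited basic-event likelihoods are triangular fuzzy numbers (low, mode, high); AND gates multiply component-wise, OR gates combine complements, and a centroid defuzzification yields a crisp probability score. All event names and probabilities are hypothetical.

```python
# Minimal sketch of fuzzy fault tree gate arithmetic with triangular fuzzy
# numbers (low, mode, high). Event names and values are hypothetical.

def fuzzy_and(*events):
    """AND gate: component-wise product of triangular fuzzy probabilities."""
    result = (1.0, 1.0, 1.0)
    for l, m, u in events:
        result = (result[0] * l, result[1] * m, result[2] * u)
    return result

def fuzzy_or(*events):
    """OR gate: 1 - product of complements, component-wise."""
    comp = (1.0, 1.0, 1.0)
    for l, m, u in events:
        comp = (comp[0] * (1 - l), comp[1] * (1 - m), comp[2] * (1 - u))
    return (1 - comp[0], 1 - comp[1], 1 - comp[2])

def defuzzify(tri):
    """Centroid of a triangular fuzzy number -> crisp probability score."""
    return sum(tri) / 3.0

# Hypothetical basic events: (low water level AND failed alarm) OR tube scaling.
low_water = (0.01, 0.02, 0.04)
bad_alarm = (0.05, 0.10, 0.20)
scaling   = (0.001, 0.002, 0.005)

top_event = fuzzy_or(fuzzy_and(low_water, bad_alarm), scaling)
print(round(defuzzify(top_event), 5))
```

In a real FFTA the fuzzy operators and defuzzification rule vary by study; the structure (gates composed bottom-up, then defuzzified) is the common core.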

Keywords: fault tree analysis, water tube boiler, fuzzy probability score, failure probability

Procedia PDF Downloads 112
41294 A Study on the HTML5 Based Multi Media Contents Authority Tool

Authors: Heesuk Seo, Yongtae Kim

Abstract:

Online learning began in the 1990s with the spread of the Internet; online education has since moved through the e-learning paradigm into the current era of smart learning. Reflecting the anytime-anywhere nature of mobile devices, learning can now take place anywhere, through both learning content and interaction. We are developing a cloud system, 'TLINKS CLOUD', that allows the configuration of a smart learning environment without the need for additional infrastructure. Using big-data analysis of e-learning contents, we provide an integrated e-learning solution tailored to individual study.

Keywords: authority tool, big data analysis, e-learning, HTML5

Procedia PDF Downloads 401
41293 JavaScript Object Notation Data against eXtensible Markup Language Data in Software Applications: A Software Testing Approach

Authors: Theertha Chandroth

Abstract:

This paper presents a comparative study on how to check JSON (JavaScript Object Notation) data against XML (eXtensible Markup Language) data from a software testing point of view. JSON and XML are widely used data interchange formats, each with its unique syntax and structure. The objective is to explore various techniques and methodologies for validating, comparing, and integrating JSON data with XML data and vice versa. By understanding the process of checking JSON data against XML data, testers, developers, and data practitioners can ensure accurate data representation, seamless data interchange, and effective data validation.
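As a minimal illustration of the kind of check described, the following standard-library Python sketch parses the same hypothetical record from JSON and XML and compares it field by field; the record and field names are invented for the example.

```python
# Minimal sketch: check JSON data against the same record serialized as XML,
# using only the standard library. Record and field names are hypothetical.
import json
import xml.etree.ElementTree as ET

json_doc = '{"user": {"id": "42", "name": "Ada"}}'
xml_doc = '<user><id>42</id><name>Ada</name></user>'

def xml_to_dict(elem):
    """Flatten a simple XML element (leaf children only) into a dict."""
    return {child.tag: child.text for child in elem}

json_user = json.loads(json_doc)["user"]
xml_user = xml_to_dict(ET.fromstring(xml_doc))

# Field-by-field comparison, as a tester would assert equivalence.
mismatches = {k for k in json_user if json_user.get(k) != xml_user.get(k)}
print(mismatches)  # empty set -> the two representations agree
```

A production harness would also normalize types (XML text is always a string), handle attributes and nesting, and report differences per path, but the shape of the check is the same.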

Keywords: XML, JSON, data comparison, integration testing, Python, SQL

Procedia PDF Downloads 128
41292 Multichannel Analysis of the Surface Waves of Earth Materials in Some Parts of Lagos State, Nigeria

Authors: R. B. Adegbola, K. F. Oyedele, L. Adeoti

Abstract:

We present a method that utilizes Multichannel Analysis of Surface Waves (MASW), which was used to measure shear wave velocities with a view to establishing the probable causes of road failure, subsidence, and weakening of structures in some Local Government Areas of Lagos, Nigeria. MASW data were acquired using a 24-channel seismograph. The acquired data were processed and transformed into a two-dimensional (2-D) structure reflective of depth and surface wave velocity distribution within a depth of 0-15 m beneath the surface using SURFSEIS software. The shear wave velocity data were compared with other geophysical/borehole data that were acquired along the same profile. The comparison and correlation illustrate the accuracy and consistency of MASW-derived shear wave velocity profiles. Rigidity modulus and N-value were also generated. The study showed that the low/very low velocity zones are reflective of organic clay/peat materials and are thus likely responsible for the failure, subsidence, and weakening of structures within the study areas.

Keywords: seismograph, road failure, rigidity modulus, N-value, subsidence

Procedia PDF Downloads 356
41291 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate

Authors: Angela Maria Fasnacht

Abstract:

Detection of anomalies due to contaminants’ presence, also known as early detection systems in water treatment plants, has become a critical point that deserves an in-depth study for their improvement and adaptation to current requirements. The design of these systems requires a detailed analysis and processing of the data in real-time, so it is necessary to apply various statistical methods appropriate to the data generated, such as Spearman’s Correlation, Factor Analysis, Cross-Correlation, and k-fold Cross-validation. Statistical analysis and methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment or analysis could be considered a vital step to be able to develop advanced models focused on machine learning that allows optimized data management in real-time, applied to early detection systems in water treatment processes. These techniques facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify the possible correlations between the measured parameters and the presence of the glyphosate contaminant in the single-pass system. The interaction between the initial concentration of glyphosate and the location of the sensors on the reading of the reported parameters was studied.
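Of the methods listed, Spearman's correlation is the simplest to sketch. The stdlib Python below ranks two hypothetical series (a probe reading and a spiked glyphosate level; the values are invented, not the study's data) and applies the no-ties formula rho = 1 - 6*sum(d^2)/(n(n^2-1)).

```python
# Illustrative stdlib sketch of Spearman's rank correlation between a sensor
# reading and hypothetical glyphosate levels. Values are made up.
def ranks(xs):
    """Rank positions starting at 1 (assumes no ties, for simplicity)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    n = len(x)
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

chlorine_residual = [1.9, 1.7, 1.4, 1.1, 0.8]   # hypothetical probe readings
glyphosate_ppb    = [0.0, 2.0, 5.0, 9.0, 14.0]  # hypothetical spiked levels

print(spearman(chlorine_residual, glyphosate_ppb))  # -1.0: perfectly inverse
```

In practice `scipy.stats.spearmanr` handles ties and p-values; the point here is only the rank-based monotonic-association idea underpinning the sensor screening.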

Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive

Procedia PDF Downloads 110
41290 Big Data: Appearance and Disappearance

Authors: James Moir

Abstract:

The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data. These can range from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanations to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and show their appearance through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanations. Now, twenty-first-century science in the form of Big Data is concerned with the prediction of appearances and nothing more. However, this pulls social science back in the direction of a more rule- or law-governed reality model of science and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes disappear any context-specific conceptual sensitivity.

Keywords: big data, appearance, disappearance, surface, epistemology

Procedia PDF Downloads 409
41289 Diversity and Ecological Analysis of Vascular Epiphytes in Gera Wild Coffee Forest, Jimma Zone of Oromia Regional State, Ethiopia

Authors: Bedilu Tafesse

Abstract:

The diversity and ecological analysis of vascular epiphytes was studied in Gera Forest in southwestern Ethiopia at altitudes between 1600 and 2400 m.a.s.l. A total area of 4.5 ha was surveyed in coffee and non-coffee forest vegetation. Fifty sampling plots, each 30 m x 30 m (900 m2), were used for data collection. A total of 59 species of vascular epiphytes were recorded, of which 34 (59%) were holoepiphytes, two (4%) were hemiepiphytes, and 22 (37%) were accidental vascular epiphytes. To study the altitudinal distribution of vascular epiphytes, altitudes were classified into higher (>2000), middle (1800-2000), and lower (1600-1800) m.a.s.l. According to the Shannon-Wiener index (H′ = 3.411) of alpha diversity, the epiphyte community in the study area is of medium diversity. There was a statistically significant difference between host bark type and epiphyte richness as determined by one-way ANOVA (p = 0.001 < 0.05). The post-hoc test shows a significant difference in vascular epiphyte richness between smooth bark and rough, flaky, and corky bark (p = 0.001 < 0.05), as well as between rough and corky bark (p = 0.43 < 0.05). However, between rough and flaky bark (p = 0.753 > 0.05) and between flaky and corky bark (p = 0.854 > 0.05), no significant difference in epiphyte abundance was observed. Rough bark hosted 38% of vascular epiphyte abundance, corky 26%, flaky 25%, and smooth bark only 11%. A regression correlation test (R2 = 0.773, p = 0.0001 < 0.05) showed that the number of vascular epiphyte species and host DBH size are positively correlated. A further regression correlation test (R2 = 0.28, p = 0.0001 < 0.05) showed that the number of species and host tree height are positively correlated. Host tree preference of vascular epiphytes was recorded only for Vittaria volkensii, hosted on Syzygium guineense trees. The similarity analysis indicated that Gera Forest showed the highest vascular epiphytic similarity (0.35) with Yayu Forest and the least (0.295) with Harenna Forest. It was concluded that horizontal stems and branches and large trees with rough, flaky, and corky bark are more suitable for vascular epiphyte seedling attachment and growth. Conservation and protection of these phorophytes are important for the survival of vascular epiphytes and for increasing their ecological importance.
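The one-way ANOVA used above reduces to comparing between-group and within-group variance. A stdlib Python sketch with invented per-tree species counts (not the study's data) grouped by bark type:

```python
# Stdlib sketch of one-way ANOVA: epiphyte species counts per host tree,
# grouped by bark type. All counts are invented for illustration.
def one_way_anova_F(groups):
    """F statistic: between-group mean square / within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

rough  = [8, 9, 10, 9]  # hypothetical species counts per host tree
smooth = [2, 3, 2, 1]
corky  = [6, 7, 5, 6]

F = one_way_anova_F([rough, smooth, corky])
print(round(F, 2))  # a large F -> bark types differ in epiphyte richness
```

The p-value then comes from the F distribution with (k-1, n-k) degrees of freedom, as `scipy.stats.f_oneway` would report directly.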

Keywords: accidental epiphytes, hemiepiphyte, holoepiphyte, phorophyte

Procedia PDF Downloads 323
41288 Statistical Correlation between Logging-While-Drilling Measurements and Wireline Caliper Logs

Authors: Rima T. Alfaraj, Murtadha J. Al Tammar, Khaqan Khan, Khalid M. Alruwaili

Abstract:

OBJECTIVE/SCOPE: Caliper logging data provides critical information about wellbore shape and deformations, such as stress-induced borehole breakouts or washouts. Multiarm mechanical caliper logs are often run using wireline, which can be time-consuming, costly, and/or challenging to run in certain formations. To minimize rig time and improve operational safety, it is valuable to develop analytical solutions that can estimate caliper logs using available Logging-While-Drilling (LWD) data without the need to run wireline caliper logs. As a first step, the objective of this paper is to perform statistical analysis using an extensive dataset to identify important physical parameters that should be considered in developing such analytical solutions. METHODS, PROCEDURES, PROCESS: Caliper logs and LWD data of eleven wells, with a total of more than 80,000 data points, were obtained and imported into a data analytics software for analysis. Several parameters were selected to test their relationship with the measured maximum and minimum caliper logs. These parameters include gamma ray, porosity, shear and compressional sonic velocities, bulk density, and azimuthal density. The data of the eleven wells were first visualized and cleaned. Using the analytics software, several analyses were then performed, including the computation of Pearson's correlation coefficients to show the statistical relationship between the selected parameters and the caliper logs. RESULTS, OBSERVATIONS, CONCLUSIONS: The results of this statistical analysis showed that some parameters correlate well with the caliper log data. For instance, the bulk density and azimuthal directional densities showed Pearson's correlation coefficients in the range of 0.39 to 0.57, which were relatively high compared to the correlation coefficients of caliper data with other parameters. Other parameters, such as porosity, exhibited extremely low correlation coefficients with the caliper data. Various crossplots and visualizations of the data were also demonstrated to gain further insights from the field data. NOVEL/ADDITIVE INFORMATION: This study offers a unique and novel look into the relative importance and correlation between different LWD measurements and wireline caliper logs via an extensive dataset. The results pave the way for a more informed development of new analytical solutions for estimating the size and shape of the wellbore in real time while drilling using LWD data.
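The Pearson screening step can be reproduced in a few lines. The sketch below correlates a hypothetical bulk-density log with caliper readings sampled at the same depths; the numbers are invented, not the paper's dataset.

```python
# Stdlib sketch of the Pearson screening step: correlating a hypothetical
# bulk-density log with caliper values sampled at the same depths.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

bulk_density = [2.30, 2.35, 2.20, 2.50, 2.45]  # hypothetical LWD samples, g/cc
caliper_max  = [8.9, 8.7, 9.4, 8.5, 8.6]       # hypothetical caliper, inches

r = pearson(bulk_density, caliper_max)
print(round(r, 2))  # strongly negative here: washed-out zones read less dense
```

Screening many candidate LWD curves this way (as the paper does across eleven wells) simply repeats this computation per curve and ranks |r|.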

Keywords: LWD measurements, caliper log, correlations, analysis

Procedia PDF Downloads 111
41287 Effects of AG1 and AG2 QTLs on Rice Seedling Growth and Physiological Processes during Germination in Flooded Soils

Authors: Satyen Mondal, Frederickson Entila, Shalabh Dixit, Pompe C. Sta. Cruz, Abdelbagi M. Ismail

Abstract:

Anaerobic conditions caused by flooding during germination in direct-seeded rice systems, known as anaerobic germination (AG), severely reduce crop establishment in both rainfed and irrigated areas. Seeds germinating in flooded soils can encounter hypoxia, or even anoxia in severe cases, and this hinders germination and seedling growth. This study was conducted to quantify the effects of incorporating two major QTLs, AG1 and AG2, associated with tolerance of flooding during germination, and to assess their interactive effects on enhancing crop establishment. A greenhouse experiment was conducted at the International Rice Research Institute (IRRI), Los Baños, Philippines, using elite lines incorporating AG1, AG2, and AG1+AG2 in the background of the popular varieties PSBRc82 (PSBRc82-AG1, PSBRc82-AG2, PSBRc82-AG1+AG2) and Ciherang-Sub1 (Ciherang-Sub1-AG1, Ciherang-Sub1-AG2, Ciherang-Sub1-AG1+AG2), along with the donors Kho Hlan On (for AG1) and Ma-Zhan Red (for AG2) and the recipients PSBRc82 and Ciherang-Sub1. The experiment was conducted in concrete tanks in an RCBD with three replications. Dry seeds were sown in seedling trays and then flooded to a water depth of 10 cm. Seedling survival, root and shoot growth, and relative growth rate were measured. The germinating seedlings were used for assaying nonstructural carbohydrate (NSC) and ascorbate concentrations, lipid peroxidation, total phenolic concentration, reactive oxygen species, and total amylase enzyme activity. Flooding reduced overall survival, though the survival of AG1+AG2 introgression lines was greater than that of the other genotypes. Soluble sugars increased, while starch concentration decreased gradually under flooding, especially in the tolerant checks and AG1+AG2 introgression lines. Less lipid peroxidation and higher amylase activity, reduced ascorbate (RAsA), and total phenolic contents (TPC) were observed in the tolerant checks and in the AG1+AG2 introgression lines. Lipid peroxidation correlated negatively with ascorbate and total phenolic concentrations and with reactive oxygen species (ROS). Introgression of the AG1+AG2 QTLs upregulated total amylase activity, causing rapid starch degradation and an increase in ascorbate and total phenolic concentrations, resulting in higher germination and seedling growth in flooded soils.

Keywords: amylase, anaerobic germination, ascorbate, direct-seeded rice, flooding, lipid peroxidation

Procedia PDF Downloads 271
41286 Cross Project Software Fault Prediction at Design Phase

Authors: Pradeep Singh, Shrish Verma

Abstract:

Software fault prediction models are created by using the source code, metrics computed from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts required for software fault prediction. To construct a fault prediction model for such companies, training data from other projects can be one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data is available. We analyze seven data sets from the NASA Metrics Data Program, which offer design as well as code metrics. Overall, the results of cross-project prediction are comparable to within-company learning.
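Gaussian Naïve Bayes, the evaluation technique named above, is compact enough to sketch directly: fit per-class feature means and variances on one (hypothetical) project's design metrics, then score modules from another. Metric names, values, and labels below are all invented.

```python
# Toy stdlib sketch of cross-project prediction with Gaussian Naive Bayes:
# train on design metrics of one hypothetical project, classify modules of
# another. Features here are (fan-in, fan-out); label 1 = faulty.
import math

def fit(X, y):
    """Per-class prior plus per-feature mean/variance."""
    model = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        stats = []
        for f in range(len(X[0])):
            vals = [r[f] for r in rows]
            mu = sum(vals) / len(vals)
            var = sum((v - mu) ** 2 for v in vals) / len(vals) + 1e-9
            stats.append((mu, var))
        model[c] = (len(rows) / len(y), stats)
    return model

def predict(model, x):
    def log_post(c):
        prior, stats = model[c]
        lp = math.log(prior)
        for v, (mu, var) in zip(x, stats):
            lp += -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
        return lp
    return max(model, key=log_post)

train_X = [(2, 1), (3, 2), (2, 2), (9, 7), (8, 6), (10, 8)]  # "project A"
train_y = [0, 0, 0, 1, 1, 1]
model = fit(train_X, train_y)
print(predict(model, (9, 6)))  # a high-coupling "project B" module -> faulty
```

Real cross-project studies add metric normalization between projects, which this toy omits; `sklearn.naive_bayes.GaussianNB` implements the same model.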

Keywords: software metrics, fault prediction, cross project, within project

Procedia PDF Downloads 334
41285 Performance Analysis of Multichannel OCDMA-FSO Network under Different Pervasive Conditions

Authors: Saru Arora, Anurag Sharma, Harsukhpreet Singh

Abstract:

To meet the growing need for high data rates and bandwidth, various efforts have been made toward efficient communication systems. Optical Code Division Multiple Access (OCDMA) over a Free Space Optics (FSO) communication system appears to be an effective approach for providing transmission at high data rates with a low bit error rate and a low amount of multiple access interference. This paper demonstrates an OCDMA over FSO communication system up to a range of 7000 m at a data rate of 5 Gbps. Initially, an 8-user OCDMA-FSO system is simulated, and pseudo-orthogonal codes are used for encoding. A simulative analysis of various performance parameters, such as power and core effective area, that affect the bit error rate (BER) of the system is also carried out. The simulative analysis reveals that the length of the transmission is limited by the multiple access interference (MAI) effect, which arises as the number of users in the system increases.
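BER and Q-factor, the figures of merit mentioned, are linked by the standard Gaussian-noise relation BER = (1/2)·erfc(Q/√2). A quick stdlib check of that relation (the Q values are illustrative, not the simulated system's):

```python
# Standard Gaussian-noise relation between Q-factor and bit error rate,
# checked with illustrative Q values (not the paper's simulation output).
import math

def ber_from_q(q):
    return 0.5 * math.erfc(q / math.sqrt(2))

for q in (3, 6, 7):
    print(f"Q = {q}: BER = {ber_from_q(q):.2e}")
```

This is why Q ≈ 6 is the usual benchmark for an "error-free" optical link: it corresponds to a BER of about 1e-9.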

Keywords: FSO, PSO, bit error rate (BER), OptiSystem simulation, multiple access interference (MAI), Q-factor

Procedia PDF Downloads 360
41284 Investigation of the Effects of Monoamine Oxidase Levels on the 20S Proteasome

Authors: Bhavini Patel, Aslihan Ugun-Klusek, Ellen Billet

Abstract:

The two main contributing factors to the familial and idiopathic forms of Parkinson's disease (PD) are oxidative stress and altered proteolysis. Monoamine oxidase-A (MAO-A) plays a significant role in redox homeostasis by producing reactive oxygen species (ROS) via deamination of, for example, dopamine. The ROS generated induce chemical modification of proteins, resulting in altered biological function. The ubiquitin-proteasome system (UPS), which carries out three different types of proteolytic activity, namely 'chymotrypsin-like' activity (CLA), 'trypsin-like' activity (TLA), and 'post-acidic-like' activity (PLA), is responsible for the degradation of ubiquitinated proteins. Defects in the UPS are known to be strongly correlated with PD. Herein, the effect of ROS generated by MAO-A on proteasome activity, and the effects of proteasome inhibition on MAO-A protein levels, were investigated in wild-type (WT), mock, and MAO-A-overexpressing (MAO-A+) SH-SY5Y neuroblastoma cell lines. The data in this study show increased proteolytic activity, in particular CLA and PLA, when MAO-A protein levels are significantly increased. Additionally, 20S proteasome inhibition induced a decrease in MAO-A levels in WT and mock cells, whereas in MAO-A+ cells MAO-A levels increased further at 48 hours of inhibition. This study supports the view that MAO-A could be a potential pharmaceutical target for neuronal protection, as the data suggest that endogenous MAO-A levels may be essential for modulating cell death and survival.

Keywords: monoamine oxidase, neurodegeneration, Parkinson's disease, proteasome

Procedia PDF Downloads 127
41283 Dynamic Analysis and Vibration Response of Thermoplastic Rolling Elements in a Rotor Bearing System

Authors: Nesrine Gaaliche

Abstract:

This study provides a finite element dynamic model for analyzing rolling bearing system vibration response. The vibration responses of polypropylene bearings with and without defects are studied using FE analysis and compared to experimental data. The viscoelastic behavior of thermoplastic is investigated in this work to evaluate the influence of material flexibility and damping viscosity. The vibrations are detected using 3D dynamic analysis. Peak vibrations are more noticeable in an inner ring defect than in an outer ring defect, according to test data. The performance of thermoplastic bearings is compared to that of metal parts using vibration signals. Both the test and numerical results show that Polypropylene bearings exhibit less vibration than steel counterparts. Unlike bearings made from metal, polypropylene bearings absorb vibrations and handle shaft misalignments. Following validation of the overall vibration spectrum data, Von Mises stresses inside the rings are assessed under high loads. Stress is significantly high under the balls, according to the simulation findings. For the test cases, the computational findings correspond closely to the experimental results.
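Inner- versus outer-race defect signatures appear at different characteristic frequencies in the vibration spectrum, which is what makes the two defect cases above separable. As background, here is a sketch using the standard bearing kinematics formulas (BPFO/BPFI) with hypothetical geometry, not the paper's bearing:

```python
# Standard bearing defect-frequency kinematics (textbook formulas, not the
# paper's FE model). Geometry and shaft speed below are hypothetical.
import math

def defect_frequencies(shaft_hz, n_balls, ball_d, pitch_d, contact_deg=0.0):
    """Ball-pass frequencies for a stationary outer ring."""
    c = (ball_d / pitch_d) * math.cos(math.radians(contact_deg))
    bpfo = (n_balls / 2) * shaft_hz * (1 - c)  # outer-race defect frequency
    bpfi = (n_balls / 2) * shaft_hz * (1 + c)  # inner-race defect frequency
    return bpfo, bpfi

bpfo, bpfi = defect_frequencies(shaft_hz=30.0, n_balls=8, ball_d=7.9, pitch_d=38.5)
print(round(bpfo, 1), round(bpfi, 1))  # inner-race tones sit higher than outer
```

An inner-race defect also passes through the load zone once per revolution, modulating its tone, which is consistent with inner-ring defects producing the more prominent peaks reported above.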

Keywords: viscoelastic, FE analysis, polypropylene, bearings

Procedia PDF Downloads 97
41282 Infrastructural Investment and Economic Growth in Indian States: A Panel Data Analysis

Authors: Jonardan Koner, Basabi Bhattacharya, Avinash Purandare

Abstract:

The study is focused on finding out the impact of infrastructural investment on economic development in Indian states. It uses panel data analysis to measure the impact of infrastructural investment on real Gross Domestic Product in Indian states. The panel data analysis incorporates a unit root test, a cointegration test, pooled ordinary least squares, the fixed effect approach, the random effect approach, and the Hausman test. The study analyzes panel data (annual in frequency) ranging from 1991 to 2012 and concludes that infrastructural investment has a desirable impact on economic development in Indian states. Finally, the study reveals that infrastructural investment significantly explains the variation in the economic indicator.
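The fixed effect approach can be sketched via the within transformation: demean each state's series to absorb state-specific effects, then pool. The stdlib Python below uses invented (investment, GDP) pairs, not the study's data:

```python
# Stdlib sketch of the fixed-effects ("within") estimator: demean x and y
# within each state, then pool the demeaned data into one slope. The
# (investment, real GDP) pairs per state are invented for illustration.
def within_slope(panel):
    """panel: {state: [(x, y), ...]} -> pooled slope after demeaning."""
    num = den = 0.0
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            num += (x - mx) * (y - my)
            den += (x - mx) ** 2
    return num / den

panel = {
    "A": [(1.0, 10.0), (2.0, 12.0), (3.0, 14.0)],
    "B": [(4.0, 30.0), (5.0, 32.0), (6.0, 34.0)],
}
print(within_slope(panel))  # 2.0: within each state, one unit of x adds 2 to y
```

Note how pooled OLS on the raw data would be biased upward here by the level difference between states A and B; demeaning removes exactly that state fixed effect, which is what the Hausman test then compares against the random effect estimate.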

Keywords: infrastructural investment, real GDP, unit root test, cointegration test, pooled ordinary least squares, fixed effect approach, random effect approach, Hausman test

Procedia PDF Downloads 397
41281 Reviewing Privacy Preserving Distributed Data Mining

Authors: Sajjad Baghernezhad, Saeideh Baghernezhad

Abstract:

Nowadays, given the human role in ever-increasing data generation, methods such as data mining for extracting knowledge are unavoidable. One issue in data mining is the inherent distribution of the data: the organizations creating or receiving such data, whether corporate or non-corporate, usually do not give their information freely to others. Yet there is no guarantee that specific data can be mined without intruding on the owner's privacy. Data may be partitioned vertically or horizontally across sites, and the methods for exchanging and then aggregating it depend on the partitioning and on the privacy-preservation technique employed. In this study, a comprehensive comparison of privacy-preserving data mining methods is attempted; general techniques such as data randomization and encoding are also examined, along with the strong and weak points of each.
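One of the general methods surveyed, data randomization, can be sketched directly: each record is perturbed with random noise so individual values are masked while aggregates remain approximately recoverable. The records below are invented:

```python
# Stdlib sketch of randomization-based privacy preservation: additive noise
# masks individual records while the aggregate (mean) stays usable.
# The salary records are hypothetical.
import random

random.seed(7)  # fixed seed so the sketch is reproducible

salaries = [52_000, 61_000, 48_000, 75_000, 58_000]
noisy = [s + random.gauss(0, 5_000) for s in salaries]  # published values

true_mean = sum(salaries) / len(salaries)
noisy_mean = sum(noisy) / len(noisy)
print(round(true_mean), round(noisy_mean))  # means stay close; rows are masked
```

The trade-off the survey weighs is exactly this one: more noise means better privacy for each row but worse utility for the miner, and reconstruction attacks constrain how little noise is safe.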

Keywords: data mining, distributed data mining, privacy protection, privacy preserving

Procedia PDF Downloads 516
41280 Analysis of Delivery of Quad Play Services

Authors: Rahul Malhotra, Anurag Sharma

Abstract:

Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network. This paper is targeted at showing the simultaneous delivery of triple play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to an increase in the bit error rate.

Keywords: FTTH, quad play, play service, access networks, data rate

Procedia PDF Downloads 400
41279 The Right to Data Portability and Its Influence on the Development of Digital Services

Authors: Roman Bieda

Abstract:

The General Data Protection Regulation (GDPR) will come into force on 25 May 2018, creating a new legal framework for the protection of personal data in the European Union. Article 20 of the GDPR introduces a right to data portability. This right allows data subjects to receive the personal data which they have provided to a data controller in a structured, commonly used, and machine-readable format, and to transmit this data to another data controller. The right to data portability, by facilitating the transfer of personal data between IT environments (e.g., applications), will also facilitate changing the provider of services (e.g., changing a bank or a cloud computing service provider). Therefore, it will contribute to the development of competition and the digital market. The aim of this paper is to discuss the right to data portability and its influence on the development of new digital services.

Keywords: data portability, digital market, GDPR, personal data

Procedia PDF Downloads 467
41278 Songs from the Cradle: An Analysis of Some Selected Nupe Songs

Authors: Zainab Zendana Shafii

Abstract:

Lullabies have been broadly defined as songs that are sung to calm and soothe children. While this is true, this paper intends to show that lullabies exceed these functions. In exploring Nupe lullabies, the paper examines the various functions that lullabies perform in terms of language development, cultural enrichment, and the retelling of history as it relates to the culture of the Nupe people of northern Nigeria. The theoretical framework used is the functionalist theory, which is based on the premise that all aspects of a society—institutions, roles, norms, etc.—serve a purpose and that all are indispensable for the long-term survival of the society. To this end, this paper dissects various lullabies in Nupeland with a view to exploring the meanings that these songs generate and why they are sung at all. The qualitative research methodology has been used to gather materials.

Keywords: Nupe, lullabies, Nigeria, northern

Procedia PDF Downloads 187
41277 Mobile Learning: Toward Better Understanding of Compression Techniques

Authors: Farouk Lawan Gambo

Abstract:

Data compression shrinks files into fewer bits than their original representation. It is advantageous on the Internet because the smaller a file, the faster it can be transferred; however, most of the concepts in data compression are abstract in nature, making them difficult to digest for some students (engineers in particular). To determine the best approach toward learning data compression techniques, this paper first studies the learning preferences of engineering students, who tend to have strong active, sensing, visual, and sequential learning preferences. The paper also considers the advantages that mobile learning offers: learning at the point of interest, efficiency, connection, and many more. A survey was carried out with a reasonable number of students, selected through random sampling, to see whether taking account of learning preferences and the advantages of mobile learning gives a promising improvement over the traditional way of learning. Evidence from data analysis, carried out in MS Excel with attention to error-free findings, shows that there is a significant difference in the students after using learning content provided on smartphones; the findings, presented in bar charts and pie charts, suggest that mobile learning is a promising mode of learning.
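The core idea students find abstract can be demonstrated concretely with the standard library: repetitive data compresses to a small fraction of its original size, and decompression recovers it exactly.

```python
# Lossless compression in three lines (stdlib zlib): repetitive data shrinks
# to far fewer bits than its original representation.
import zlib

text = b"mobile learning " * 200        # 3200 bytes of repetitive content
packed = zlib.compress(text, level=9)

print(len(text), len(packed))           # compressed size is a small fraction
assert zlib.decompress(packed) == text  # lossless: original fully recovered
```

Contrasting this with an incompressible input (e.g., `os.urandom(3200)`, which barely shrinks at all) makes the entropy argument tangible rather than abstract.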

Keywords: data analysis, compression techniques, learning content, traditional learning approach

Procedia PDF Downloads 341
41276 Growth Response of the Fry of Major and Chinese Carp to the Dietary Ingredients in Polyculture System

Authors: Anjum-Zubair, Muhammad, Muhammad Shoaib Alam, Muhammad Samee Mubarik, Iftikhar Ahmad

Abstract:

The aim of the present research was to evaluate the effect of dietary protein (soybean) formulated feed on the growth performance of carp fish seed (Rohu, Mori, Grass, and Gulfam) in ponds under a polyculture system. Keeping in view the protein requirements of these four carps, they were fed with a formulated feed containing 30% crude protein. The fingerlings were fed once daily at 5% of their wet body weight. A 90-day experiment was conducted in two cemented ponds situated at the Fish Seed Hatchery and Research Centre, Rawal Town, Islamabad, Pakistan. Pond 1 contained major carps, i.e., Rohu and Mori, while pond 2 was stocked with Chinese carps, i.e., Grass carp and Gulfam. Random sampling of five individuals of each species was done fortnightly to measure body weight and total body length. Maximum growth was observed in fingerlings of Grass carp, followed by Mori, Rohu, and Gulfam. Total fish production was recorded as Grass 623.45 g, followed by Mori 260.3 g, Rohu 243.08 g, and Gulfam 181.165 g, respectively. Significant differences were obtained among these four fish species when the corresponding data were subjected to statistical analysis using a two-sample t-test. The survival rate was 100%. The study shows that soybean, as a plant-based protein, can easily be used as a substitute for fish meal without any adverse effect on fish health or production.
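The two-sample t statistic used in the analysis can be sketched with the standard library; the fortnightly weight gains below are invented, not the experiment's measurements (Welch's unequal-variance form is used here as an assumption):

```python
# Stdlib sketch of a two-sample t statistic (Welch's form) on hypothetical
# fortnightly weight gains (g) for two of the species.
import math

def t_statistic(a, b):
    """Welch's t for two independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

grass = [120, 131, 125, 128, 122]  # invented weight gains, Grass carp
rohu  = [48, 52, 50, 47, 53]       # invented weight gains, Rohu

print(round(t_statistic(grass, rohu), 1))  # large |t| -> clear growth difference
```

The p-value then follows from the t distribution (e.g., via `scipy.stats.ttest_ind`); with a |t| this large the species difference is unambiguous.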

Keywords: carps, fry growth, poly culture, soybean meal

Procedia PDF Downloads 491
41275 Using RASCAL Code to Analyze the Postulated UF6 Fire Accident

Authors: J. R. Wang, Y. Chiang, W. S. Hsu, S. H. Chen, J. H. Yang, S. W. Chen, C. Shih, Y. F. Chang, Y. H. Huang, B. R. Shen

Abstract:

In this research, the RASCAL code was used to simulate and analyze the postulated UF6 fire accident which may occur at the Institute of Nuclear Energy Research (INER). There are four main steps in this research. In the first step, the UF6 data of INER were collected. In the second step, the RASCAL analysis methodology and model were established by using these data. Third, this RASCAL model was used to perform the simulation and analysis of the postulated UF6 fire accident; three cases were simulated and analyzed in this step. Finally, the analysis results of RASCAL were compared with the hazardous levels of the chemicals. According to the compared results of the three cases, Case 3 poses the greatest danger to human health.

Keywords: RASCAL, UF₆, safety, hydrogen fluoride

Procedia PDF Downloads 208
41274 Anomaly Detection in Financial Markets Using Tucker Decomposition

Authors: Salma Krafessi

Abstract:

The financial markets present a multifaceted, intricate environment, and enormous volumes of data are produced every day. Accurate anomaly identification in this data is essential for finding investment possibilities, potential fraudulent activity, and market oddities. Conventional methods for detecting anomalies frequently fail to capture the complex organization of financial data. In order to improve the identification of abnormalities in financial time series data, this study presents Tucker Decomposition as a reliable multi-way analysis approach. We start by gathering closing prices for the S&P 500 index across a number of decades. The information is converted into a three-dimensional tensor format, which contains internal characteristics and temporal sequences in a sliding window structure. The tensor is then broken down via Tucker Decomposition into a core tensor and matching factor matrices, allowing latent patterns and relationships in the data to be captured. The reconstruction error from the Tucker Decomposition serves as a possible sign of anomalies; by setting a statistical threshold, we are able to identify large deviations that indicate unusual behavior. A thorough examination that contrasts the Tucker-based method with traditional anomaly detection approaches validates our methodology. The outcomes demonstrate the superiority of Tucker Decomposition in identifying intricate and subtle abnormalities that are otherwise missed. This work opens the door for further research into multi-way data analysis approaches across a range of disciplines and emphasizes the value of tensor-based methods in financial analysis.
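
The pipeline described above (windowed tensor, Tucker factorization, reconstruction-error thresholding) can be sketched as follows. This is an illustrative implementation via a truncated higher-order SVD on synthetic data, not the authors' code; the window dimensions, ranks, and 3-sigma threshold are assumptions:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tucker_hosvd(T, ranks):
    """Truncated HOSVD: factor matrices from SVDs of the unfoldings,
    then the core tensor by projecting T onto them."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

def reconstruct(core, factors):
    """Multiply the core back through the factor matrices."""
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(
            np.tensordot(U, np.moveaxis(T, mode, 0), axes=1), 0, mode)
    return T

# Hypothetical windowed price tensor: (windows, window_length, features)
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 20, 3))
core, factors = tucker_hosvd(X, ranks=(5, 5, 2))
err = np.linalg.norm(X - reconstruct(core, factors), axis=(1, 2))
threshold = err.mean() + 3 * err.std()   # statistical cutoff
anomalies = np.where(err > threshold)[0]  # flagged window indices
```

A dedicated tensor library (e.g. TensorLy's `tucker`) would replace the hand-rolled HOSVD in production.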

Keywords: tucker decomposition, financial markets, financial engineering, artificial intelligence, decomposition models

Procedia PDF Downloads 55
41273 Analyzing Keyword Networks for the Identification of Correlated Research Topics

Authors: Thiago M. R. Dias, Patrícia M. Dias, Gray F. Moita

Abstract:

The production and publication of scientific works have increased significantly in recent years, with the Internet being the main channel for access to and distribution of these works. In view of this, there is growing interest in understanding how scientific research has evolved, in order to use this knowledge to encourage research groups to become more productive. The objective of this work is therefore to explore repositories containing data from scientific publications and to characterize the keyword networks of these publications, in order to identify the most relevant keywords and to highlight those that have the greatest impact on the network. To do this, the keywords of each article in the study repository are extracted and the network is characterized from them, after which several social network analysis metrics are applied to identify the highlighted keywords.

Keywords: bibliometrics, data analysis, extraction and data integration, scientometrics

Procedia PDF Downloads 250
41272 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are key components of software systems. Due to the exponential growth of data, it is a central concern that the data remain accurate and available. The data in databases are vulnerable to internal and external threats, especially when they contain sensitive data, as in medical or military applications. Whenever data are changed with malicious intent, data analysis results may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled on the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique which, on a priority basis, tries to prevent any malicious transaction from executing and, in case a malicious transaction does affect the system, heals the system in an isolated mode in such a way that the availability of the system is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. In the end, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 498
41271 A Novel Heuristic for Analysis of Large Datasets by Selecting Wrapper-Based Features

Authors: Bushra Zafar, Usman Qamar

Abstract:

Large sample sizes and high dimensionality undermine the effectiveness of conventional data mining methodologies. Data mining techniques are important tools for collecting knowledge from a variety of databases and provide supervised learning in the form of classification, designing models that describe vital data classes, where the structure of the classifier is based on the class attribute. Classification efficiency and accuracy are often greatly influenced by noisy and undesirable features in real application data sets. The inherent nature of a data set greatly masks its quality analysis and leaves quite few practical approaches to use. To our knowledge, we present for the first time an approach for investigating the structure and quality of datasets by providing a targeted analysis of the localization of noisy and irrelevant features. Machine learning relies on feature selection as a pre-processing step, which allows us to select a few features from the full set as a subset, reducing the space according to a certain evaluation criterion. The primary objective of this study is to narrow the scope of the given data sample by searching for a small set of important features which may result in good classification performance. For this purpose, a heuristic for wrapper-based feature selection using a genetic algorithm is used, with an external classifier for discriminative feature selection. Features are selected based on their number of occurrences in the chosen chromosomes. Sample datasets have been used to demonstrate the proposed idea effectively. The proposed method improved the average accuracy across different datasets to about 95%. Experimental results illustrate that the proposed algorithm increases the accuracy of prediction of different diseases.
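
The wrapper-based selection loop can be sketched roughly as follows; this is an illustrative genetic algorithm with a leave-one-out 1-NN fitness on synthetic data, not the authors' implementation, and the population size, mutation rate, and subset-size penalty are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dataset: 2 informative features + 8 noise features
n = 80
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, 10))
X[:, 0] += 3 * y          # informative
X[:, 1] -= 3 * y          # informative

def loo_1nn_accuracy(X, y, mask):
    """Leave-one-out 1-NN accuracy on the selected feature subset."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    d = np.linalg.norm(Xs[:, None] - Xs[None, :], axis=2)
    np.fill_diagonal(d, np.inf)   # exclude each point from its own neighbors
    return float((y[d.argmin(axis=1)] == y).mean())

def fitness(mask):
    # Small per-feature penalty so smaller subsets win ties
    return loo_1nn_accuracy(X, y, mask) - 0.01 * mask.sum()

def ga_select(pop_size=20, gens=15, p_mut=0.1):
    """Binary-chromosome GA: each bit switches one feature on or off."""
    pop = rng.random((pop_size, X.shape[1])) < 0.5
    for _ in range(gens):
        scores = np.array([fitness(m) for m in pop])
        # Binary tournament selection of parents
        picks = rng.integers(0, pop_size, (pop_size, 2))
        parents = pop[np.where(scores[picks[:, 0]] >= scores[picks[:, 1]],
                               picks[:, 0], picks[:, 1])]
        # Uniform crossover with a shuffled mate, then bit-flip mutation
        mates = parents[rng.permutation(pop_size)]
        cross = rng.random(parents.shape) < 0.5
        children = np.where(cross, parents, mates)
        pop = children ^ (rng.random(children.shape) < p_mut)
    scores = np.array([fitness(m) for m in pop])
    return pop[scores.argmax()]

best = ga_select()   # boolean mask over the 10 features
```

With the 3-sigma class shifts on features 0 and 1, the search tends to converge on masks that retain the informative dimensions and drop the noise.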

Keywords: data mining, genetic algorithm, KNN algorithm, wrapper-based feature selection

Procedia PDF Downloads 309
41270 Exploring the Role of Data Mining in Crime Classification: A Systematic Literature Review

Authors: Faisal Muhibuddin, Ani Dijah Rahajoe

Abstract:

This in-depth exploration, through a systematic literature review, scrutinizes the nuanced role of data mining in the classification of criminal activities. The research focuses on investigating various methodological aspects and recent developments in leveraging data mining techniques to enhance the effectiveness and precision of crime categorization. Commencing with an exposition of the foundational concepts of crime classification and its evolutionary dynamics, this study details the paradigm shift from conventional methods towards approaches supported by data mining, addressing the challenges and complexities inherent in the modern crime landscape. Specifically, the research delves into various data mining techniques, including K-means clustering, Naïve Bayes, K-nearest neighbour, and clustering methods. A comprehensive review of the strengths and limitations of each technique provides insights into their respective contributions to improving crime classification models. The integration of diverse data sources takes centre stage in this research. A detailed analysis explores how the amalgamation of structured data (such as criminal records) and unstructured data (such as social media) can offer a holistic understanding of crime, enriching classification models with more profound insights. Furthermore, the study explores the temporal implications in crime classification, emphasizing the significance of considering temporal factors to comprehend long-term trends and seasonality. The availability of real-time data is also elucidated as a crucial element in enhancing responsiveness and accuracy in crime classification.
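
One of the techniques the review covers, Naïve Bayes over structured records, can be sketched on hypothetical categorical data; the attributes, values, and crime labels below are invented for illustration:

```python
import math
from collections import Counter, defaultdict

def train_nb(records, labels):
    """Categorical Naive Bayes with add-one (Laplace) smoothing.

    records: list of dicts of categorical attributes; labels: crime classes.
    """
    class_counts = Counter(labels)
    attr_counts = defaultdict(Counter)   # (class, attr) -> value counts
    domains = defaultdict(set)           # attr -> set of observed values
    for rec, c in zip(records, labels):
        for attr, val in rec.items():
            attr_counts[(c, attr)][val] += 1
            domains[attr].add(val)
    return class_counts, attr_counts, domains

def predict_nb(model, rec):
    """Pick the class with the highest log-posterior for the record."""
    class_counts, attr_counts, domains = model
    total = sum(class_counts.values())
    def score(c):
        s = math.log(class_counts[c] / total)   # log prior
        for attr, val in rec.items():
            cnt = attr_counts[(c, attr)][val]
            s += math.log((cnt + 1) / (class_counts[c] + len(domains[attr])))
        return s
    return max(class_counts, key=score)

# Hypothetical structured records (e.g. fields from criminal-record data)
records = [
    {"time": "night", "area": "downtown"}, {"time": "night", "area": "downtown"},
    {"time": "day", "area": "suburb"},     {"time": "day", "area": "downtown"},
]
labels = ["burglary", "burglary", "fraud", "fraud"]
model = train_nb(records, labels)
pred = predict_nb(model, {"time": "night", "area": "downtown"})
```

The same interface extends naturally to text fields (e.g. social media posts) by treating tokens as categorical attribute values.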

Keywords: data mining, classification algorithm, naïve bayes, k-means clustering, k-nearest neighbor, crime, data analysis, systematic literature review

Procedia PDF Downloads 59
41269 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today’s era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve it; steganography, however, conceals the data in some cover file so that the very presence of communication is hidden. This paper presents the implementation of the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that the combined techniques performed better than the individual techniques. The risk of unauthorized access is alleviated to a certain extent by using these techniques. These techniques could be used in banks, agencies such as RAW, etc., where highly confidential data are transferred. Finally, a comparison of the two techniques is also given in tabular form.
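
The steganographic half of such a scheme can be illustrated with a minimal least-significant-bit (LSB) embedding over raw bytes; this is a generic Python sketch rather than the authors' MATLAB implementation, and the cover data is a stand-in for real pixel or audio samples (the payload would be the ciphertext produced by RSA or DES):

```python
def embed_lsb(cover, payload):
    """Hide payload bytes in the least significant bits of cover bytes.

    A 32-bit big-endian length header precedes the payload so that
    extraction knows where to stop. Requires
    len(cover) >= 8 * (len(payload) + 4).
    """
    data = len(payload).to_bytes(4, "big") + payload
    bits = [(byte >> (7 - i)) & 1 for byte in data for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for payload")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # overwrite only the LSB
    return bytes(stego)

def extract_lsb(stego):
    """Recover the hidden payload from the LSBs."""
    def read(n_bytes, offset):
        out = bytearray()
        for b in range(n_bytes):
            byte = 0
            for i in range(8):   # reassemble MSB-first, matching embed_lsb
                byte = (byte << 1) | (stego[offset + b * 8 + i] & 1)
            out.append(byte)
        return bytes(out)
    length = int.from_bytes(read(4, 0), "big")
    return read(length, 32)

cover = bytes(range(256)) * 4          # stand-in for raw sample data
secret = embed_lsb(cover, b"top secret")
recovered = extract_lsb(secret)        # == b"top secret"
```

Because each cover byte changes by at most 1, the perturbation is imperceptible in typical image or audio samples, which is what hides the presence of the communication.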

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 282
41268 Hybrid Approach for Country’s Performance Evaluation

Authors: C. Slim

Abstract:

This paper presents an integrated model, which hybridizes data envelopment analysis (DEA) and support vector machines (SVM), to classify countries according to their efficiency and performance. This model takes into account aspects of multi-dimensional indicators, decision-making hierarchy, and relativity of measurement. Starting from a set of performance indicators as exhaustive as possible, a process of successive aggregations has been developed to attain an overall evaluation of a country’s competitiveness.

Keywords: artificial neural networks (ANN), support vector machine (SVM), data envelopment analysis (DEA), aggregations, indicators of performance

Procedia PDF Downloads 332
41267 Management of Renal Malignancies with IVC Thrombus: Our Experience

Authors: Sujeet Poudyal

Abstract:

Introduction: Renal cell carcinoma is the most common malignancy associated with inferior vena cava (IVC) thrombosis. Radical nephrectomy with tumor thrombectomy provides durable cancer-free survival. Other renal malignancies, such as Wilms’ tumors, are also associated with IVC thrombus. We describe our experience with the management of renal malignancies associated with IVC thrombus. Methods: This prospective study included 28 patients undergoing surgery for renal malignancies associated with IVC thrombus from February 2017 to March 2023. Patient demographics, type of renal malignancy, level of IVC thrombus, intraoperative details, need for venovenous bypass or cardiopulmonary bypass, and postoperative outcomes were all documented. Results: Of the 28 patients, 24 had clear cell renal cell carcinoma, 1 had renal osteosarcoma, and 3 had Wilms tumor. The levels of thrombus were II in eight, III in seven, and IV in six patients. The mean age was 62.81±10.2 years for RCC, 26 years for renal osteosarcoma, and 23 years for Wilms tumor. Venovenous bypass was needed in four patients and cardiopulmonary bypass in four patients. The postoperative period was uneventful in most cases, except for two mortalities: one in Level III due to pneumonia and one in Level IV due to sepsis. All cases followed up so far have shown no local recurrence or metastasis, except one case of RCC with Level IV IVC thrombus, which presented with paraaortic nodal recurrence and is currently managed with sunitinib. Conclusion: The complexity of managing renal malignancy with IVC thrombus increases with the level of the thrombus. As radical nephrectomy with tumor thrombectomy provides durable cancer-free survival in most cases, the surgery should be undertaken in an expert and experienced setup with strong cardiovascular backup to minimize the morbidity and mortality associated with the procedure.

Keywords: renal malignancy, IVC thrombus, radical nephrectomy with tumor thrombectomy, renal cell carcinoma

Procedia PDF Downloads 57