Search results for: algorithms and data structure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 32317

30517 Effect of Air Gap Distance on the Structure of PVDF Hollow Fiber Membrane Contactors for Physical CO2 Absorption

Authors: J. Shiri, A. Mansourizadeh, F. Faghih, H. Vaez

Abstract:

In this study, porous polyvinylidene fluoride (PVDF) hollow fiber membranes were fabricated via a wet phase-inversion process and used in a gas–liquid membrane contactor for physical CO2 absorption. The effect of different air gaps on the structure and CO2 flux of the membrane was investigated. The hollow fibers were prepared by wet spinning from a dope solution containing PVDF/NMP/LiCl (18%, 78%, 4%) at an extrusion rate of 4.5 ml/min and air gaps of 0, 7, and 15 cm. Water was used as both the internal and external coagulant. The membranes were characterized using various techniques, such as field emission scanning electron microscopy (FESEM), gas permeation tests, and critical water entry pressure (CEPw), to select the best membrane structure for CO2 absorption. The characterization results showed that as the air gap increased, the CEPw and N2 permeation decreased, while the surface porosity and CO2 absorption increased. The prepared membranes possessed small pore sizes with high surface porosity and wetting resistance, which are favorable for gas absorption applications.

Keywords: porous PVDF hollow fiber membrane, CO2 absorption, phase inversion, air gap

Procedia PDF Downloads 396
30516 Factor Structure of the University of California, Los Angeles (UCLA) Loneliness Scale: Gender, Age, and Marital Status Differences

Authors: Hamzeh Dodeen

Abstract:

This study examines the effect of item wording on the factor structure of the University of California, Los Angeles (UCLA) Loneliness Scale, along with gender, age, and marital status differences. A total of 2374 persons from the UAE participated, representing six different populations (teenagers/elderly, males/females, and married/unmarried). The results of exploratory factor analysis using principal axis factoring with (oblique) rotation revealed that two factors were extracted from the 20 items of the scale. The nine positively worded items loaded highly on the first factor, while 10 out of the 11 negatively worded items loaded highly on the second factor. The two-factor solution was confirmed in the six different populations based on age, gender, and marital status. It is concluded that ratings on the UCLA scale are affected by a response style related to item wording.
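For illustration only, the sketch below shows how a two-factor principal axis factoring with oblique rotation can be run in Python with the factor_analyzer package; the randomly generated item responses are placeholders, not the study's data.

```python
# Illustrative sketch: principal axis factoring with oblique (oblimin) rotation
# on 20 scale items. The item responses below are random placeholders.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(0)
items = pd.DataFrame(rng.normal(size=(300, 20)),
                     columns=[f"item{i}" for i in range(1, 21)])

fa = FactorAnalyzer(n_factors=2, method="principal", rotation="oblimin")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=["factor1", "factor2"])
print(loadings.round(2))           # which items load on which factor
print(fa.get_factor_variance())    # variance explained by each factor
```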

Keywords: UCLA Loneliness Scale, loneliness, positively worded items, factor structure, negatively worded items

Procedia PDF Downloads 356
30515 The Economic Limitations of Defining Data Ownership Rights

Authors: Kacper Tomasz Kröber-Mulawa

Abstract:

This paper addresses the topic of data ownership from an economic perspective and provides examples of economic limitations of data property rights, identified using the methods and approaches of the economic analysis of law. To build a proper background for the economic focus, a short overview of data and data ownership in the EU's legal system is first provided. It includes a short introduction to their political and social importance and highlights relevant viewpoints. This stresses the importance of a Single Market for data, but also of far-reaching regulation of data governance and privacy (including the distinction between personal and non-personal data, and between data held by public bodies and by private businesses). The main discussion of the paper builds upon this legal basis and the methods and approaches of the economic analysis of law.

Keywords: antitrust, data, data ownership, digital economy, property rights

Procedia PDF Downloads 89
30514 Protecting the Cloud Computing Data Through the Data Backups

Authors: Abdullah Alsaeed

Abstract:

Virtualized computing and cloud computing infrastructures are no longer buzzwords or marketing terms. They are a core reality in today’s corporate Information Technology (IT) organizations. Hence, developing effective and efficient methodologies for data backup and data recovery is needed more than ever. The purpose of data backup and recovery techniques is to help organizations develop their business continuity and disaster recovery strategies. To accomplish this strategic objective, a variety of mechanisms have been proposed in recent years. This research paper explores and examines the latest techniques and solutions for providing data backup and restoration for cloud computing platforms.

Keywords: data backup, data recovery, cloud computing, business continuity, disaster recovery, cost-effective, data encryption

Procedia PDF Downloads 95
30513 Molecular Characterization of Polyploid Bamboo (Dendrocalamus hamiltonii) Using Microsatellite Markers

Authors: Rajendra K. Meena, Maneesh S. Bhandari, Santan Barthwal, Harish S. Ginwal

Abstract:

Microsatellite markers are among the most valuable tools for the characterization of plant genetic resources and for population genetic analysis. Since they are codominant, allelic markers, their utility for polyploid species has remained doubtful. In such cases, microsatellite markers are usually analyzed by treating them as dominant markers. The current study shows that, despite losing the advantage of codominance, microsatellite markers are still a powerful tool for genotyping polyploid species because of the large number of reproducible alleles available per locus. This was studied by genotyping 19 subpopulations of Dendrocalamus hamiltonii (a hexaploid bamboo species) with 17 polymorphic simple sequence repeat (SSR) primer pairs. Among these, ten primers gave the typical banding pattern of microsatellite markers expected in diploid species, but the remaining seven gave an unusual pattern, i.e., more than two bands per locus per genotype. In such cases, genotyping data are generally analyzed as dominant markers. In the current study, the data were analyzed both ways, as dominant and as codominant markers. All 17 primers were first scored as non-allelic data and analyzed; later, the ten primers giving standard banding patterns were analyzed as allelic data, and the results were compared. The UPGMA clustering and genetic structure showed that the results obtained with both data sets are very similar, with slight variation, and therefore SSR markers can be used to characterize polyploid species by treating them as dominant markers. The study widens the scope for SSR marker applications and is beneficial to researchers dealing with polyploid species.
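As an illustration of the clustering step described above, a UPGMA dendrogram can be built from a dominant (0/1 band presence) matrix with SciPy, as sketched below; the genotype-by-band matrix is a made-up placeholder, not the bamboo data set.

```python
# Minimal sketch of UPGMA clustering of dominant (0/1 band presence) SSR data.
import numpy as np
import matplotlib.pyplot as plt
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

# rows = genotypes, columns = scored SSR bands (1 = present, 0 = absent); placeholder data
bands = np.array([[1, 0, 1, 1, 0],
                  [1, 0, 1, 0, 0],
                  [0, 1, 0, 1, 1],
                  [0, 1, 1, 1, 1]])

dist = pdist(bands, metric="jaccard")     # pairwise dissimilarity between genotypes
tree = linkage(dist, method="average")    # average linkage = UPGMA
dendrogram(tree, labels=["g1", "g2", "g3", "g4"])
plt.show()
```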

Keywords: microsatellite markers, Dendrocalamus hamiltonii, dominant and codominant, polyploids

Procedia PDF Downloads 149
30512 Chemometric Estimation of Inhibitory Activity of Benzimidazole Derivatives by Linear Least Squares and Artificial Neural Networks Modelling

Authors: Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević, Lidija R. Jevrić, Stela Jokić

Abstract:

The subject of this paper is to correlate the antibacterial behavior of benzimidazole derivatives with their molecular characteristics using a chemometric QSAR (Quantitative Structure–Activity Relationship) approach. QSAR analysis was carried out on the inhibitory activity of benzimidazole derivatives against Staphylococcus aureus. The data were processed by linear least squares (LLS) and artificial neural network (ANN) procedures. The LLS mathematical models were developed as calibration models for prediction of the inhibitory activity. The quality of the models was validated by the leave-one-out (LOO) technique and by using an external data set. The high agreement between experimental and predicted inhibitory activities indicated the good quality of the derived models. These results are part of the CMST COST Action No. CM1306 "Understanding Movement and Mechanism in Molecular Machines".
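A minimal sketch of an LLS calibration model with leave-one-out validation is given below; the descriptor matrix and activity values are synthetic placeholders, not the benzimidazole data, and the paper's actual modelling may differ.

```python
# Hedged sketch: linear least squares QSAR calibration validated by leave-one-out.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.random((20, 3))                                   # molecular descriptors (placeholder)
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.3 + 0.05 * rng.normal(size=20)  # activity (placeholder)

model = LinearRegression().fit(X, y)
y_loo = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
print("R2 (calibration):", round(r2_score(y, model.predict(X)), 3))
print("Q2 (leave-one-out):", round(r2_score(y, y_loo), 3))
```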

Keywords: antibacterial, benzimidazoles, chemometric, QSAR

Procedia PDF Downloads 321
30511 Fraud Detection in Credit Cards with Machine Learning

Authors: Anjali Chouksey, Riya Nimje, Jahanvi Saraf

Abstract:

Online transactions have increased dramatically in this new ‘social-distancing’ era, and with them, fraud in online payments has also increased significantly. Fraud is a significant problem in various industries such as insurance and banking. These frauds include leaking sensitive credit card information, which can easily be misused. With governments also pushing online transactions, e-commerce is booming, but because of increasing fraud in online payments, e-commerce businesses are suffering a great loss of trust from their customers. These companies find credit card fraud to be a serious problem. People have started using online payment options and are thus becoming easy targets of credit card fraud. In this research paper, we discuss machine learning algorithms. We apply a decision tree, XGBoost, k-nearest neighbour, logistic regression, random forest, and SVM to a dataset of online credit card transactions. We test all these algorithms for detecting fraud cases using the confusion matrix and F1 score and calculate the accuracy score for each model to identify which algorithm is best suited to detecting fraud.
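The comparison described above can be sketched as follows: several of the named classifiers are trained on a synthetic, imbalanced dataset and scored with the confusion matrix, F1, and accuracy. The data and settings are placeholders, not the study's transaction set, and XGBoost is noted only as an optional add-on.

```python
# Illustrative comparison of classifiers on a synthetic, imbalanced "fraud" dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix, f1_score, accuracy_score

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.97, 0.03],
                           random_state=0)                 # ~3% minority ("fraud") class
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "k-nearest neighbour": KNeighborsClassifier(),
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(random_state=0),
    "svm": SVC(),
}  # xgboost.XGBClassifier could be added here if the xgboost package is installed

for name, clf in models.items():
    y_pred = clf.fit(X_tr, y_tr).predict(X_te)
    print(name, "accuracy=%.3f  F1=%.3f" % (accuracy_score(y_te, y_pred),
                                            f1_score(y_te, y_pred)))
    print(confusion_matrix(y_te, y_pred))
```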

Keywords: machine learning, fraud detection, artificial intelligence, decision tree, k nearest neighbour, random forest, XGBOOST, logistic regression, support vector machine

Procedia PDF Downloads 153
30510 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies

Authors: Monica Lia

Abstract:

This article presents a customer data analysis model that uses business intelligence tools for data modelling, transformation, data visualization, and the building of dynamic reports. The analysis of an organization's customers is based on information from the transactional systems of the organization. The paper presents how to develop the data model starting from the data that companies hold in their own operational systems. The data can be transformed into useful information about customers using business intelligence tools. In a mature market, understanding the information hidden in the data and making forecasts for strategic decisions become more important. Business intelligence tools are used in business organizations as support for decision-making.

Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, dynamic dashboards, use case diagram, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes

Procedia PDF Downloads 437
30509 Optimized Control of Roll Stability of Missile using Genetic Algorithm

Authors: Pham Van Hung, Nguyen Trong Hieu, Le Quoc Dinh, Nguyen Kiem Chien, Le Dinh Hieu

Abstract:

The article focuses on automatic flight control of missiles during operation. The quality standards and characteristics of missile operation are very strict, requiring high stability and an accurate response to commands over a relatively wide operating range. The study analyzes a linear transfer function model of the missile roll channel to facilitate the development of the control system. A two-loop control structure for the roll channel is proposed, with the inner loop controlling the roll rate and the outer loop controlling the roll angle. To determine the optimal control parameters, a genetic algorithm is applied. The study uses MATLAB simulation software to implement the genetic algorithm and evaluate the quality of the closed-loop system. The results show that the system achieves better quality than the original structure and is simple, reliable, and ready for implementation in practical experiments.
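The study implements its genetic algorithm in MATLAB; purely as an illustration of the idea, the sketch below uses a simple genetic algorithm in Python to tune PI gains for a hypothetical first-order roll-rate plant. The plant model G(s) = 10/(s + 2), the gain bounds, and the GA settings are assumptions, not the paper's missile model.

```python
# Hedged sketch: genetic algorithm tuning of PI gains for a hypothetical roll-rate plant.
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
t = np.linspace(0, 5, 500)

def ise(gains):
    """Integral of squared error of the closed-loop unit step response."""
    kp, ki = gains
    # closed loop of C(s) = kp + ki/s with G(s) = 10/(s + 2):
    # numerator 10(kp*s + ki), denominator s(s + 2) + 10(kp*s + ki)
    num = np.polymul([kp, ki], [10.0])
    den = np.polyadd(np.polymul([1.0, 0.0], [1.0, 2.0]), num)
    _, y = signal.step(signal.TransferFunction(num, den), T=t)
    return float(np.sum((1.0 - y) ** 2) * (t[1] - t[0]))

pop = rng.uniform(0.01, 5.0, size=(40, 2))                # initial population of [kp, ki]
for _ in range(50):
    fitness = np.array([ise(p) for p in pop])
    best = pop[np.argsort(fitness)[:20]]                  # truncation selection
    parents = best[rng.integers(0, 20, size=(40, 2))]
    children = (parents[:, 0] + parents[:, 1]) / 2.0      # arithmetic crossover
    children += rng.normal(0.0, 0.1, size=children.shape) # Gaussian mutation
    pop = np.clip(children, 0.01, 5.0)

print("best [kp, ki]:", pop[np.argmin([ise(p) for p in pop])])
```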

Keywords: genetic algorithm, roll channel, two-loop control structure, missile

Procedia PDF Downloads 94
30508 Gas-Liquid Two Phase Flow Phenomenon in Near Horizontal Upward and Downward Inclined Pipe Orientations

Authors: Afshin J. Ghajar, Swanand M. Bhagwat

Abstract:

The main purpose of this work is to experimentally investigate the effect of pipe orientation on two phase flow phenomena. Flow pattern, void fraction, and two phase pressure drop are measured in a polycarbonate pipe with an inside diameter of 12.7 mm for inclination angles ranging from -20° to +20°, using an air-water fluid combination. The experimental data cover all flow patterns and the entire range of void fraction typically observed in two phase flow. The effect of pipe orientation on void fraction and two phase pressure drop is explained with reference to the change in flow structure and two phase flow behavior. In addition, the top performing void fraction and two phase pressure drop correlations available in the literature are presented, and their performance is assessed against the experimental data of the present study and those available in the literature.

Keywords: flow patterns, inclined two phase flow, pressure drop, void fraction

Procedia PDF Downloads 686
30507 Cluster Analysis and Benchmarking for Performance Optimization of a Pyrochlore Processing Unit

Authors: Ana C. R. P. Ferreira, Adriano H. P. Pereira

Abstract:

Given the frequent variation of mineral properties throughout the Araxá pyrochlore deposit, an operation with highly variable quality and performance is expected even if good homogenization work has been carried out before feeding the processing plants. These results could be improved and standardized if the blend composition parameters that most influence the processing route were determined and the types of raw material were then grouped by them, finally providing a reference with operational settings for each group. Associating the physical and chemical parameters of a unit operation, through benchmarking or an optimal reference of metallurgical recovery and product quality, results in reduced production costs, optimized use of the mineral resource, and greater stability in the subsequent processes of the production chain that uses the mineral of interest. Conducting a comprehensive exploratory data analysis to identify which characteristics of the ore are most relevant to the process route, combined with the use of machine learning algorithms for grouping the raw material (ore) and associating these groups with reference variables in the process benchmark, is a reasonable alternative for the standardization and improvement of mineral processing units. Clustering methods based on decision trees and K-Means were employed, together with algorithms based on benchmarking theory, with criteria defined by the process team, in order to reference the best adjustments for processing the ore piles of each cluster. A clean user interface was created to present the outputs of the algorithm. The results were measured through the average time for adjustment and stabilization of the process after a new pile of homogenized ore enters the plant, as well as the average time needed to achieve the best processing result. Direct gains in the metallurgical recovery of the process were also measured. The results were promising, with a reduction in the adjustment and stabilization time when starting to process a new ore pile, as well as attainment of the benchmark. Also noteworthy are the gains in metallurgical recovery, which reflect a significant saving in ore consumption and a consequent reduction in production costs, hence a more rational use of the tailings dams and an optimized life of the mineral deposit.
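As a toy illustration of the grouping step, the sketch below clusters hypothetical ore-pile compositions with K-Means; the feature names and values are invented placeholders, not the Araxá plant data, and the real work also used decision-tree-based clustering and benchmarking rules.

```python
# Illustrative sketch: grouping homogenized ore piles by blend composition with K-Means.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# hypothetical blend-composition parameters per ore pile
piles = pd.DataFrame({
    "Nb2O5_pct": [0.71, 0.80, 0.65, 0.90, 0.77, 0.62],
    "P2O5_pct":  [4.1, 3.8, 5.0, 3.2, 4.4, 5.3],
    "Fe_pct":    [28.0, 25.5, 30.1, 24.0, 27.2, 31.0],
})

X = StandardScaler().fit_transform(piles)              # put features on a common scale
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
piles["cluster"] = kmeans.labels_
print(piles)   # each cluster would then be mapped to benchmark operational settings
```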

Keywords: mineral clustering, machine learning, process optimization, pyrochlore processing

Procedia PDF Downloads 149
30506 An Intelligent Search and Retrieval System for Mining Clinical Data Repositories Based on Computational Imaging Markers and Genomic Expression Signatures for Investigative Research and Decision Support

Authors: David J. Foran, Nhan Do, Samuel Ajjarapu, Wenjin Chen, Tahsin Kurc, Joel H. Saltz

Abstract:

The large-scale data and computational requirements of investigators throughout the clinical and research communities demand an informatics infrastructure that supports both existing and new investigative and translational projects in a robust, secure environment. In some subspecialties of medicine and research, the capacity to generate data has outpaced the methods and technology used to aggregate, organize, access, and reliably retrieve this information. Leading health care centers now recognize the utility of establishing an enterprise-wide clinical data warehouse. The primary benefits that can be realized through such efforts include cost savings, efficient tracking of outcomes, advanced clinical decision support, improved prognostic accuracy, and more reliable clinical trial matching. The overarching objective of the work presented here is the development and implementation of a flexible Intelligent Retrieval and Interrogation System (IRIS) that exploits the combined use of computational imaging, genomics, and data-mining capabilities to facilitate clinical assessments and translational research in oncology. The proposed System includes a multi-modal Clinical & Research Data Warehouse (CRDW) that is tightly integrated with a suite of computational and machine-learning tools to provide insight into underlying tumor characteristics that are not apparent by human inspection alone. A key distinguishing feature of the System is a configurable Extract, Transform and Load (ETL) interface that enables it to adapt to different clinical and research data environments. This project is motivated by the growing emphasis on establishing Learning Health Systems in which cyclical hypothesis generation and evidence evaluation become integral to improving the quality of patient care. To facilitate iterative prototyping and optimization of the algorithms and workflows for the System, the team has already implemented a fully functional Warehouse that can reliably aggregate information originating from multiple data sources, including EHRs, Clinical Trial Management Systems, Tumor Registries, Biospecimen Repositories, Radiology PACS, Digital Pathology archives, Unstructured Clinical Documents, and Next Generation Sequencing services. The System enables physicians to systematically mine and review the molecular, genomic, image-based, and correlated clinical information about patient tumors, individually or as part of large cohorts, to identify patterns that may influence treatment decisions and outcomes. The CRDW core system has facilitated peer-reviewed publications and funded projects, including an NIH-sponsored collaboration to enhance the cancer registries in Georgia, Kentucky, New Jersey, and New York with machine-learning-based classifications and quantitative pathomics feature sets. The CRDW has also resulted in a collaboration with the Massachusetts Veterans Epidemiology Research and Information Center (MAVERIC) at the U.S. Department of Veterans Affairs to develop algorithms and workflows to automate the analysis of lung adenocarcinoma. Those studies showed that combining computational nuclear signatures with traditional WHO criteria through the use of deep convolutional neural networks (CNNs) led to improved discrimination among tumor growth patterns. The team has also leveraged the Warehouse to support studies investigating the potential of utilizing a combination of genomic and computational imaging signatures to characterize prostate cancer. The results of those studies show that integrating image biomarkers with genomic pathway scores is more strongly correlated with disease recurrence than using standard clinical markers.

Keywords: clinical data warehouse, decision support, data-mining, intelligent databases, machine-learning

Procedia PDF Downloads 136
30505 Sync Consensus Algorithm: Trying to Reach an Agreement at Full Speed

Authors: Yuri Zinchenko

Abstract:

Recently, distributed storage systems have been used more and more in various aspects of everyday life. They provide necessary properties such as scalability, fault tolerance, durability, and others. At the same time, fast as well as reliable data storage remains one of the most pressing issues in this area. That brings us to the consensus algorithm as one of the most important components, one with a great impact on the functionality of a distributed system. This paper is the result of an analysis of several well-known consensus algorithms, such as Paxos and Raft. The algorithm it offers, called Sync, promotes, but does not insist on, simultaneous writes to the nodes (which positively affects the overall writing speed) and tries to minimize the system's inactive time. This allows nodes to reach agreement on the system state in a shorter period, which is a critical factor for distributed systems. When developing Sync, much attention was also paid to criteria such as simplicity and intuitiveness, the importance of which is difficult to overestimate.

Keywords: sync, consensus algorithm, distributed system, leader-based, synchronization

Procedia PDF Downloads 67
30504 Optimal and Critical Path Analysis of State Transportation Network Using Neo4J

Authors: Pallavi Bhogaram, Xiaolong Wu, Min He, Onyedikachi Okenwa

Abstract:

A transportation network is a realization of a spatial network, describing a structure that permits either vehicular movement or the flow of some commodity. Examples include road networks, railways, air routes, pipelines, and many more. The transportation network plays a vital role in maintaining the vigor of the nation’s economy. Hence, ensuring that the network stays resilient at all times, especially in the face of challenges such as heavy traffic loads and large-scale natural disasters, is of utmost importance. In this paper, we used the Neo4j application to develop the graph. Neo4j is the world's leading open-source NoSQL native graph database, implementing an ACID-compliant transactional backend for applications. The Southern California network model was developed using the Neo4j application, and the most critical and optimal nodes and paths in the network were obtained using centrality algorithms. The edge betweenness centrality algorithm calculates the critical or optimal paths using Yen's k-shortest paths algorithm, and the node betweenness centrality algorithm calculates the amount of influence a node has over the network. The preliminary study results confirm that the Neo4j application can be a suitable tool for studying the important nodes and critical paths of a major congested metropolitan area.
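The study runs these algorithms inside Neo4j; as an illustration of the same ideas outside the database, the sketch below computes node and edge betweenness centrality and Yen-style k-shortest paths on a toy road network with networkx. The nodes, edges, and weights are hypothetical.

```python
# Illustrative sketch: betweenness centrality and k-shortest paths on a toy road network.
import itertools
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([                  # hypothetical road segments (weight = miles)
    ("LA", "Anaheim", 26), ("LA", "Pasadena", 11), ("Anaheim", "Irvine", 15),
    ("Pasadena", "Riverside", 48), ("Irvine", "Riverside", 38),
    ("Riverside", "SanBernardino", 12), ("Irvine", "SanDiego", 82),
])

node_bc = nx.betweenness_centrality(G, weight="weight")
edge_bc = nx.edge_betweenness_centrality(G, weight="weight")
print("most influential node:", max(node_bc, key=node_bc.get))
print("most critical edge:", max(edge_bc, key=edge_bc.get))

# three shortest simple paths between two nodes (networkx uses Yen's algorithm here)
for path in itertools.islice(nx.shortest_simple_paths(G, "LA", "SanDiego",
                                                      weight="weight"), 3):
    print(path)
```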

Keywords: critical path, transportation network, connectivity reliability, network model, Neo4j application, edge betweenness centrality index

Procedia PDF Downloads 140
30503 Design of a Small and Medium Enterprise Growth Prediction Model Based on Web Mining

Authors: Yiea Funk Te, Daniel Mueller, Irena Pletikosa Cvijikj

Abstract:

Small and medium enterprises (SMEs) play an important role in the economy of many countries. When the overall world economy is considered, SMEs represent 95% of all businesses in the world, accounting for 66% of total employment. Existing studies show that the current business environment is highly turbulent and strongly influenced by modern information and communication technologies, forcing SMEs to face more severe challenges in maintaining their existence and expanding their business. To support SMEs in improving their competitiveness, researchers have recently turned their focus to applying data mining techniques to build risk and growth prediction models. However, the data used to assess risk and growth indicators is primarily obtained via questionnaires, which is very laborious and time-consuming, or is provided by financial institutes and is thus highly sensitive to privacy issues. Recently, web mining (WM) has emerged as a new approach towards obtaining valuable insights in the business world. WM enables automatic and large-scale collection and analysis of potentially valuable data from various online platforms, including companies’ websites. While WM methods have been frequently studied to anticipate the growth of sales volume for e-commerce platforms, their application to the assessment of SME risk and growth indicators is still scarce. Considering that a vast proportion of SMEs own a website, WM bears great potential in revealing valuable information hidden in SME websites, which can further be used to understand SME risk and growth indicators, as well as to enhance current SME risk and growth prediction models. This study aims at developing an automated system to collect business-relevant data from the Web and predict future growth trends of SMEs by means of WM and data mining techniques. The envisioned system should serve as an 'early recognition system' for future growth opportunities. In an initial step, we examine how structured and semi-structured Web data in governmental or SME websites can be used to explain the success of SMEs. WM methods are applied to extract Web data in the form of additional input features for the growth prediction model. The data on SMEs provided by a large Swiss insurance company is used as ground truth data (i.e. growth-labeled data) to train the growth prediction model. Different machine learning classification algorithms, such as the Support Vector Machine, Random Forest, and Artificial Neural Network, are applied and compared, with the goal of optimizing the prediction performance. The results are compared to those from previous studies in order to assess the contribution of growth indicators retrieved from the Web to increasing the predictive power of the model.
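A minimal sketch of the web-mining step might look like the following: a company website is fetched and a few simple numeric features are derived from it. The URL and feature choices are illustrative assumptions, not the study's actual pipeline.

```python
# Hedged sketch: deriving simple numeric features from an SME website for a growth model.
import requests
from bs4 import BeautifulSoup

def website_features(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    text = soup.get_text(" ", strip=True)
    return {
        "n_words": len(text.split()),
        "n_links": len(soup.find_all("a")),
        "n_images": len(soup.find_all("img")),
        "mentions_shop": int("shop" in text.lower() or "store" in text.lower()),
    }

print(website_features("https://www.example.com"))  # placeholder URL
# rows like this would be joined with the insurer's growth labels to train a classifier
```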

Keywords: data mining, SME growth, success factors, web mining

Procedia PDF Downloads 271
30502 Emotion Detection in Twitter Messages Using Combination of Long Short-Term Memory and Convolutional Deep Neural Networks

Authors: Bahareh Golchin, Nooshin Riahi

Abstract:

One of the issues that has received a lot of attention in recent years is recognizing sentiments and emotions in social media texts. The analysis of sentiments and emotions is intended to recognize conceptual information such as the opinions, feelings, attitudes, and emotions of people towards products, services, organizations, people, topics, events, and features in written text. This indicates the size of the problem space. In the real world, businesses and organizations are always looking for tools to gather the ideas, emotions, and opinions of people about their products, services, or events. This article uses the Twitter social network, one of the most popular social networks with about 420 million active users, to extract data. Using this social network, users can share their information and opinions about personal issues, policies, products, events, etc. Its data availability makes it suitable for the classification of emotional states. In this study, supervised learning and deep neural network algorithms are used to classify the emotional states of Twitter users. The use of deep learning methods to increase the learning capacity of the model is an advantage, given the large amount of available data. Tweets collected on various topics are classified into four classes using a combination of two bidirectional Long Short-Term Memory networks and a convolutional network. The results obtained from this study, with an average accuracy of 93%, show good performance of the proposed framework and improved accuracy compared to previous work.
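As a rough illustration of such an architecture, the sketch below stacks a convolutional layer with two bidirectional LSTM layers for four-class classification in Keras; the vocabulary size, sequence length, layer widths, and other hyperparameters are assumptions, not the configuration reported in the paper.

```python
# Hedged sketch: CNN + two BiLSTM layers for four-class emotion classification of tweets.
from tensorflow.keras import layers, models

vocab_size, max_len, n_classes = 20000, 50, 4

model = models.Sequential([
    layers.Input(shape=(max_len,)),                     # padded token-id sequences
    layers.Embedding(vocab_size, 128),
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
    layers.Bidirectional(layers.LSTM(32)),
    layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(padded_token_ids, emotion_labels, validation_split=0.1, epochs=5)
```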

Keywords: emotion classification, sentiment analysis, social networks, deep neural networks

Procedia PDF Downloads 143
30501 Gum Arabic-Coated Magnetic Nanoparticles for Methylene Blue Removal

Authors: Eman Alzahrani

Abstract:

Magnetic nanoparticles (MNPs) were fabricated using the chemical co-precipitation method, followed by coating the surface of the magnetic Fe3O4 nanoparticles with gum arabic (GA). The fabricated magnetic nanoparticles were characterised using transmission electron microscopy (TEM), which showed that the Fe3O4 and GA-MNP nanoparticles had mean diameters of 33 nm and 38 nm, respectively. Scanning electron microscopy (SEM) images showed that the MNPs modified with GA had a homogeneous structure and were agglomerated. The energy dispersive X-ray spectroscopy (EDAX) spectrum showed strong peaks of Fe and O. X-ray diffraction (XRD) patterns indicated that the naked magnetic nanoparticles were pure Fe3O4 with a spinel structure and that the coating of GA did not result in a phase change. The coating of GA on the magnetic nanoparticles was also studied by BET analysis and Fourier transform infrared spectroscopy. Moreover, the present study reports a fast and simple method for the removal and recovery of methylene blue (MB) dye from aqueous solutions using the synthesised magnetic nanoparticles modified with gum arabic as the adsorbent. The experimental results show that the adsorption process reaches equilibrium within five minutes. The data fit the Langmuir isotherm equation, and the maximum adsorption capacities were 8.77 mg mg-1 and 14.3 mg mg-1 for the MNPs and GA-MNPs, respectively. The results indicate that the homemade magnetic nanoparticles are quite efficient at removing MB and will be a promising adsorbent for the removal of harmful dyes from wastewater.
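The Langmuir fit mentioned above, q = qmax·K·Ce/(1 + K·Ce), can be reproduced with a short least-squares sketch; the equilibrium data points below are made up for demonstration and are not the study's measurements.

```python
# Hedged sketch: fitting the Langmuir isotherm to equilibrium adsorption data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, k):
    return qmax * k * ce / (1.0 + k * ce)

ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])   # equilibrium MB concentration (placeholder)
qe = np.array([3.1, 6.0, 8.9, 11.6, 13.2])    # adsorbed amount at equilibrium (placeholder)

(qmax, k), _ = curve_fit(langmuir, ce, qe, p0=[14.0, 0.1])
print(f"qmax = {qmax:.2f}, K = {k:.3f}")      # compare with the reported capacities
```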

Keywords: Fe3O4 magnetic nanoparticles, gum arabic, co-precipitation, adsorption dye, methylene blue, adsorption isotherm

Procedia PDF Downloads 437
30500 The Response of Mammal Populations to Abrupt Changes in Fire Regimes in Montane Landscapes of South-Eastern Australia

Authors: Jeremy Johnson, Craig Nitschke, Luke Kelly

Abstract:

Fire regimes, climate and topographic gradients interact to influence ecosystem structure and function across fire-prone montane landscapes worldwide. Biota have developed a range of adaptations to historic fire regime thresholds, which allow them to persist in these environments. In south-eastern Australia, a signal of fire regime change is emerging across these landscapes, and anthropogenic climate change is likely to be one of the main drivers of an increase in burnt area and more frequent wildfire over the last 25 years. This shift has the potential to modify vegetation structure and composition at broad scales, which may lead to landscape patterns to which biota are not adapted, increasing the likelihood of local extirpation of some mammal species. This study aimed to address concerns related to the influence of abrupt changes in fire regimes on mammal populations in montane landscapes. It first examined the impact of climate, topography, and vegetation on fire patterns and then explored the consequences of these changes for mammal populations and their habitats. Field studies were undertaken across diverse vegetation, fire severity and fire frequency gradients, utilising camera trapping and passive acoustic monitoring methodologies and the collection of fine-scale vegetation data. Results show that drought is a primary contributor to fire regime shifts at the landscape scale, while topographic factors have a variable influence on wildfire occurrence at finer scales. Frequent, high-severity wildfire influenced forest structure and composition at broad spatial scales, and at fine scales, it reduced the occurrence of hollow-bearing trees and promoted coarse woody debris. Mammals responded differently to shifts in forest structure and composition depending on their habitat requirements. This study highlights the complex interplay between fire regimes, environmental gradients, and biotic adaptations across temporal and spatial scales. It emphasizes the importance of understanding these complex interactions to effectively manage fire-prone ecosystems in the face of climate change.

Keywords: fire, ecology, biodiversity, landscape ecology

Procedia PDF Downloads 78
30499 Analysis and Design of Irregular Large Cantilever Structure of Statue

Authors: Pan Rui, Ma Jun, Zhao Caiqi, Wang Guangda

Abstract:

With the development of tourism and religion, more and more large statue structures are being built all over the world. For instance, the GuanYin statue with three planes in HaiNan province, China, reaches 108 meters in height. These statue structures are typical high-rise buildings. However, the geometry of statues is complicated, and the irregular shape makes the force analysis of these structures more complicated than that of standard tall buildings. In this paper, the Liu Bang Statue, which is located at XuZhou in China, is taken as a case study.

Keywords: large statue structure, special-shaped steel, GuanYin statue, China

Procedia PDF Downloads 398
30498 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions

Authors: K. Hardy, A. Maurushat

Abstract:

Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically relied on the rigorous methodologies of empirical studies by research institutes, as well as on less reliable immediate surveys/polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is not why people did this for the last 10 years, but why they are doing it now, whether this is undesirable, and how we can have an immediate impact to promote change. Big data analytics rely heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.

Keywords: big data, open data, productivity, data governance

Procedia PDF Downloads 375
30497 Influence of Existing Foundations on Soil-Structure Interaction of New Foundations in a Reconstruction Project

Authors: Kanagarajah Ravishankar

Abstract:

This paper describes a study performed for a project featuring an elevated steel bridge structure supported by various types of foundation systems. The project focused on the rehabilitation or redesign of a portion of the bridge substructures founded on caisson foundations. The part of the study this paper focuses on is the evaluation of foundation and soil stiffnesses and the interaction between the existing caissons and the proposed foundations. The caisson foundations are founded on top of rock, where the depth to the top of rock varies from approximately 50 to 140 feet below the ground surface. Based on a comprehensive investigation of the existing piers and caissons, the presence of ASR was suspected from observed whitish deposits on cracked surfaces as well as internal damage sustained through the entire depth of the foundation structures. Because of these defects, reuse of the existing piers and caissons was precluded and deemed unsuitable under earthquake conditions. The proposed design of the new foundations and substructures that was ultimately selected neglected the contribution from the existing caisson and pier columns. Due to the complicated configuration of the existing caissons and the proposed foundation system, the three-dimensional finite element method (FEM) was employed to evaluate soil-structure interaction (SSI), to evaluate the effect of the existing caissons on the proposed foundations, and to compare the results with a conventional group analysis. The FEM models include separate models for the existing caissons and the proposed foundations, as well as a model combining both.

Keywords: soil-structure interaction, foundation stiffness, finite element, seismic design

Procedia PDF Downloads 142
30496 Symbolic Computation via Grobner Basis

Authors: Haohao Wang

Abstract:

The purpose of this paper is to compute elimination ideals via Grobner bases. We first introduce the concept of Grobner bases and then provide computational algorithms and applications for curves and surfaces.
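A small worked example of an elimination ideal computed via a Grobner basis (shown here with SymPy purely for illustration, not as the paper's own algorithms): implicitizing the parametric curve x = t², y = t³ by eliminating t.

```python
# Worked example: elimination ideal via a lexicographic Groebner basis.
from sympy import symbols, groebner

t, x, y = symbols("t x y")
G = groebner([x - t**2, y - t**3], t, x, y, order="lex")

# with lex order t > x > y, basis elements free of t generate the elimination ideal
elimination = [g for g in G.exprs if t not in g.free_symbols]
print(elimination)   # expect the implicit equation of the curve, x**3 - y**2
```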

Keywords: curves, surfaces, Grobner basis, elimination

Procedia PDF Downloads 302
30495 Study of the Effect of Rotation on the Deformation of a Flexible Blade Rotor

Authors: Aref Maalej, Marwa Fakhfakh, Wael Ben Amira

Abstract:

We present in this work a numerical investigation of fluid-structure interaction to study the elastic behavior of flexible rotors. The principal aim is to determine the effect of the aero/hydrodynamic parameters on the bending deformation of flexible rotors. This study is performed using the strong two-way fluid-structure interaction (FSI) coupling available in the ANSYS Workbench software. This method couples the fluid solver to the transient structural solver to study the elastic behavior of flexible rotors in water. In this study, we use a moderately flexible rotor modeled by a single blade with a simplified rectangular geometry. We focus on the effect of the rotational frequency on the flapwise bending deformation. It is demonstrated that the blade deforms in the downstream direction and that the amplitude of these deformations increases with the rotational frequency. Also, beyond a critical frequency, the blade begins to deform in the upstream direction.

Keywords: numerical simulation, flexible blade, fluid-structure interaction, ANSYS workbench, flapwise deformation

Procedia PDF Downloads 92
30494 Optimization of Hate Speech and Abusive Language Detection on Indonesian-language Twitter using Genetic Algorithms

Authors: Rikson Gultom

Abstract:

Hate speech and abusive language on social media are difficult to detect; they are usually detected only after going viral in cyberspace, which is of course too late for prevention. An early detection system with reasonably good accuracy is needed so that it can reduce conflicts in society caused by social media postings that attack individuals, groups, and the government in Indonesia. The purpose of this study is to find an early detection model for Twitter social media, using machine learning, that has the highest accuracy among the methods studied. In this study, the support vector machine (SVM), Naïve Bayes (NB), and Random Forest Decision Tree (RFDT) methods were compared with the support vector machine with genetic algorithm (SVM-GA), Naïve Bayes with genetic algorithm (NB-GA), and Random Forest Decision Tree with genetic algorithm (RFDT-GA). The study produced a comparison table of the accuracy of the hate speech and abusive language detection models, presented as a graph of the accuracy of the six algorithms developed on the Indonesian-language Twitter dataset, and identified the best model with the highest accuracy.

Keywords: abusive language, hate speech, machine learning, optimization, social media

Procedia PDF Downloads 131
30493 Pushover Analysis of Reinforced Concrete Buildings Using Full Jacket Technics: A Case Study on an Existing Old Building in Madinah

Authors: Tarek M. Alguhane, Ayman H. Khalil, M. N. Fayed, Ayman M. Ismail

Abstract:

Retrofitting existing buildings to resist seismic loads is very important to avoid loss of life or financial disaster. The aim of retrofitting is to increase the total strength of the structure by increasing its stiffness or ductility ratio. In addition, the response modification factors (R) have to satisfy the code requirements for the suggested retrofitting types. In this study, two types of jackets are used, i.e., full reinforced concrete jackets and surrounding steel plate jackets. The study is carried out on an existing building in Madinah by performing static pushover analysis before and after retrofitting the columns. The selected model building is representative of typical structures built about 30 years ago in Madinah, KSA. A comparison of the results indicates a good enhancement of the structure with respect to the applied seismic forces. Also, the response modification factor of the RC building is evaluated for the studied cases before and after retrofitting. The design of all vertical elements (columns) is given. The results show that the design of the retrofitted columns satisfies the code's design stress requirements. However, for some retrofitting types, the ductility requirements represented by the response modification factor do not satisfy the KSA design code (SBC-301).

Keywords: concrete jackets, steel jackets, RC buildings, pushover analysis, non-Linear analysis

Procedia PDF Downloads 368
30492 A Review on Existing Challenges of Data Mining and Future Research Perspectives

Authors: Hema Bhardwaj, D. Srinivasa Rao

Abstract:

Technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets can be termed 'big data.' The techniques of big data mining and big data analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data raises special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This research paper offers an overview of the literature on big data mining and its process, along with its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. The study presents the ideas of data mining and analysis and the knowledge discovery techniques that have recently been developed, together with practical application systems. The article's conclusion also includes a list of issues and difficulties for further research in the area. The report discusses management's main big data and data mining challenges.

Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges

Procedia PDF Downloads 114
30491 Study of the Nanostructured Fe₅₀Cr₃₅Ni₁₅ Powder Alloy Developed by Mechanical Alloying

Authors: Salim Triaa, Fella Kali-Ali

Abstract:

Nanostructured Fe₅₀Cr₃₅Ni₁₅ alloys were prepared from pure elemental powders using high energy mechanical alloying. The powder mixtures obtained were characterized by several techniques. X-ray diffraction analysis revealed the formation of the Fe₁Cr₁ compound with a BCC structure after one hour of milling. A second compound, Fe₃Ni₂ with an FCC structure, was observed after 12 hours of milling. The crystallite size determined by the Williamson-Hall method was about 5.1 nm after 48 h of milling. SEM observations confirmed the growth of the crushed particles as a function of milling time, while the homogeneous distribution of the constituent elements in the powders was verified by EDX analysis.
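The Williamson-Hall analysis mentioned above amounts to fitting β·cosθ = Kλ/D + 4ε·sinθ to the XRD peak widths; the sketch below performs that fit on made-up peak data, since the actual peak positions and widths are not given in the abstract.

```python
# Hedged sketch: Williamson-Hall fit to estimate crystallite size D and microstrain.
import numpy as np

K, lam = 0.9, 0.15406                               # shape factor, Cu K-alpha wavelength (nm)
two_theta = np.array([44.6, 64.9, 82.3])            # hypothetical BCC peak positions (deg)
beta = np.deg2rad(np.array([1.60, 1.85, 2.10]))     # hypothetical FWHM (deg -> rad)

theta = np.deg2rad(two_theta) / 2.0
x = 4.0 * np.sin(theta)
y = beta * np.cos(theta)

strain, intercept = np.polyfit(x, y, 1)             # slope = strain, intercept = K*lam/D
D = K * lam / intercept
print(f"crystallite size D = {D:.1f} nm, microstrain = {strain:.4f}")
```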

Keywords: Fe-Cr-Ni alloy, mechanical alloying, nanostructure, SEM, XRD

Procedia PDF Downloads 180
30490 An Automatic Feature Extraction Technique for 2D Punch Shapes

Authors: Awais Ahmad Khan, Emad Abouel Nasr, H. M. A. Hussein, Abdulrahman Al-Ahmari

Abstract:

Sheet-metal parts have been widely applied in the electronics, communication, and mechanical industries in recent decades, but advancement in sheet-metal part design and manufacturing still lags behind the increasing importance of sheet-metal parts in modern industry. This paper presents a methodology for the automatic extraction of some common 2D internal sheet metal features. The features used in this study are taken from the Unipunch™ catalogue. The extraction process starts with data extraction from the STEP file using an object-oriented approach, and with the application of suitable algorithms and rules, all features contained in the catalogue are automatically extracted. Since the extracted features include geometry and engineering information, they are effective for downstream applications such as feature rebuilding and process planning.
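As a highly simplified illustration (not the paper's object-oriented methodology), the sketch below scans STEP (ISO 10303-21) entity records for CIRCLE entities as candidate circular punch features; the sample records are hypothetical.

```python
# Hedged sketch: pulling CIRCLE entities out of STEP text as candidate circular features.
import re

step_text = """
#55 = CIRCLE ( 'NONE', #56, 6.5 ) ;
#91 = CIRCLE ( 'NONE', #92, 12.0 ) ;
#120 = LINE ( 'NONE', #121, #122 ) ;
"""  # placeholder for open("part.stp").read()

circle = re.compile(r"#(\d+)\s*=\s*CIRCLE\s*\(\s*'[^']*'\s*,\s*#\d+\s*,\s*([\d.]+)\s*\)")
for entity_id, radius in circle.findall(step_text):
    print(f"entity #{entity_id}: circular feature, radius {float(radius)}")
```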

Keywords: feature extraction, internal features, punch shapes, sheet metal

Procedia PDF Downloads 622
30489 Design and Fabrication of a Smart Quadruped Robot

Authors: Shivani Verma, Amit Agrawal, Pankaj Kumar Meena, Ashish B. Deoghare

Abstract:

Over the past decade, robotics has been a major area of interest among researchers and scientists for reducing human effort. The need for robots to replace human work in dangerous fields such as underground mining, nuclear power stations, and operations against terrorist attacks has gained huge attention. Most robot designs are based on the human structure and are popularly known as humanoid robots. However, the problems encountered with humanoid robots include low speed of movement, structural imbalance, poor load carrying capacity, etc. The simplification and adaptation of the fundamental design principles seen in animals have led to the creation of bio-inspired robots, but the major challenges observed in naturally inspired robots include complexity of structure, several degrees of freedom, and energy storage problems. The present work focuses on the design and fabrication of a bionic quadruped walking robot whose legs are based on the joints of quadruped mammals such as the dog and cheetah. The design focuses on the structure of the robot body, which consists of four legs having three degrees of freedom per leg, and on the electronics system involved. The robot is built using readily available plastics and metals. The proposed robot is simple in construction and is able to move through uneven terrain, detect and locate obstacles, and take images while carrying additional loads, which may include hardware and sensors. The robot will find possible applications in the artificial intelligence sector.

Keywords: artificial intelligence, bionic, quadruped robot, degree of freedom

Procedia PDF Downloads 220
30488 Ground-Structure Interaction Analysis of Aged Tunnels

Authors: Behrang Dadfar, Hossein Bidhendi, Jimmy Susetyo, John Paul Abbatangelo

Abstract:

Finding the structural demand under the various conditions that a structure may experience during its service life is an important step towards structural life-cycle analysis. In this paper, the structural demand for the precast concrete tunnel lining (PCTL) segments of Toronto's 60-year-old subway tunnels is investigated. Numerical modelling was conducted using FLAC3D, a finite-difference-based software capable of simulating ground-structure interaction and the flow of ground material in three dimensions. The specific structural details of the segmental tunnel lining, such as the convex shape of the PCTL segments at radial joints and the PCTL segment pockets, were considered in the numerical modelling. The model was also developed to accommodate the flexibility required for simulating the various deterioration scenarios, shapes, and patterns that have been observed over more than 20 years. The soil behavior was simulated using the plastic-hardening constitutive model of FLAC3D. The effects of the tunnel depth, the coefficient of lateral earth pressure, and the deterioration patterns of the segments were studied. The structural capacity under various deterioration patterns and the existing loading conditions was evaluated using axial-flexural interaction curves developed for each deterioration pattern. The results were used to provide recommendations for the next phase of the tunnel lining rehabilitation program.

Keywords: precast concrete tunnel lining, ground-structure interaction, numerical modelling, deterioration, tunnels

Procedia PDF Downloads 169