Search results for: data mining applications and discovery
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30893

27833 Systematic Identification of Noncoding Cancer Driver Somatic Mutations

Authors: Zohar Manber, Ran Elkon

Abstract:

Accumulation of somatic mutations (SMs) in the genome is a major driving force of cancer development. Most SMs in the tumor's genome are functionally neutral; however, some damage critical processes and provide the tumor with a selective growth advantage (termed cancer driver mutations). Current research on the functional significance of SMs is mainly focused on finding alterations in protein-coding sequences. However, the exome comprises only 3% of the human genome, and thus, SMs in the noncoding genome significantly outnumber those that map to protein-coding regions. Although our understanding of noncoding driver SMs is very rudimentary, it is likely that disruption of regulatory elements in the genome is an important, yet largely underexplored, mechanism by which somatic mutations contribute to cancer development. The expression of most human genes is controlled by multiple enhancers, and therefore, it is conceivable that regulatory SMs are distributed across different enhancers of the same target gene. Yet, to date, most statistical searches for regulatory SMs have considered each regulatory element individually, which may reduce statistical power. The first challenge in considering the cumulative activity of all the enhancers of a gene as a single unit is to map enhancers to their target promoters. Such mapping defines for each gene its set of regulating enhancers (termed "set of regulatory elements" (SRE)). Considering the multiple enhancers of each gene as one unit holds great promise for enhancing the identification of driver regulatory SMs. However, the success of this approach depends greatly on the availability of comprehensive and accurate enhancer-promoter (E-P) maps. To date, the discovery of driver regulatory SMs has been hindered by insufficient sample sizes and by statistical analyses that often considered each regulatory element separately. In this study, we analyzed more than 2,500 whole-genome sequencing (WGS) samples provided by The Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium (ICGC) in order to identify such driver regulatory SMs. Our analyses took into account the combinatorial aspect of gene regulation by considering all the enhancers that control the same target gene as one unit, based on E-P maps from three genomics resources. The identification of candidate driver noncoding SMs is based on their recurrence. We searched for SREs of genes that are "hotspots" for SMs (that is, they accumulate SMs at a significantly elevated rate). To test the statistical significance of the recurrence of SMs within a gene's SRE, we used both global and local background mutation rates. Using this approach, we detected numerous SM hotspots in seven different cancer types. To support the functional significance of these recurrent noncoding SMs, we further examined their association with the expression level of their target gene, using gene expression data provided by the ICGC and TCGA for samples that were also analyzed by WGS.
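
The recurrence test can be illustrated with a minimal sketch: a Poisson background model gives the probability of observing at least the reported number of SMs in an SRE by chance. The counts, SRE length, and background rate below are hypothetical, and the authors' exact global/local background estimators are not reproduced.

```python
from scipy.stats import poisson

def sre_hotspot_pvalue(observed_sms, sre_length_bp, n_samples, bg_rate_per_bp):
    """P-value that an SRE accumulates SMs at an elevated rate under a
    Poisson background model (illustrative, not the paper's exact test)."""
    expected = bg_rate_per_bp * sre_length_bp * n_samples
    # Probability of observing at least `observed_sms` mutations by chance
    return poisson.sf(observed_sms - 1, expected)

# Hypothetical numbers: 12 SMs across a 9 kb SRE in 2,500 WGS samples,
# with a local background rate of 2e-7 mutations per bp per sample
print(f"p = {sre_hotspot_pvalue(12, 9_000, 2_500, 2e-7):.3g}")
```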

Keywords: cancer genomics, enhancers, noncoding genome, regulatory elements

Procedia PDF Downloads 106
27832 A Multivariate 4/2 Stochastic Covariance Model: Properties and Applications to Portfolio Decisions

Authors: Yuyang Cheng, Marcos Escobar-Anel

Abstract:

This paper introduces a multivariate 4/2 stochastic covariance process generalizing the one-dimensional counterparts presented in Grasselli (2017). Our construction permits stochastic correlation not only among stocks but also among volatilities, also known as co-volatility movements, both driven by more convenient 4/2 stochastic structures. The parametrization is flexible enough to separate these types of correlation, permitting their individual study. Conditions for proper changes of measure and closed-form characteristic functions under risk-neutral and historical measures are provided, allowing for applications of the model to risk management and derivative pricing. We apply the model to an expected utility theory problem in incomplete markets. Our analysis leads to closed-form solutions for the optimal allocation and value function. Conditions are provided for well-defined solutions together with a verification theorem. Our numerical analysis highlights and separates the impact of key statistics on equity portfolio decisions, in particular, volatility, correlation, and co-volatility movements, with the latter being the least important in an incomplete market.
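
A minimal one-dimensional sketch of the 4/2 dynamics may help fix ideas: a CIR variance factor v drives the volatility a*sqrt(v) + b/sqrt(v), simulated here with a simple Euler scheme. All parameter values are hypothetical, and the paper's multivariate covariance construction and closed-form results are not reproduced.

```python
import numpy as np

def simulate_42_path(s0=100.0, v0=0.04, r=0.02, a=0.9, b=0.01,
                     kappa=2.0, theta=0.04, sigma=0.3, rho=-0.5,
                     T=1.0, n_steps=252, seed=0):
    """Euler scheme for a one-dimensional 4/2 model in the spirit of
    Grasselli (2017): dS/S = r dt + (a*sqrt(v) + b/sqrt(v)) dW1,
    with v a CIR process correlated with the stock shock."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s, v = s0, v0
    for _ in range(n_steps):
        z1, z2 = rng.standard_normal(2)
        w2 = rho * z1 + np.sqrt(1.0 - rho**2) * z2   # correlated Brownian shocks
        vol = a * np.sqrt(v) + b / np.sqrt(v)        # 4/2 volatility factor
        s *= np.exp((r - 0.5 * vol**2) * dt + vol * np.sqrt(dt) * z1)
        v = max(v + kappa * (theta - v) * dt
                + sigma * np.sqrt(v * dt) * w2, 1e-8)  # keep CIR factor positive
    return s, v

print(simulate_42_path())
```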

Keywords: stochastic covariance process, 4/2 stochastic volatility model, stochastic co-volatility movements, characteristic function, expected utility theory, verification theorem

Procedia PDF Downloads 159
27831 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique; its aim is to extract data structures in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem that minimizes a given cost function. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function whose values lie in the interval [0, 1]. In FCM clustering, the membership degrees are constrained so that the sum of a data object's memberships over all clusters equals one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization approaches have been incorporated into fuzzy c-means clustering; they introduce additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where the optimization problem aims to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because appropriate membership degrees lead to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
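
A compact sketch of entropy-regularized fuzzy clustering is given below, assuming the common formulation in which an entropy term with weight gamma turns the membership update into a softmax of negative squared distances; the paper's exact relative entropy term may differ.

```python
import numpy as np

def entropy_regularized_fcm(X, n_clusters, gamma=1.0, n_iter=100, seed=0):
    """Fuzzy clustering with an entropy regularizer: minimizing
    sum_ij u_ij * d_ij^2 + gamma * sum_ij u_ij * log u_ij, subject to
    memberships summing to one per point, yields softmax memberships."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # squared distances
        u = np.exp(-d2 / gamma)
        u /= u.sum(axis=1, keepdims=True)              # memberships in [0, 1], sum to 1
        centers = (u.T @ X) / u.sum(axis=0)[:, None]   # membership-weighted centers
    return u, centers

rng = np.random.default_rng(1)
X = np.vstack([rng.standard_normal((50, 2)), rng.standard_normal((50, 2)) + 4.0])
u, centers = entropy_regularized_fcm(X, n_clusters=2)
print(centers)
```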

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 264
27830 Learning Grammars for Detection of Disaster-Related Micro Events

Authors: Josef Steinberger, Vanni Zavarella, Hristo Tanev

Abstract:

Natural disasters cause tens of thousands of victims and massive material damage. We refer to all those events caused by natural disasters, such as harm to people and damage to infrastructure, vehicles, services, and resource supply, as micro events. This paper addresses the problem of micro-event detection in online media sources. We present a natural language grammar learning algorithm and apply it to online news. The algorithm in question is based on distributional clustering and detection of word collocations. We also explore the extraction of micro events from social media and describe a Twitter mining robot, which uses combinations of keywords to detect tweets that talk about the effects of disasters.
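
The keyword-combination idea behind the Twitter robot can be illustrated in a few lines; the keyword sets here are invented placeholders, not the learned grammars or the actual lexicons used by the system.

```python
# Hypothetical disaster and effect lexicons; the robot combines terms from
# both sets to flag tweets that talk about effects of disasters.
DISASTER_TERMS = {"earthquake", "flood", "hurricane", "wildfire"}
EFFECT_TERMS = {"collapsed", "injured", "evacuated", "destroyed", "outage"}

def mentions_micro_event(tweet: str) -> bool:
    words = set(tweet.lower().split())
    return bool(words & DISASTER_TERMS) and bool(words & EFFECT_TERMS)

print(mentions_micro_event("Bridge collapsed after the earthquake near the coast"))
```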

Keywords: online news, natural language processing, machine learning, event extraction, crisis computing, disaster effects, Twitter

Procedia PDF Downloads 484
27829 The Inherent Flaw in the NBA Playoff Structure

Authors: Larry Turkish

Abstract:

Introduction: The NBA is an example of mediocrity, and this will be evident in the following paper. The study examines and evaluates the characteristics of the NBA champions. As divisions and playoff teams increase, there is an increase in the probability that the champion originates from the mediocre category. Since its inception in 1947, the league has been mediocre and continues to be so to this day. Why does a professional league allow any team with a less than 50% winning percentage into the playoffs? As long as the finances flow into the league, owners will not change the current algorithm. The objective of this paper is to determine whether the regular season has meaning in finding an NBA champion. Statistical Analysis: The data originate from the NBA website. The following variables are part of the statistical analysis: Rank, the rank of a team relative to other teams in the league based on the regular season win-loss record; Winning Percentage of a team based on the regular season; Divisions, the number of divisions within the league; and Playoff Teams, the number of playoff teams relative to a particular season. The following statistical applications are applied to the data: Pearson product-moment correlation, analysis of variance, and factor and regression analysis. Conclusion: The results indicate that the divisional structure and the number of playoff teams have a negative effect on the winning percentage of playoff teams. They also prevent teams with higher winning percentages from accessing the playoffs. Recommendations: 1. Teams whose regular-season winning percentage is more than 1 standard deviation above the mean will have access to the playoffs (eliminates mediocre teams; see the sketch below). 2. Eliminate divisions (eliminates weaker teams from access to the playoffs). 3. Eliminate conferences (eliminates weaker teams from access to the playoffs). 4. Adopt a balanced regular-season schedule (reduces the number of regular-season games, creates equilibrium, and reduces bias), which will reduce the need for load management.
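
Recommendation 1 amounts to a simple z-score filter on regular-season winning percentages; a sketch with hypothetical records follows.

```python
import numpy as np

# Hypothetical regular-season winning percentages for a nine-team league
win_pct = np.array([0.78, 0.66, 0.61, 0.55, 0.52, 0.49, 0.45, 0.38, 0.30])
z = (win_pct - win_pct.mean()) / win_pct.std()
playoff_teams = win_pct[z > 1.0]   # more than 1 SD above the league mean
print(playoff_teams)               # only clearly above-average teams qualify
```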

Keywords: alignment, mediocrity, regression, z-score

Procedia PDF Downloads 133
27828 The Effect of Language and Literature Integration on the Teaching of English Vocabulary and Grammar in Secondary Schools in Zamfara State, Nigeria

Authors: Umar Bello

Abstract:

Literature has become an invaluable subject that adds great value to the teaching of the English language and contributes to the discovery of many other developed ideas. Literature produces an exhilarating impulse that imprints a lasting picture on the mind of a learner. Many researchers have devised various means and approaches to language teaching, yet these methods have remained unconvincing because they have produced only little result. Devising a method that eliminates monotony and boredom for learners is a good factor that enhances students' motivation to learn. In this sense, literature and language become unavoidable components that aid intellectual development. This study examines the indispensability of literature as a means of teaching English to secondary school classes. The researcher has developed many instructive activities which, it is believed, will help students to improve their study of grammar and vocabulary. The researcher used a quasi-experimental approach with an experimental group and a control group to find out how literature enhances the students' grammar as well as their vocabulary. The findings, analyzed using simple percentages, revealed a positive performance, with the experimental group doing better than the control group. The results make it clear that literature allows learners to pay more attention and develop more interest in their studies. In providing perspicacious linguistic development, literature therefore remains an essential tool for language teaching classrooms, enhancing learners' grammar and vocabulary usage.

Keywords: teaching vocabulary, integration, poetry, classroom

Procedia PDF Downloads 109
27827 Milling Simulations with a 3-DOF Flexible Planar Robot

Authors: Hoai Nam Huynh, Edouard Rivière-Lorphèvre, Olivier Verlinden

Abstract:

Manufacturing technologies are becoming continuously more diversified over the years. The increasing use of robots for various applications such as assembly, painting, and welding has also affected the field of machining. Machining robots can deal with larger workspaces than conventional machine tools at a lower cost and thus represent a very promising alternative for machining applications. Furthermore, their inherent structure gives them great flexibility of motion to reach any location on the workpiece with the desired orientation. Nevertheless, machining robots suffer from a lack of stiffness at their joints, restricting their use to applications involving low cutting forces, especially finishing operations. Vibratory instabilities may also happen while machining and deteriorate the precision, leading to scrap parts. Some researchers are therefore concerned with the identification of optimal parameters in robotic machining. This paper continues the development of a virtual robotic machining simulator in order to find optimized cutting parameters in terms of, for example, depth of cut or feed per tooth. The simulation environment combines an in-house milling routine (DyStaMill), which computes the cutting forces and material removal, with an in-house multibody library (EasyDyn), which is used to build a dynamic model of a 3-DOF planar robot with flexible links. The position of the robot end-effector subjected to milling forces is controlled through an inverse kinematics scheme while the positions of its joints are controlled separately. Each joint is actuated through a servomotor for which the transfer function has been computed in order to tune the corresponding controller. The output results feature the evolution of the cutting forces when the robot structure is deformable or not, and the tracking errors of the end-effector. Illustrations of the resulting machined surfaces are also presented. The consideration of the links' flexibility has highlighted an increase in the cutting force magnitude. This proof of concept will aim to enrich the database of results in robotic machining for potential improvements in production.

Keywords: control, milling, multibody, robotic, simulation

Procedia PDF Downloads 252
27826 Sampled-Data Model Predictive Tracking Control for Mobile Robot

Authors: Wookyong Kwon, Sangmoon Lee

Abstract:

In this paper, a sampled-data model predictive tracking control method is presented for mobile robots, which are modeled as constrained continuous-time linear parameter varying (LPV) systems. The presented sampled-data predictive controller is designed by a linear matrix inequality (LMI) approach. Based on the input delay approach, a controller design condition is derived by constructing a new Lyapunov function. Finally, a numerical example is given to demonstrate the effectiveness of the presented method.
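
As a flavor of the LMI machinery involved, the sketch below checks a basic Lyapunov LMI (find P > 0 with A^T P + P A < 0) for one vertex of an LPV system using CVXPY. The system matrix is an arbitrary stable example; the paper's actual sampled-data, input-delay LMI conditions are considerably more involved.

```python
import cvxpy as cp
import numpy as np

# Arbitrary stable vertex matrix of a hypothetical LPV system
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2),                 # P positive definite
               A.T @ P + P @ A << -eps * np.eye(2)]  # Lyapunov inequality
prob = cp.Problem(cp.Minimize(0), constraints)       # pure feasibility problem
prob.solve()
print(prob.status)
print(P.value)
```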

Keywords: model predictive control, sampled-data control, linear parameter varying systems, LPV

Procedia PDF Downloads 315
27825 The Use of Videoconferencing in a Task-Based Beginners' Chinese Class

Authors: Sijia Guo

Abstract:

The development of new technologies and the falling cost of high-speed Internet access have made it easier for institutions and language teachers to adopt different ways to communicate with students at a distance. The emergence of web-conferencing applications, which integrate text, chat, audio/video, and graphic facilities, offers great opportunities for language learning through the multimodal environment. This paper reports on data elicited from a Ph.D. study on the use of web-conferencing in the teaching of a first-year Chinese class in order to promote learners' collaborative learning. Firstly, a comparison of four desktop videoconferencing (DVC) tools was conducted to determine the pedagogical value of the videoconferencing tool Blackboard Collaborate. Secondly, the evaluation of 14 campus-based Chinese learners who conducted five one-hour online sessions via the multimodal environment reveals the users' choice of modes and their learning preferences. The findings show that the tasks designed for the web-conferencing environment contributed to the learners' collaborative learning and second language acquisition.

Keywords: computer-mediated communication (CMC), CALL evaluation, TBLT, web-conferencing, online Chinese teaching

Procedia PDF Downloads 311
27824 Investigating Interlayer Bonding in 3D Printing Pressure Vessel Applications

Authors: Cam Minh Tri Tien, Richard Fenrich, Tristan Shelley, Nam Mai-Duy, Allan Malano, Xuesen Zeng

Abstract:

Since additive manufacturing is a layer-by-layer deposition approach, good bonding quality between adjacent layers is critically important to achieve optimal mechanical performance, including applications in pressure vessels. The need to enhance the strength of printed products, especially in the build direction where layup gaps and voids exist between the printed layers, has garnered significant attention. The proposed research will focus on improving the current Fused Deposition Modelling (FDM) process to produce polymers reinforced with chopped fibers, utilizing a controlled heat zone to enhance the adhesion between printed layers. Energy will be applied to both printed and printing layers to improve the bonding strength between adjacent layers. Through the enhanced FDM process, the mechanical performance of composite parts will experience a substantial improvement, particularly in the build direction, as compared to current FDM methods. A combination of experimental, numerical, and analytical methods will be employed to demonstrate the enhanced performance of heat-controlled 3D printed parts.

Keywords: 3D Printing, pressure vessels, interlayer bonding, controlled heat

Procedia PDF Downloads 56
27823 Land Use Change Detection Using Satellite Images for Najran City, Kingdom of Saudi Arabia (KSA)

Authors: Ismail Elkhrachy

Abstract:

Determining land use change is an important component of regional planning for applications ranging from urban fringe change detection to monitoring land use change. These data are very useful for natural resources management. On the other hand, the technologies and methods of change detection have also evolved dramatically during the past 20 years, and it is well recognized that change detection has become one of the best methods for researching the dynamic change of land use from multi-temporal remotely sensed data. The objective of this paper is to assess, evaluate, and monitor land use change surrounding the area of Najran city, Kingdom of Saudi Arabia (KSA), using a Landsat image (June 23, 2009) and an ETM+ image (June 21, 2014). The post-classification change detection technique was applied. The two subset images of Najran city were compared on a pixel-by-pixel basis using the post-classification comparison method, the from-to change matrix was produced, and the land use change information was obtained. Three classes were obtained, urban, bare land, and agricultural land, from an unsupervised classification method using Erdas Imagine and ArcGIS software. Accuracy assessment of the classification was performed before calculating change detection for the study area. The obtained accuracy is between 61% and 87% for all the classes. Change detection analysis shows that between 2009 and 2014 the urban area increased rapidly by 73.2%, the agricultural area decreased by 10.5%, and the barren area decreased by 7%. The quantitative study indicated that 58.2 km² of the urban class remained unchanged, 70.3 km² was gained, and 16 km² was lost. For the bare land class, 586.4 km² remained unchanged, 53.2 km² was gained, and 101.5 km² was lost. For the agricultural class, 20.2 km² remained unchanged, 31.2 km² was gained, and 37.2 km² was lost.
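
The pixel-by-pixel from-to comparison reduces to a cross-tabulation of the two classified rasters; a minimal sketch follows, with tiny synthetic arrays standing in for the classified Landsat scenes and assumed class codes.

```python
import numpy as np

def change_matrix(map_2009, map_2014, n_classes=3):
    """Post-classification comparison: entry (i, j) counts pixels of class i
    in 2009 that are class j in 2014. Class codes are assumptions here:
    0 = urban, 1 = bare land, 2 = agriculture."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    np.add.at(m, (map_2009.ravel(), map_2014.ravel()), 1)
    return m  # diagonal = unchanged pixels, off-diagonal = from-to changes

# Tiny synthetic rasters in place of the classified 2009/2014 images
a = np.array([[0, 1], [2, 1]])
b = np.array([[0, 0], [2, 1]])
print(change_matrix(a, b))
```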

Keywords: land use, remote sensing, change detection, satellite images, image classification

Procedia PDF Downloads 527
27822 Timely Detection and Identification of Abnormalities for Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

The detection and identification of abnormalities in multivariate manufacturing processes are quite important in order to maintain good product quality. Unusual behaviors or events encountered during operation can have a serious impact on the process and product quality. Thus, they should be detected and identified as soon as possible. This paper focuses on the efficient representation of process measurement data for detecting and identifying abnormalities. This qualitative method is effective in representing the fault patterns of process data and is robust to measurement noise, so that reliable outcomes can be obtained. To evaluate its performance, a simulation process was utilized, and the effect of adopting linear and nonlinear methods in detection and identification was tested with different simulation data. The use of a nonlinear technique produced more satisfactory and more robust results for the simulation data sets. This monitoring framework can help operating personnel detect the occurrence of process abnormalities and identify their assignable causes on an on-line or real-time basis.
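
The abstract does not name the specific linear technique, so the sketch below uses PCA with a Hotelling T² alarm as a common linear baseline for this kind of monitoring; it is an illustrative stand-in, not necessarily the authors' method.

```python
import numpy as np

rng = np.random.default_rng(1)
normal = rng.standard_normal((500, 5))        # in-control training data (synthetic)
mu, sd = normal.mean(0), normal.std(0)
Z = (normal - mu) / sd
_, S, Vt = np.linalg.svd(Z, full_matrices=False)
k = 2                                          # retained principal components
P = Vt[:k].T                                   # loadings
lam = (S[:k] ** 2) / (len(Z) - 1)              # PC variances

def hotelling_t2(x):
    """T^2 score of a new measurement; large values signal an abnormality."""
    score = ((x - mu) / sd) @ P
    return float((score ** 2 / lam).sum())

fault = np.array([4.0, 0.1, -3.5, 0.2, 0.0])   # hypothetical abnormal sample
print(hotelling_t2(fault))                     # large value -> alarm
```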

Keywords: detection, monitoring, identification, measurement data, multivariate techniques

Procedia PDF Downloads 241
27821 Imputation of Urban Movement Patterns Using Big Data

Authors: Eusebio Odiari, Mark Birkin, Susan Grant-Muller, Nicolas Malleson

Abstract:

Big data typically refers to consumer datasets revealing some detailed heterogeneity in human behavior which, if harnessed appropriately, could potentially revolutionize our understanding of the collective phenomena of the physical world. Inadvertent missing values skew these datasets and compromise the validity of analyses based on them. Here we discuss a conceptually consistent strategy for identifying other relevant datasets to combine with available big data, to plug the gaps and to create the rich, comprehensive dataset required for subsequent analysis. Specifically, the emphasis is on how these methodologies can, for the first time, enable the construction of more detailed pictures of passenger demand and drivers of mobility on the railways. These methodologies can predict the influence of changes within the network (like a change in timetable or the impact of a new station), explain local phenomena outside the network (like rail-heading), and capture other impacts of urban morphology. Our analysis also reveals that our new imputation data model provides for more equitable revenue sharing amongst network operators who manage different parts of the integrated UK railways.

Keywords: big-data, micro-simulation, mobility, ticketing-data, commuters, transport, synthetic, population

Procedia PDF Downloads 232
27820 The Influence of Housing Choice Vouchers on the Private Rental Market

Authors: Randy D. Colon

Abstract:

Through a freedom of information request, data pertaining to Housing Choice Voucher (HCV) households have been obtained from the Chicago Housing Authority, including rent price and number of bedrooms per HCV household, community area, and zip code from 2013 to the first quarter of 2018. Similar data pertaining to the private rental market will be obtained through public records found through the United States Department of Housing and Urban Development. The datasets will be analyzed through statistical and mapping software to investigate the potential link between HCV households and distorted rent prices. Quantitative data will be supplemented by qualitative data to investigate the lived experience of Chicago residents. Qualitative data will be collected at community meetings in the Chicago Englewood neighborhood through participation in neighborhood meetings and informal interviews with residents and community leaders. The qualitative data will be used to gain insight into the lived experience of community leaders and residents of the Englewood neighborhood in relation to housing, the rental market, and HCV. While there is an abundance of quantitative data on this subject, this qualitative data is necessary to capture the lived experience of local residents affected by a changing rental market. This topic reflects concerns voiced by members of the Englewood community, and this study aims to keep the community relevant in its findings.

Keywords: Chicago, housing, housing choice voucher program, housing subsidies, rental market

Procedia PDF Downloads 123
27819 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy

Authors: Amir Tosson, Mohammad Reza, Christian Gutt

Abstract:

Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a groundbreaking approach—dynamic metadata schema—within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. This paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the prowess of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we showcase how embracing dynamic metadata standards bestows advantages, enhancing data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, this paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities.
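
A toy sketch of the dynamic-schema idea follows: a fixed core schema is extended at run time with experiment-specific fields and then used to validate a record. All field names are illustrative choices for an XPCS run, not an existing community standard.

```python
# Core fields shared by all experiments (illustrative)
CORE_SCHEMA = {"facility": str, "sample_id": str, "timestamp": str}

def extend_schema(core, extra_fields):
    """Return a new schema with experiment-specific fields added at run time."""
    schema = dict(core)
    schema.update(extra_fields)
    return schema

# Hypothetical XPCS-specific extension
xpcs_schema = extend_schema(CORE_SCHEMA, {
    "detector_distance_m": float,
    "exposure_time_s": float,
    "q_range_inv_nm": list,        # momentum-transfer range probed
})

record = {"facility": "ExampleSource", "sample_id": "S-042",
          "timestamp": "2023-09-01T12:00:00Z", "detector_distance_m": 5.0,
          "exposure_time_s": 0.1, "q_range_inv_nm": [0.01, 0.1]}
missing = [name for name in xpcs_schema if name not in record]
print("missing fields:", missing)
```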

Keywords: metadata, FAIR, data analysis, XPCS, IoT

Procedia PDF Downloads 67
27818 Investigation of Compressive Strength of Fly Ash-Based Geopolymer Bricks with Hierarchical Bayesian Path Analysis

Authors: Ersin Sener, Ibrahim Demir, Hasan Aykut Karaboga, Kadir Kilinc

Abstract:

Bayesian methods, which have a very wide range of applications, are applied to the data obtained from the experimental design of F class fly ash-based geopolymer brick production. In this study, the dependent variable is compressive strength; the independent variables are treatment type (oven and steam), treatment time, molding time, temperature, water absorption ratio, and density. The effect of the independent variables on compressive strength is investigated. There is no difference among treatment types, but there are correlations between the independent variables. Therefore, hierarchical Bayesian path analysis is applied. The analysis shows that the effects of treatment time, temperature, and density on compressive strength are relatively high, while those of molding time and water absorption ratio are relatively low.
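
A minimal Bayesian path model in the spirit of this analysis might look as follows, assuming (purely for illustration) that temperature influences density and that both influence compressive strength; the data are simulated, and the priors, path structure, and hierarchy are placeholders rather than the authors' model.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
temp = rng.normal(0, 1, 60)                            # standardized temperature
density = 0.6 * temp + rng.normal(0, 0.5, 60)          # synthetic path data
strength = 0.8 * density + 0.3 * temp + rng.normal(0, 0.5, 60)

with pm.Model():
    a = pm.Normal("temp_to_density", 0, 1)             # path coefficients
    b = pm.Normal("density_to_strength", 0, 1)
    c = pm.Normal("temp_to_strength", 0, 1)
    s1 = pm.HalfNormal("s1", 1)
    s2 = pm.HalfNormal("s2", 1)
    pm.Normal("density_obs", mu=a * temp, sigma=s1, observed=density)
    pm.Normal("strength_obs", mu=b * density + c * temp, sigma=s2,
              observed=strength)
    idata = pm.sample(1000, tune=1000, progressbar=False)

print(float(idata.posterior["density_to_strength"].mean()))
```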

Keywords: experimental design, F class fly ash, geopolymer bricks, hierarchical Bayesian path analysis

Procedia PDF Downloads 389
27817 The Effect of Recycling on Price Volatility of Critical Metals in the EU (2010-2019): An Application of Multivariate GARCH Family Models

Authors: Marc Evenst Jn Jacques, Sophie Bernard

Abstract:

Electrical and electronic applications, as well as rechargeable batteries, are common in any economy. They also contain a number of important and valuable metals. It is critical to investigate the impact of these new material or volume sources on metal market dynamics. This paper investigates the impact of responsible recycling within the European region on metal price volatility. As far as we know, no empirical studies have been conducted to assess the role of metal recycling in metal market price volatility. The goal of this paper is to test the claim that metal recycling helps to cushion price volatility. A set of circular economy indicators/variables, namely 1) annual total trade values of recycled metals, 2) annual volume of scrap traded, 3) circular material use rate, and 4) information about recycling, is used to estimate the volatility of monthly spot prices of regular metals. A combination of the GARCH-MIDAS model for mixed-frequency data sampling and a simple GARCH(1,1) model for same-frequency variables was adopted to examine the potential links between each variable and price volatility. We discovered that from 2010 to 2019, except for nickel, scrap consumption (millions of tons), scrap trade values, and the recycled material use rate had no significant impact on the price volatility of standard metals (aluminum, lead) and precious metals (gold and platinum). Worldwide interest in recycling has no impact on returns or volatility. Specific interest in metal recycling did have a link to the mean return equation for aluminum and gold and to the volatility equation for lead and nickel.
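
For the single-frequency leg of the analysis, a GARCH(1,1) fit with the Python arch package looks as sketched below; the return series is simulated, and the GARCH-MIDAS component with mixed-frequency recycling regressors is not reproduced here.

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(42)
# Stand-in for ~10 years of monthly metal returns, in percent
returns = 2.5 * rng.standard_normal(120)

am = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
print(res.params)   # omega, alpha[1], beta[1]: the GARCH(1,1) volatility parameters
```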

Keywords: recycling, circular economy, price volatility, GARCH, mixed data sampling

Procedia PDF Downloads 62
27816 Exploring Suitable SSD Allocation Schemes in Compliance with Workload Patterns

Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim

Abstract:

Whether data have been well parallelized is an important factor in Solid-State Drive (SSD) performance. SSD parallelization is affected by the allocation scheme, and it is directly connected to SSD performance. The representative allocation schemes are dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write operation parallelism, while static allocation is better for read operation parallelism. Therefore, it is hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated a few mixed data patterns and analyzed the results to help make the right choice for better performance. The results show that static allocation is more suitable when the data arrival interval is long enough for prior operations to finish and the environment is continuously read-intensive. Dynamic allocation performs best in write performance and on random data patterns.

Keywords: dynamic allocation, NAND flash based SSD, SSD parallelism, static allocation

Procedia PDF Downloads 342
27805 Exchange Bias in Ceramics: From Polyol-Made CoFe₂O₄-Core@CoO-Shell NPs to Nanostructured Ceramics

Authors: N. Flores-Martinez, G. Franceschin, T. Gaudisson, J.-M. Greneche, R. Valenzuela-Monjaras, S. Ammar

Abstract:

Tailoring bulk materials while keeping their nanoscale properties is the daydream of materials scientists, and in magnetism especially, this single desire could revolutionize our everyday life. Now, thanks to synthesis methods based on the combination of colloidal chemistry (CC) with flash sintering (FS), customizing magnets becomes ever easier, cheaper, and cleaner. Although CC directly yields nanopowders with good magnetic features, such as the exchange bias (EB) phenomenon, powders as such are not very attractive for applications. Since a solid material is simpler to manipulate and integrate into a device, many consolidation methods have been tested with the aim of keeping the nanopowders' characteristics after consolidation. Unfortunately, the lack of structural crystalline arrangement and the grain growth worsen the magnetic properties. In this work, we exhibit, to the best of the authors' knowledge for the first time, EB in sintered ceramics, starting from CoFe₂O₄-core@CoO-shell NPs obtained by CC. Although the EB field is about 28 mT in the ceramics and is not yet considered for applications, this work opens an alternative route for permanent magnet fabrication through an FS method, spark plasma sintering, starting from CC-synthesized nanopowders.

Keywords: core-shell nanoparticles, exchange bias, nanostructured ceramics, spark plasma sintering

Procedia PDF Downloads 152
27814 Hydrological Analysis for Urban Water Management

Authors: Ranjit Kumar Sahu, Ramakar Jha

Abstract:

Urban water management is the practice of managing freshwater, wastewater, and storm water as components of a basin-wide management plan. It builds on existing water supply and sanitation considerations within an urban settlement by incorporating urban water management within the scope of the entire river basin. The pervasive problems generated by urban development have prompted the present work to study the spatial extent of urbanization in the Golden Triangle of Odisha connecting the cities of Bhubaneswar (20.2700° N, 85.8400° E), Puri (19.8106° N, 85.8314° E), and Konark (19.9000° N, 86.1200° E), and the patterns of periodic changes in urban development (systematic/random), in order to develop future plans for (i) urbanization promotion areas and (ii) urbanization control areas. Using remote sensing with USGS (U.S. Geological Survey) Landsat 8 maps, supervised classification of the urban sprawl has been carried out for the period 1980-2014, specifically after 2000. This work presents the following: (i) time series analysis of hydrological data (groundwater and rainfall), (ii) application of SWMM (Storm Water Management Model) and other soft computing techniques for urban water management, and (iii) uncertainty analysis of model parameters (urban sprawl and correlation analysis). The outcomes of the study, discussed briefly, show drastic growth in urbanization and depletion of groundwater levels in the area. Related outcomes, such as a declining trend in rainfall and a rise in sand mining in the local vicinity, are also discussed. Research of this kind will (i) improve water supply and consumption efficiency, (ii) upgrade drinking water quality and wastewater treatment, (iii) increase the economic efficiency of services to sustain operations and investments for water, wastewater, and storm water management, and (iv) engage communities to reflect their needs and knowledge for water management.

Keywords: Storm Water Management Model (SWMM), uncertainty analysis, urban sprawl, land use change

Procedia PDF Downloads 429
27813 Power Quality Modeling Using Recognition Learning Methods for Waveform Disturbances

Authors: Sang-Keun Moon, Hong-Rok Lim, Jin-O Kim

Abstract:

This paper presents power quality (PQ) modeling and filtering processes for distribution system disturbances using recognition learning methods. Typical PQ waveforms with mathematical applications and gathered field data are applied to the proposed models. The objective of this paper is to analyze PQ data with respect to monitoring, discriminating, and evaluating the waveforms of power disturbances, in order to support preventive protection against system failures and the estimation of complex system problems. Signal filtering techniques are examined for removing noise from field waveforms and for feature extraction. Using extraction and learning classification techniques, the efficiency was verified for the recognition of PQ disturbances, with a focus on interactive modeling methods. The waveforms of 8 selected disturbances are modeled with randomized parameters within the IEEE 1159 PQ ranges. The ranges, parameters, and weights are updated according to the field waveforms obtained. Currents go through the same process as voltages to obtain the waveform features, apart from some ratings and filters. Changing loads cause distortion in the voltage waveform because they draw different patterns of current variation. In conclusion, PQ disturbances in the voltage and current waveforms exhibit different types of variation and disturbance patterns, and a modified technique based on symmetrical components in the time domain is proposed in this paper for PQ disturbance detection and subsequent classification. Our method is based on the fact that the waveforms obtained from the suggested trigger conditions contain potential information for abnormality detection. The extracted features are sequentially applied to estimation and recognition learning modules for further studies.
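
The symmetrical-component quantity the proposed technique builds on can be computed with the standard Fortescue transform, sketched below on hypothetical phasors; the paper itself works on sampled waveforms captured by its trigger conditions.

```python
import numpy as np

a = np.exp(2j * np.pi / 3)                   # 120-degree rotation operator
F = np.array([[1, 1, 1],
              [1, a, a**2],
              [1, a**2, a]]) / 3             # Fortescue transform matrix

# Hypothetical, slightly unbalanced three-phase voltage phasors
v_abc = np.array([1.0, 0.8 * a**2, 0.9 * a])
v0, v1, v2 = F @ v_abc                       # zero, positive, negative sequence
print(abs(v0), abs(v1), abs(v2))             # imbalance appears in v0 and v2
```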

Keywords: power quality recognition, PQ modeling, waveform feature extraction, disturbance trigger condition, PQ signal filtering

Procedia PDF Downloads 191
27812 Social Data Aggregator and Locator of Knowledge (STALK)

Authors: Rashmi Raghunandan, Sanjana Shankar, Rakshitha K. Bhat

Abstract:

Social media contributes a vast amount of data and information about individuals to the internet. This project will greatly reduce the need for unnecessary manual analysis of large and diverse social media profiles by filtering out and combining the useful information from various social media profiles, eliminating irrelevant data. It differs from existing social media aggregators in that it does not provide a consolidated view of various profiles. Instead, it provides consolidated information derived from the subject's posts and other activities. It also allows analysis over multiple profiles and analytics based on several profiles. We strive to offer a query system that gives a natural language answer to questions when a user does not wish to go through the entire profile. The information provided can be filtered according to the different use cases it serves.

Keywords: social network, analysis, Facebook, Linkedin, git, big data

Procedia PDF Downloads 446
27811 Data Integrity between Ministry of Education and Private Schools in the United Arab Emirates

Authors: Rima Shishakly, Mervyn Misajon

Abstract:

Education is similar to other businesses and industries. Achieving data integrity is essential in order to attain significant support for all the stakeholders in the educational sector. Efficient data collection, flow, processing, storage, and retrieval are vital in order to deliver successful solutions to the different stakeholders. The Ministry of Education (MOE) in the United Arab Emirates (UAE) has adopted 'Education 2020', a series of five-year plans designed to introduce advanced education management information systems. As part of this program, in 2010 the MOE implemented Student Information Systems (SIS) to manage and monitor the students' data and information flow between the MOE and international private schools in the UAE. This paper discusses data integrity concerns between the MOE and private schools, clarifies the data integrity issues, and indicates the challenges that face private schools in the UAE.

Keywords: education management information systems (EMIS), student information system (SIS), United Arab Emirates (UAE), ministry of education (MOE), knowledge and human development authority (KHDA), Abu Dhabi education council (ADEC)

Procedia PDF Downloads 226
27810 Organic Thin-Film Transistors with High Thermal Stability

Authors: Sibani Bisoyi, Ute Zschieschang, Alexander Hoyer, Hagen Klauk

Abstract:

Organic thin-film transistors (TFTs) have great potential to be used for various applications such as flexible displays or sensors. For some of these applications, the TFTs must be able to withstand temperatures in excess of 100 °C, for example, to permit integration with devices or components that require high process temperatures, or to make it possible for the devices to be subjected to the standard sterilization protocols required for biomedical applications. In this work, we have investigated how the thermal stability of low-voltage small-molecule semiconductor dinaphtho[2,3-b:2’,3’-f]thieno[3,2-b]thiophene (DNTT) TFTs is affected by the encapsulation of the TFTs and by the ambient in which the thermal stress is performed. We also studied to which extent the thermal stability of the TFTs depends on the channel length. Some of the TFTs were encapsulated with a layer of vacuum-deposited Teflon, while others were left without encapsulation, and the thermal stress was performed either in nitrogen or in air. We found that the encapsulation with Teflon has virtually no effect on the thermal stability of our TFTs. In contrast, the ambient in which the thermal stress is conducted was found to have a measurable effect, but in a surprising way: when the thermal stress is carried out in nitrogen, the mobility drops to 70% of its initial value at a temperature of 160 °C and to close to zero at 170 °C, whereas when the stress is performed in air, the mobility remains at 75% of its initial value up to a temperature of 160 °C and at 60% up to 180 °C. To understand this behavior, we studied the effect of the thermal stress on the semiconductor thin-film morphology by scanning electron microscopy. While the DNTT films remain continuous and conducting when the heating is carried out in air, the semiconductor morphology undergoes a dramatic change, including the formation of large, thick crystals of DNTT and a complete loss of percolation, when the heating is conducted in nitrogen. We also found that when the TFTs are heated to a temperature of 200 °C in air, all TFTs with a channel length greater than 50 µm are destroyed, while TFTs with a channel length of less than 50 µm survive, whereas when the TFTs are heated to the same temperature (200 °C) in nitrogen, only the TFTs with a channel length smaller than 8 µm survive. This result is also linked to the thermally induced changes in the semiconductor morphology.

Keywords: organic thin-film transistors, encapsulation, thermal stability, thin-film morphology

Procedia PDF Downloads 352
27809 Nadler's Fixed Point Theorem on Partial Metric Spaces and its Application to a Homotopy Result

Authors: Hemant Kumar Pathak

Abstract:

In 1994, Matthews (S.G. Matthews, Partial metric topology, in: Proc. 8th Summer Conference on General Topology and Applications, in: Ann. New York Acad. Sci., vol. 728, 1994, pp. 183-197) introduced the concept of a partial metric as a part of the study of denotational semantics of data flow networks. He gave a modified version of the Banach contraction principle, more suitable in this context. In fact, (complete) partial metric spaces constitute a suitable framework to model several distinguished examples of the theory of computation and also to model metric spaces via domain theory. In this paper, we introduce the concept of an almost partial Hausdorff metric. We prove a fixed point theorem for multi-valued mappings on partial metric spaces using the concept of the almost partial Hausdorff metric, and prove an analogue of the well-known Nadler fixed point theorem. In the sequel, we derive a homotopy result as an application of our main result.
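
For reference, the partial metric axioms from Matthews (1994) can be stated as follows; the key departure from an ordinary metric is that the self-distance p(x, x) need not be zero.

```latex
% p : X \times X \to \mathbb{R}_{\ge 0} is a partial metric on X if,
% for all x, y, z \in X:
\begin{align*}
&\text{(P1)}\quad x = y \iff p(x,x) = p(x,y) = p(y,y),\\
&\text{(P2)}\quad p(x,x) \le p(x,y),\\
&\text{(P3)}\quad p(x,y) = p(y,x),\\
&\text{(P4)}\quad p(x,z) \le p(x,y) + p(y,z) - p(y,y).
\end{align*}
```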

Keywords: fixed point, partial metric space, homotopy, physical sciences

Procedia PDF Downloads 447
27808 Feasibility of an Extreme Wind Risk Assessment Software for Industrial Applications

Authors: Francesco Pandolfi, Georgios Baltzopoulos, Iunio Iervolino

Abstract:

The impact of extreme winds on industrial assets and the built environment is gaining increasing attention from stakeholders, including the corporate insurance industry. This has led to a progressively more in-depth study of building vulnerability and fragility to wind. Wind vulnerability models are used in probabilistic risk assessment to relate a loss metric to an intensity measure of the natural event, usually a gust or a mean wind speed. In fact, vulnerability models can be integrated with the wind hazard, which consists of associating a probability with each intensity level in a time interval (e.g., by means of return periods), to provide an assessment of future losses due to extreme wind. This has also given impetus to world- and regional-scale wind hazard studies. Another approach often adopted for the probabilistic description of building vulnerability to wind is the use of fragility functions, which provide the conditional probability that selected building components will exceed certain damage states, given the wind intensity. In fact, in the wind engineering literature, it is more common to find structural system- or component-level fragility functions than wind vulnerability models for an entire building. Loss assessment based on component fragilities requires some logical combination rules that define the building's damage state given the damage state of each component, and the availability of a consequence model that provides the losses associated with each damage state. When risk calculations are based on numerical simulation of a structure's behavior during extreme wind scenarios, the interaction of component fragilities is intertwined with the computational procedure. However, simulation-based approaches are usually computationally demanding and case-specific. In this context, the present work introduces the ExtReMe wind risk assESsment prototype Software, ERMESS, which is being developed at the University of Naples Federico II. ERMESS is a wind risk assessment tool for insurance applications to industrial facilities, collecting a wide assortment of available wind vulnerability models and fragility functions to facilitate their incorporation into risk calculations based on in-built or user-defined wind hazard data. This software implements an alternative method for building-specific risk assessment based on existing component-level fragility functions and on a number of simplifying assumptions for their interactions. The applicability of this alternative procedure is explored by means of an illustrative proof-of-concept example, which considers four main building components, namely: the roof covering, the roof structure, the envelope walls, and the envelope openings. The application shows that, despite the simplifying assumptions, the procedure can yield risk evaluations that are comparable to those obtained via more rigorous building-level simulation-based methods, at least in the considered example. The advantage of this approach lies in the fact that a database of building component fragility curves can be put to use for the development of new wind vulnerability models to cover building typologies not yet adequately covered by existing works and whose rigorous development is usually beyond the budget of portfolio-related industrial applications.
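
The vulnerability-hazard convolution at the core of such risk calculations can be sketched numerically: the expected annual loss is the integral of the mean loss ratio against the hazard density. Both curves below are hypothetical placeholders, not ERMESS models or data.

```python
import numpy as np

v = np.linspace(20, 80, 121)                        # gust speed, m/s
hazard = 0.5 * np.exp(-(v - 20) / 8.0)              # annual exceedance rate (assumed)
vulnerability = 1 / (1 + np.exp(-(v - 55) / 5.0))   # mean loss ratio (assumed)

density = -np.gradient(hazard, v)                   # hazard density -d(lambda)/d(im)
eal_ratio = np.trapz(vulnerability * density, v)    # expected annual loss ratio
print(f"expected annual loss ratio: {eal_ratio:.4f}")
```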

Keywords: component wind fragility, probabilistic risk assessment, vulnerability model, wind-induced losses

Procedia PDF Downloads 183
27807 Towards Balancing a Medical Database by Using the Least Mean Square Algorithm

Authors: Kamel Belammi, Houria Fatrim

Abstract:

An imbalanced data set, a problem often found in real-world applications, can cause a seriously negative effect on the classification performance of machine learning algorithms. There have been many attempts at dealing with the classification of imbalanced data sets. In medical diagnosis classification, we often face an imbalanced number of data samples between the classes, in which there are not enough samples in the rare classes. In this paper, we propose a learning method based on a cost-sensitive extension of the Least Mean Square (LMS) algorithm that penalizes the errors of different samples with different weights, together with some rules of thumb to determine those weights. After the balancing phase, we apply different classifiers (support vector machine (SVM), k-nearest neighbor (KNN), and multilayer neural networks (MNN)) to the balanced data set. We have also compared the results obtained before and after the balancing method.
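
A minimal sketch of the cost-sensitive LMS idea follows: each sample's error in the update is scaled by a class weight so that errors on the rare class push the coefficients harder. The weighting rule (inverse class frequency) is an illustrative stand-in for the paper's rules of thumb.

```python
import numpy as np

def weighted_lms(X, y, sample_weights, lr=0.01, epochs=50):
    """Cost-sensitive LMS: stochastic updates with per-sample error weights."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi, ci in zip(X, y, sample_weights):
            err = yi - xi @ w
            w += lr * ci * err * xi      # weighted LMS update
    return w

# Imbalanced toy data: 90 majority (y=0) vs 10 minority (y=1) samples
rng = np.random.default_rng(0)
X = np.c_[np.ones(100), rng.standard_normal((100, 2))]
y = np.r_[np.zeros(90), np.ones(10)]
weights = np.where(y == 1, 9.0, 1.0)     # inverse-frequency weighting (assumed)
print(weighted_lms(X, y, weights))
```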

Keywords: multilayer neural networks, k-nearest neighbor, support vector machine, imbalanced medical data, least mean square algorithm, diabetes

Procedia PDF Downloads 536
27806 Complementary Split Ring Resonator-Loaded Microstrip Patch Antenna Useful for Microwave Communication

Authors: Subal Kar, Madhuja Ghosh, Amitesh Kumar, Arijit Majumder

Abstract:

A complementary split-ring resonator (CSRR) loaded microstrip square patch antenna has been optimally designed with the help of the High Frequency Structure Simulator (HFSS). The antenna has been fabricated on the basis of the simulation design data and experimentally tested in an anechoic chamber to evaluate its gain, bandwidth, efficiency, and polarization characteristics. The CSRR-loaded microstrip patch antenna has been found to achieve significant size miniaturization (to the extent of 24%) compared to a conventional microstrip patch antenna, with both operating at the same frequency (5.2 GHz). The fabricated antenna realizes a maximum gain of 4.17 dB, a 10 dB impedance bandwidth of 34 MHz, an efficiency of 50.73%, and a maximum cross-polarization level 10.56 dB down at the operating frequency. This practically designed antenna, with its miniaturized size, is expected to be useful for airborne and spaceborne applications at microwave frequencies.
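
Consistent with the LC resonator keyword, the CSRR etched in the patch is commonly modeled as a parallel LC tank whose quasi-static resonance sets the operating frequency; the element values extracted from the geometry are not given in the abstract.

```latex
% Quasi-static resonance of the CSRR modeled as a parallel L_c C_c tank:
f_0 = \frac{1}{2\pi\sqrt{L_c\,C_c}}
```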

Keywords: split ring resonator, metamaterial, CSRR loaded patch antenna, microstrip patch antenna, LC resonator

Procedia PDF Downloads 360
27805 Review of the Road Crash Data Availability in Iraq

Authors: Abeer K. Jameel, Harry Evdorides

Abstract:

Iraq is a middle-income country where road crashes are considered one of the leading causes of death. To control the road risk issue, the Iraqi Ministry of Planning, General Statistical Organization, started to organise a collection system for traffic accident data with details related to their causes and severity. These data are published as an annual report. In this paper, a review of the available crash data in Iraq is presented. The available data represent the rate of accidents at an aggregated level, classified according to accident type, road user details, crash severity, type of vehicle, causes, and number of casualties. The review is organized according to the types of models used in road safety studies and research, and according to the road safety data required in road construction tasks. The available data are also compared with the road safety dataset published in the United Kingdom as an example of a developed country. It is concluded that the data in Iraq are suitable for descriptive and exploratory models, aggregated-level comparison analysis, and evaluating and monitoring the progress of overall traffic safety performance. However, important traffic safety studies require disaggregated data and details related to the factors affecting the likelihood of traffic crashes. Some studies require spatial geographic details, such as the locations of accidents, which are essential for ranking roads according to their level of safety and for naming the most dangerous roads in Iraq, an issue which requires a tactical plan to control. Global road safety agencies interested in solving this problem in low- and middle-income countries have designed road safety assessment methodologies that are based on road attribute data only. Therefore, in this research, it is recommended to use one of these methodologies.

Keywords: road safety, Iraq, crash data, road risk assessment, The International Road Assessment Program (iRAP)

Procedia PDF Downloads 261
27804 Role of Nanotechnology in the Remediation of Poly- and Perfluoroalkyl Substances-Contaminated Soil and Groundwater

Authors: Leila Alidokht

Abstract:

PFAS (poly- and perfluoroalkyl substances) are a large collection of environmentally persistent organic chemicals of industrial origin that have a negative influence on human health and ecosystems. Many distinct PFAS (on the order of thousands) are being utilized in a wide range of applications, and there is no comprehensive source of information on the many different compounds and their roles in diverse applications. Facilities are increasingly looking into ways to reduce waste from cleanup projects. PFAS are widespread in the environment, have been found in a wide range of human biomonitoring investigations, and are a rising source of regulatory concern for federal, state, and local governments. Nanotechnology has the potential to contribute considerably to the creation of cleaner, greener technologies with considerable environmental and health benefits. Nanotechnology approaches are being studied for their potential to provide pollution management and mitigation options, as well as to increase the effectiveness of standard environmental cleanup procedures. Diverse nanoparticles have proven useful in removing certain pollutants from their original environment, such as sewage spills and landmines. Furthermore, they have a low hazardous effect during production and can thus be explored thoroughly in the future to make them more compatible with lower production costs.

Keywords: PFOS, PFOA, PFAS, soil remediation

Procedia PDF Downloads 117