Search results for: paper analysis techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 47909

47759 Suggestion for Malware Detection Agent Considering Network Environment

Authors: Ji-Hoon Hong, Dong-Hee Kim, Nam-Uk Kim, Tai-Myoung Chung

Abstract:

The number of smartphone users is increasing rapidly. Accordingly, many companies have adopted BYOD (Bring Your Own Device) policies, which allow employees to use private smartphones at work in order to increase efficiency. However, smartphones are constantly exposed to malware, so a company network to which smartphones connect is exposed to serious risks. Most smartphone malware detection techniques perform independent detection, that is, they inspect a single target application. In this paper, we analyze a variety of intrusion detection techniques and, based on the results of this analysis, propose a malware detection agent that uses a network IDS.

Keywords: android malware detection, software-defined network, interaction environment

Procedia PDF Downloads 433
47758 An Overview of Adaptive Channel Equalization Techniques and Algorithms

Authors: Navdeep Singh Randhawa

Abstract:

Wireless communication has become the dominant mode of communication. However, a wireless channel impairs the information transmitted through it: attenuation, distortion, delays, and phase shifts affect the signals arriving at the receiver because of the channel's band-limited and dispersive nature. One such impairment is inter-symbol interference (ISI), a major obstacle in high-speed communication. Accurate equalization techniques are therefore needed to remove this effect and achieve error-free communication, and several have been proposed in the literature. This paper reviews channel equalization techniques, introduces the adaptive filter equalizer and its algorithms (LMS and RLS), and discusses applications of adaptive equalization.
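
The LMS update named above can be illustrated in a few lines. The following is a minimal sketch of an LMS-trained FIR equalizer in Python; the tap count, step size, and variable names are assumptions made for the example, not values from the paper:

```python
import numpy as np

def lms_equalizer(received, training, n_taps=11, mu=0.01):
    """Train an FIR equalizer on a known training sequence with the LMS rule."""
    received = np.asarray(received, dtype=float)
    training = np.asarray(training, dtype=float)
    w = np.zeros(n_taps)                      # equalizer tap weights
    y = np.zeros(len(received))               # equalizer output
    e = np.zeros(len(received))               # error signal
    for n in range(n_taps, len(received)):
        x = received[n - n_taps:n][::-1]      # most recent samples first
        y[n] = w @ x                          # filter output
        e[n] = training[n] - y[n]             # error against the known symbol
        w += 2 * mu * e[n] * x                # steepest-descent weight update
    return w, y, e
```

An RLS equalizer replaces the fixed step size with a recursively updated inverse correlation matrix, trading higher per-sample cost for faster convergence.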

Keywords: channel equalization, adaptive equalizer, least mean square, recursive least square

Procedia PDF Downloads 450
47757 Electrocardiogram Signal Denoising Using a Hybrid Technique

Authors: R. Latif, W. Jenkal, A. Toumanari, A. Hatim

Abstract:

This paper presents an efficient method for electrocardiogram (ECG) signal denoising based on a hybrid approach. Two techniques are combined to create the denoising process: an Adaptive Dual Threshold Filter (ADTF) and the Discrete Wavelet Transform (DWT). The approach proceeds in three steps: DWT decomposition, ADTF filtering, and correction of the highest peaks. The paper demonstrates the approach on ECG signals from the MIT-BIH database, with results that are promising compared to other recently published techniques.
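
For context, a generic DWT soft-thresholding pass can be sketched as below, assuming the PyWavelets package; the paper's ADTF and highest-peaks correction steps are not reproduced here:

```python
import numpy as np
import pywt

def dwt_denoise(ecg, wavelet="db4", level=4):
    """Plain DWT soft-threshold denoising (universal threshold), as a baseline."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest detail band
    thr = sigma * np.sqrt(2 * np.log(len(ecg)))            # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(ecg)]
```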

Keywords: hybrid technique, ADTF, DWT, thresholding, ECG signal

Procedia PDF Downloads 322
47756 Recent Advances in Data Warehouse

Authors: Fahad Hanash Alzahrani

Abstract:

This paper describes recent advances in the quickly developing area of data storage and processing based on data warehouses and data mining. These advances span software, hardware, data mining algorithms, and visualisation techniques, and they share common features across the specific problems and tasks to which they are applied.

Keywords: data warehouse, data mining, knowledge discovery in databases, on-line analytical processing

Procedia PDF Downloads 404
47755 Survey on Arabic Sentiment Analysis in Twitter

Authors: Sarah O. Alhumoud, Mawaheb I. Altuwaijri, Tarfa M. Albuhairi, Wejdan M. Alohaideb

Abstract:

Large-scale data stream analysis has recently become an important business and research priority. Social networks such as Twitter and other micro-blogging platforms hold enormous amounts of data characterized by high volume, velocity, and variety, and extracting valuable information and trends from these data aids better understanding and decision-making. Many analysis techniques have been deployed for English content; in contrast, Arabic, one of the languages producing the largest amount of data over social networks, remains among the least analyzed. This paper surveys the research efforts to analyze Arabic content on Twitter, focusing on the tools and methods used to extract sentiment from Arabic tweets.

Keywords: big data, social networks, sentiment analysis, twitter

Procedia PDF Downloads 576
47754 A Review on Existing Challenges of Data Mining and Future Research Perspectives

Authors: Hema Bhardwaj, D. Srinivasa Rao

Abstract:

Big data refers to technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets. Big data mining and analysis are extremely helpful for business activities such as decision-making, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data also brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations; these unique problems call for new computational and statistical paradigms. This paper offers an overview of the literature on big data mining, its process, and its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, and these affect their decision-making. Terabytes of data are produced every day, yet only around 1% of that data is ever analyzed. The paper presents the ideas of data mining, data analysis, and knowledge discovery techniques that have recently been developed, together with practical application systems, and concludes with a list of issues and difficulties for further research, including the main big data and data mining challenges faced by management.

Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges

Procedia PDF Downloads 110
47753 3D Object Detection for Autonomous Driving: A Comprehensive Review

Authors: Ahmed Soliman Nagiub, Mahmoud Fayez, Heba Khaled, Said Ghoniemy

Abstract:

Accurate perception is a critical component in enabling autonomous vehicles to understand their driving environment, and the acquisition of 3D information about objects, including their location and pose, is essential for achieving this understanding. This survey paper presents a comprehensive review of 3D object detection techniques specifically tailored for autonomous vehicles. The survey begins with an introduction to 3D object detection, elucidating the significance of the third dimension in perceiving the driving environment. It explores the types of sensors utilized in this context and the corresponding data extracted from these sensors. It also investigates the different types of datasets employed, including their formats and sizes, and provides a comparative analysis. Furthermore, the paper categorizes and thoroughly examines the perception methods employed for 3D object detection based on the diverse range of sensors utilized; each method is evaluated on its effectiveness in accurately detecting objects in three-dimensional space, and the evaluation metrics used to assess the performance of these methods are discussed. By offering a comprehensive overview of 3D object detection techniques for autonomous vehicles, this survey aims to advance the field of perception systems and to serve as a valuable resource for researchers and practitioners, providing insights into the techniques, sensors, and evaluation metrics employed.

Keywords: computer vision, 3D object detection, autonomous vehicles, deep learning

Procedia PDF Downloads 62
47752 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today's modern age of technology, vast amounts of data need to be processed in real time to keep users satisfied. These data come from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices, which use different protocols, including TCP, UDP, and HTTP/S, to communicate with web servers and eventually with users. The data obtained from these devices may provide valuable information, but they are mostly in an unreadable format and need to be processed to yield information and business intelligence. These data are not always current; they are mostly historical, and they are not subject to the consistency and redundancy measures that most other data are. Most important to users is that the data be pre-processed into a readable format when entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers use various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since processing the data usually takes some time, this keeps the database busy and locked for the duration of the processing, which decreases the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, in order to establish the impact on CPU, storage, and processing-time performance.
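
The three-step split can be pictured with a small sketch; SQLite and the table and column names below are placeholders chosen for illustration, not details from the paper:

```python
import sqlite3

def pull_process_push(db_path, decode):
    """Three-step variant: the database is locked only during the short pull and push phases."""
    # Step 1: pull the raw records into an in-memory list, then release the connection
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute("SELECT id, raw_payload FROM raw_messages").fetchall()

    # Step 2: process outside the database (no locks are held here)
    processed = [(decode(payload), row_id) for row_id, payload in rows]

    # Step 3: push the decoded values back in a single batch
    with sqlite3.connect(db_path) as conn:
        conn.executemany("UPDATE raw_messages SET decoded = ? WHERE id = ?", processed)
```

Because the slow decoding work happens in step 2, outside any database transaction, the server remains free to serve other clients while the processing runs.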

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 244
47751 Micro-CT Imaging Of Hard Tissues

Authors: Amir Davood Elmi

Abstract:

From the earliest light microscopes to the most innovative X-ray imaging techniques, imaging methods have refined and improved our knowledge of the organization and composition of living tissues. The older techniques are time consuming and ultimately destructive to the tissues under examination. In recent decades, thanks to advances in technology, non-destructive visualization techniques such as X-ray computed tomography (CT), magnetic resonance imaging (MRI), selective plane illumination microscopy (SPIM), and optical projection tomography (OPT) have come to the forefront. Among these techniques, CT is excellent for mineralized tissues such as bone or dentine. In addition, CT is faster than the other aforementioned techniques, and the sample remains intact. In this article, the applications, advantages, and limitations of micro-CT are discussed, together with some information on micro-CT of soft tissue.

Keywords: Micro-CT, hard tissue, bone, attenuation coefficient, rapid prototyping

Procedia PDF Downloads 142
47750 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data

Authors: S. Nickolas, Shobha K.

Abstract:

The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications and are an unavoidable problem in big data management and analysis. Researchers have introduced numerous techniques for handling missing data, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary algorithms or optimized techniques, and hot-deck imputation. Among these, imputation techniques play a positive role in filling missing values when it is necessary to use all records in the data rather than discard those with missing values. In this paper, we propose a novel artificial-neural-network-based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly. It carries out model-based clustering using competitive learning and a self-stabilizing mechanism in a dynamic environment without supervision. The proposed approach not only imputes missing values but also provides information about handling outliers.
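
ART2 itself is not reproduced here, but the general idea of cluster-based imputation can be sketched with k-means standing in for ART2; scikit-learn and the cluster count are assumptions made for the example:

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_impute(X, n_clusters=5):
    """Replace missing numeric values with the mean of the record's cluster."""
    X = np.asarray(X, dtype=float)
    missing = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    filled = np.where(missing, col_means, X)               # provisional global-mean fill
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(filled)
    for k in range(n_clusters):
        members = labels == k
        cluster_means = np.nanmean(X[members], axis=0)
        cluster_means = np.where(np.isnan(cluster_means), col_means, cluster_means)
        rows, cols = np.where(missing & members[:, None])  # missing cells in this cluster
        filled[rows, cols] = cluster_means[cols]
    return filled
```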

Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing

Procedia PDF Downloads 274
47749 Analysis of Supply Chain Risk Management Strategies: Case Study of Supply Chain Disruptions

Authors: Marcelo Dias Carvalho, Leticia Ishikawa

Abstract:

Supply chain risk management refers to a set of strategies used by companies to avoid supply chain disruption caused by damage to production facilities, natural disasters, capacity issues, inventory problems, incorrect forecasts, and delays. Many companies use the techniques of the Toyota Production System, which in some respects works against better management of supply chain risks. This paper studies key events in some multinationals to analyze the trade-off between the best supply chain risk management techniques and management policies designed to create lean enterprises. The result of a good balance between these actions is a reduction of losses, increased customer trust in the company, and better preparedness to face the general risks of a supply chain.

Keywords: just in time, lean manufacturing, supply chain disruptions, supply chain management

Procedia PDF Downloads 338
47748 Disparities in the Levels of Economic Development in Uttar Pradesh: A Regional Analysis

Authors: Naushaba Naseem Ahmed

Abstract:

Economic development depends not merely on its level but also on its distribution. The fruits of development are not equally distributed among different sections of the population and different parts of the country, which causes regional disparities in the levels of socio-economic development. Different parts of the country have different resource endowments in terms of natural, human, and capital resources. Even under uniform conditions for growth, areas with better resources are favourably placed and grow comparatively faster than other areas; thus, at every stage of development, the gap between resourceful and less resourceful areas widens. This paper attempts to highlight the levels of disparity in economic development with the help of selected variables. Principal component analysis, correlation, and the coefficient of variation are the techniques used, applied to published data. The results show that the Western region of Uttar Pradesh is the most developed, followed by the Central region. There is an urgent need for investment and development policies for backward regions such as the Bundelkhand region of Uttar Pradesh.
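
As an illustration of how the listed techniques are typically combined, the sketch below derives a first-principal-component development score per district and the coefficient of variation per indicator; the indicator matrix and the use of scikit-learn are assumptions, not the paper's actual data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def development_scores(indicators):
    """indicators: districts x indicators matrix of development variables."""
    Z = StandardScaler().fit_transform(indicators)            # put indicators on a common scale
    scores = PCA(n_components=1).fit_transform(Z).ravel()     # composite index per district
    cv = np.std(indicators, axis=0) / np.mean(indicators, axis=0)  # dispersion of each indicator
    return scores, cv
```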

Keywords: coefficient of variation, correlation, economic development, principal component analysis

Procedia PDF Downloads 261
47747 Uncertain Time-Cost Trade off Problems of Construction Projects Using Fuzzy Set Theory

Authors: V. S. S. Kumar, B. Vikram

Abstract:

The development of effective decision support tools for the construction industry is vital in today's world, since it can lead to substantial cost reduction and efficient resource consumption. Solving time-cost trade-off (TCT) problems and their variants is at the heart of scientific research on optimizing construction planning. In general, classical optimization techniques have difficulty dealing with TCT problems; one of the main reasons for their failure is that they can easily become trapped in local minima. This paper presents an investigation of the application of meta-heuristic techniques to two particular variants of time-cost trade-off analysis: the time-cost trade-off problem (TCT), in which the total project cost is minimized, and the time-cost trade-off optimization problem (TCO), in which the total project cost and the total project duration are minimized simultaneously. It is expected that the optimization models developed in this paper will contribute significantly to efficient planning and management of construction projects.

Keywords: fuzzy sets, uncertainty, optimization, time cost trade off problems

Procedia PDF Downloads 356
47746 The Economics of Ecosystem Services and Biodiversity: Valuing Ecotourism-Local Perspectives to Global Discourses-Stakeholders’ Analysis

Authors: Diptimayee Nayak

Abstract:

Ecotourism is a popular component of alternative tourism which claims to protect the host's local environment and economy. The concept of ecological tourism (ecotourism) becomes particularly meaningful when the recreational functions and services of a pristine ecosystem are evaluated in the context of The Economics of Ecosystems and Biodiversity (TEEB). Ecotourism is said to be a local solution to the global problem of conserving ecosystems and optimizing the use of their services. This paper takes the recreational services of an Indian protected-area ecosystem, the Bhitarakanika mangrove protected area, as a case study and discusses how ecotourism functions from the perspectives of different stakeholders. Specific stakeholders, namely tourists and local people, are analyzed, as they are believed to be the major beneficiaries of ecotourism. The stakeholder analysis is based on the travel cost technique (using a truncated Poisson distribution model) for tourists, and on descriptive and analytical tools for local people. Evaluating ecotourism through stakeholder analysis has gained impetus since the formulation of ecotourism guidelines by the Ministry of Environment and Forest (MoEF), Government of India. The paper concludes that ecotourism issues and challenges are site- and region-specific; without critically addressing the challenges of ecotourism faced at the local level, global ecotourism discourses cannot be tackled. Mere integration and top-down replication of global policies at the local level will not be successful; rather, mainstreaming local decision-making within the global policy structure helps to solve global issues to a much greater extent (bottom up).
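
A minimal sketch of the zero-truncated Poisson travel-cost model mentioned above, fitted by maximum likelihood; the covariate layout and starting values are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def fit_truncated_poisson(trips, X):
    """trips >= 1 per surveyed visitor; X holds travel cost and other covariates."""
    trips = np.asarray(trips, dtype=float)
    X = np.column_stack([np.ones(len(trips)), X])        # add an intercept

    def nll(beta):
        lam = np.exp(X @ beta)
        loglik = (trips * np.log(lam) - lam - gammaln(trips + 1)
                  - np.log1p(-np.exp(-lam)))             # zero-truncation correction
        return -loglik.sum()

    return minimize(nll, x0=np.zeros(X.shape[1]), method="BFGS").x
```

In the standard count-data travel-cost literature, the estimated coefficient on travel cost then yields per-trip consumer surplus as roughly -1/beta_cost.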

Keywords: ecosystem services, ecotourism, TEEB, economic valuation, stakeholders, travel cost techniques

Procedia PDF Downloads 248
47745 Extraction of Essential Oil From Orange Peels

Authors: Aayush Bhisikar, Neha Rajas, Aditya Bhingare, Samarth Bhandare, Amruta Amrurkar

Abstract:

In India, orange peels are currently thrown away as garbage once the edible components of the fruit have been consumed, yet the country depends on essential oils for use in industries producing food, beverages, cosmetics, and medicines. This study was conducted to show how orange peel can be used effectively in the production of essential oils through various extraction techniques. Steam distillation, water distillation, and solvent extraction were the techniques considered in this paper. Owing to its relative prevalence among the extraction techniques, solvent extraction was chosen, and Design Expert 7.0 was used to plan the experimental runs. After extraction, the oil was examined to ascertain its physical and chemical characteristics. It was determined from the outcomes that the orange peels.

Keywords: orange peels, extraction, essential oil, distillation

Procedia PDF Downloads 87
47744 Extraction of Essential Oil from Orange Peels

Authors: Neha Rajas, Aayush Bhisikar, Samarth Bhandare, Aditya Bhingare, Amruta Amrutkar

Abstract:

In India, orange peels are currently thrown away as garbage once the edible components of the fruit have been consumed, yet the country depends on essential oils for use in industries producing food, beverages, cosmetics, and medicines. This study was conducted to show how orange peel can be used effectively in the production of essential oils through various extraction techniques. Steam distillation, water distillation, and solvent extraction were the techniques considered in this paper. Owing to its relative prevalence among the extraction techniques, solvent extraction was chosen, and Design Expert 7.0 was used to plan the experimental runs. After extraction, the oil was examined to ascertain its physical and chemical characteristics. It was determined from the outcomes that the orange peels.

Keywords: orange peels, extraction, distillation, essential oil

Procedia PDF Downloads 80
47743 What the Future Holds for Social Media Data Analysis

Authors: P. Wlodarczak, J. Soar, M. Ally

Abstract:

The dramatic rise in the use of social media (SM) platforms such as Facebook and Twitter provides access to an unprecedented amount of user data. Users may post reviews of products and services they have bought, write about their interests, share ideas, or give their opinions and views on political issues. There is growing interest among organisations in analysing SM data in order to detect new trends, obtain user opinions on their products and services, or find out about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Indicators derived from historic SM data are often represented as time series and correlated with a variety of real-world phenomena such as the outcome of elections, the development of financial indicators, box office revenue, and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of the analysis methods using opinion mining and machine learning techniques.

Keywords: social media, text mining, knowledge discovery, predictive analysis, machine learning

Procedia PDF Downloads 423
47742 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models; its principle is to find relationships between explanatory variables and predicted variables, exploiting past occurrences to predict the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, such efforts have been curbed by the limits of classical predictive-analysis methods when faced with very large amounts of data. Because of their volume, their nature (semi-structured or unstructured), and their variety, big data cannot be analyzed efficiently with classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification and Regression Trees (CART) in order to adapt it for big data analysis. The major changes to the algorithm are presented, and a version of the extended algorithm is then defined to make it applicable to huge quantities of data.
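
The heart of CART, the exhaustive split search that the extension seeks to parallelize and distribute, can be sketched as a serial baseline; the authors' extended algorithm is not shown here:

```python
import numpy as np

def gini(y):
    """Gini impurity of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Find the (feature, threshold) pair minimizing weighted child impurity."""
    n, d = X.shape
    best = (None, None, np.inf)
    for j in range(d):                         # every feature...
        for t in np.unique(X[:, j]):           # ...and every candidate threshold
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue
            score = (left.sum() * gini(y[left]) + (~left).sum() * gini(y[~left])) / n
            if score < best[2]:
                best = (j, t, score)
    return best                                # feature index, threshold, impurity
```

The double loop over features and thresholds is exactly the part whose cost explodes on big data, which is why it is the natural target for parallel and distributed evaluation.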

Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm

Procedia PDF Downloads 142
47741 Analysis and Performance of Handover in Universal Mobile Telecommunications System (UMTS) Network Using OPNET Modeller

Authors: Latif Adnane, Benaatou Wafa, Pla Vicent

Abstract:

Handover is of great significance for achieving seamless connectivity in wireless networks. This paper gives an overview of the main factors affected by the soft and hard handover techniques. To understand the handover process in the Universal Mobile Telecommunications System (UMTS) network, different statistics are calculated. The paper focuses on the quality of service (QoS) of soft and hard handover in a UMTS network, including the analysis of received power, signal-to-noise ratio, throughput, traffic delay, traffic received, delay, total transmit load, end-to-end delay, and upload response time using the OPNET simulator.

Keywords: handover, UMTS, mobility, simulation, OPNET modeler

Procedia PDF Downloads 321
47740 Parameter Estimation with Uncertainty and Sensitivity Analysis for the SARS Outbreak in Hong Kong

Authors: Afia Naheed, Manmohan Singh, David Lucy

Abstract:

This work is a mathematical and statistical study of an SEIJTR deterministic model for interpreting the transmission of severe acute respiratory syndrome (SARS). Based on the SARS epidemic of 2003, the parameters are estimated using Runge-Kutta (Dormand-Prince pairs) and least squares methods, and graphical and numerical techniques are used to validate the estimates. The effect of the model parameters on the dynamics of the disease is then examined using sensitivity and uncertainty analysis. These analytical techniques are used to assess the effect of uncertainty in the obtained parameter estimates and to determine which parameters have the largest impact on controlling the disease dynamics.
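
The estimation procedure described, Runge-Kutta integration of the compartmental model combined with least squares, can be sketched for a simplified SEIR model; the full SEIJTR model and the actual SARS data are not reproduced here:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def seir(t, y, beta, sigma, gamma):
    """Simplified SEIR right-hand side, a stand-in for the full SEIJTR model."""
    S, E, I, R = y
    N = S + E + I + R
    return [-beta * S * I / N,
            beta * S * I / N - sigma * E,
            sigma * E - gamma * I,
            gamma * I]

def fit_parameters(t_obs, I_obs, y0):
    """Estimate (beta, sigma, gamma) from observed infective counts."""
    def residuals(p):
        sol = solve_ivp(seir, (t_obs[0], t_obs[-1]), y0, args=tuple(p),
                        t_eval=t_obs, method="RK45")   # Dormand-Prince pair
        return sol.y[2] - I_obs
    return least_squares(residuals, x0=[0.5, 0.2, 0.1], method="lm").x
```

Here method="RK45" corresponds to the Dormand-Prince pair and method="lm" to the Levenberg-Marquardt routine listed in the keywords.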

Keywords: infectious disease, severe acute respiratory syndrome (SARS), parameter estimation, sensitivity analysis, uncertainty analysis, Runge-Kutta methods, Levenberg-Marquardt method

Procedia PDF Downloads 361
47739 Performance Comparison of Outlier Detection Techniques Based Classification in Wireless Sensor Networks

Authors: Ayadi Aya, Ghorbel Oussama, M. Obeid Abdulfattah, Abid Mohamed

Abstract:

Nowadays, many wireless sensor networks (WSNs) have been deployed in the real world to collect valuable raw sensed data, and the challenge is to extract high-level knowledge from this huge amount of data. The identification of outliers can lead to the discovery of useful and meaningful knowledge. In the field of wireless sensor networks, an outlier is defined as a measurement that deviates from the normal behavior of the sensed data. Many outlier detection techniques for WSNs have been studied extensively over the past decade, focusing on classification-based algorithms that identify outliers in real transaction datasets. This survey aims to provide a structured and comprehensive overview of existing research on classification-based outlier detection techniques as applicable to WSNs. We identify the key hypotheses used by these approaches to differentiate between normal and outlier behavior and try to provide an easier and more succinct understanding of the classification-based techniques. Furthermore, we identify the advantages and disadvantages of the different classification-based techniques and present a comparative guide with useful paradigms for promoting outlier detection research in various WSN applications, together with suggestions for future research.
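
As a small example of the classification-based family surveyed here, a one-class SVM (one of the techniques listed in the keywords) can flag deviating sensor readings; the one-dimensional feature layout below is an assumption made for brevity:

```python
import numpy as np
from sklearn.svm import OneClassSVM

def flag_outliers(readings, nu=0.05):
    """Fit a boundary around 'normal' readings and flag points outside it."""
    X = np.asarray(readings, dtype=float).reshape(-1, 1)
    model = OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit(X)
    return model.predict(X) == -1      # True where a reading is an outlier
```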

Keywords: bayesian networks, classification-based approaches, KPCA, neural networks, one-class SVM, outlier detection, wireless sensor networks

Procedia PDF Downloads 496
47738 Mathematical Programming Models for Portfolio Optimization Problem: A Review

Authors: Mazura Mokhtar, Adibah Shuib, Daud Mohamad

Abstract:

The portfolio optimization problem has received a great deal of attention from both researchers and practitioners over the last six decades. This paper provides an overview of the current state of research in portfolio optimization supported by mathematical programming techniques. In addition, it surveys the solution algorithms for portfolio optimization models, classifying them according to their nature into heuristic and exact methods. To serve these purposes, 40 related articles appearing in international journals from 2003 to 2013 were gathered and analyzed. Based on the literature review, stochastic programming and goal programming are the mathematical programming techniques most frequently employed to tackle the portfolio optimization problem. It is hoped that the paper can meet the needs of researchers and practitioners as an easy reference on portfolio optimization.
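
One of the classical mathematical-programming formulations covered by such surveys is the Markowitz mean-variance model, sketched below; the long-only constraint and the SLSQP solver are choices made for the example:

```python
import numpy as np
from scipy.optimize import minimize

def min_variance_weights(returns, target_return):
    """Minimize portfolio variance subject to a target mean return and full investment."""
    mu = returns.mean(axis=0)               # expected return per asset
    cov = np.cov(returns, rowvar=False)     # covariance of asset returns
    n = len(mu)
    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},           # fully invested
            {"type": "eq", "fun": lambda w: w @ mu - target_return})  # hit the target return
    res = minimize(lambda w: w @ cov @ w, x0=np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n, constraints=cons, method="SLSQP")
    return res.x
```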

Keywords: portfolio optimization, mathematical programming, multi-objective programming, solution approaches

Procedia PDF Downloads 348
47737 Sentiment Analysis: An Enhancement of Ontological-Based Features Extraction Techniques and Word Equations

Authors: Mohd Ridzwan Yaakub, Muhammad Iqbal Abu Latiffi

Abstract:

Online business has become popular recently due to the massive amount of information and media available on the Internet. This has resulted in a huge number of reviews in which consumers share their opinions, criticisms, and satisfaction regarding the products they have purchased on websites or on social media such as Facebook and Twitter. Analyzing customer behavior has therefore become very important for organizations seeking new market trends and insights. The reviews from websites and social media consist of structured and unstructured data that require a sentiment analysis approach. This article defines the techniques used in sentiment analysis, defines the ontology, and describes its possible usage in sentiment analysis. This leads to empirical research on mobile phone reviews and the ontology used in the experiment. The researchers also explore the role of data preprocessing and feature selection methodology. As a result, an ontology-based approach to sentiment analysis can help achieve high accuracy in the classification task.

Keywords: feature selection, ontology, opinion, preprocessing data, sentiment analysis

Procedia PDF Downloads 200
47736 A Hybrid Method for Determination of Effective Poles Using Clustering Dominant Pole Algorithm

Authors: Anuj Abraham, N. Pappa, Daniel Honc, Rahul Sharma

Abstract:

In this paper, an analysis of some model order reduction techniques is presented. A new hybrid algorithm for model order reduction of linear time-invariant systems is compared with the conventional techniques, namely Balanced Truncation, Hankel norm reduction, and the Dominant Pole Algorithm (DPA). The proposed hybrid algorithm, known as the Clustering Dominant Pole Algorithm (CDPA), is able to compute the full set of dominant poles and their cluster centers efficiently. The dominant poles of a transfer function are specific eigenvalues of the state-space matrix of the corresponding dynamical system. The effectiveness of this novel technique is shown through simulation results.
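
For context, DPA-style methods commonly rank poles by the dominance measure |R_i| / |Re(p_i)| obtained from a partial-fraction expansion; the sketch below shows only that ranking step, not the clustering that defines CDPA:

```python
import numpy as np
from scipy.signal import residue

def dominant_poles(num, den, k=3):
    """Return the k poles of num/den with the largest |residue| / |Re(pole)|."""
    r, p, _ = residue(num, den)                       # residues and poles of the transfer function
    dominance = np.abs(r) / (np.abs(p.real) + 1e-12)  # guard against purely imaginary poles
    order = np.argsort(dominance)[::-1]
    return p[order][:k], dominance[order][:k]
```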

Keywords: balanced truncation, clustering, dominant pole, Hankel norm, model reduction

Procedia PDF Downloads 599
47735 Identification of Promising Infant Clusters to Obtain Improved Block Layout Designs

Authors: Mustahsan Mir, Ahmed Hassanin, Mohammed A. Al-Saleh

Abstract:

The layout optimization of building blocks of unequal areas has applications in many disciplines including VLSI floorplanning, macrocell placement, unequal-area facilities layout optimization, and plant or machine layout design. A number of heuristics and some analytical and hybrid techniques have been published to solve this problem. This paper presents an efficient high-quality building-block layout design technique especially suited for solving large-size problems. The higher efficiency and improved quality of optimized solutions are made possible by introducing the concept of Promising Infant Clusters in a constructive placement procedure. The results presented in the paper demonstrate the improved performance of the presented technique for benchmark problems in comparison with published heuristic, analytic, and hybrid techniques.

Keywords: block layout problem, building-block layout design, CAD, optimization, search techniques

Procedia PDF Downloads 386
47734 Dynamic Analysis of Viscoelastic Plates with Variable Thickness

Authors: Gülçin Tekin, Fethi Kadıoğlu

Abstract:

In this study, the dynamic analysis of viscoelastic plates with variable thickness is examined. The dynamic response of viscoelastic thin plates with variable thickness is obtained using the functional analysis method in conjunction with the Gâteaux differential. The four-node serendipity element with four degrees of freedom per node, namely deflection, bending moments, and twisting moment, is used, and boundary condition terms are included in the functional in a systematic way. For viscoelastic modelling, the three-parameter Kelvin solid model is employed. The solutions obtained in the Laplace-Carson domain are transformed to the real time domain using the MDOP, Dubner & Abate, and Durbin inverse transform techniques. Numerical examples are treated to test the performance of the proposed mixed finite element formulation.

Keywords: dynamic analysis, inverse laplace transform techniques, mixed finite element formulation, viscoelastic plate with variable thickness

Procedia PDF Downloads 331
47733 On Mathematical Modelling and Optimization of Emerging Trends Processes in Advanced Manufacturing

Authors: Agarana Michael C., Akinlabi Esther T., Pule Kholopane

Abstract:

Innovation in manufacturing process technologies and associated product design affects the prospects for manufacturing today and in the near future. In this study, some theoretical methods useful as tools in advanced manufacturing are considered; in particular, basic mathematical, operations research, heuristic, and statistical techniques are discussed. These techniques are very handy in many areas of advanced manufacturing processes, including process planning optimization, modelling, and analysis, and determining the production rate generally requires the application of mathematical methods. Emerging-trends processes in advanced manufacturing can be enhanced by using mathematical modelling and optimization techniques.

Keywords: mathematical modelling, optimization, emerging trends, advanced manufacturing

Procedia PDF Downloads 295
47732 Cost Reduction Techniques for Provision of Shelter to Homeless

Authors: Mukul Anand

Abstract:

Quality-oriented affordable shelter for all has always been the key issue in the housing sector of our country, and homelessness is the acute form of housing need. It is a paradox that, in spite of innumerable government-initiated programmes for affordable housing, certain sections of society are still devoid of shelter: in 2012, about nineteen million (18.78 million) households grappled with housing shortage in urban India. In the Indian scenario there is a major mismatch between the people for whom houses are being built and those who need them, and the prime obstacle faced by public authorities in facilitating quality housing for all is the high cost of construction. The present paper examines practicable techniques for reducing the cost factor in housing the homeless. The key factors responsible for delivering cheap housing stock, such as capacity building, resource optimization, innovative low-cost building materials, and indigenous skeleton housing systems, are incorporated in developing these techniques. Time performance, an important aspect of the above factors, is also explored so as to increase the effectiveness of low-cost housing. Best practices are taken up as case studies, citing both conventional housing techniques and innovative low-cost housing techniques. Transportation accounts for approximately 30% of the total construction budget, so the use of alternative local solutions, depending on the region, is covered to highlight major components of low-cost housing. The government lacks baseline information on the use of innovative low-cost methods and techniques of resource optimization; the paper is therefore an attempt to bring to light simpler solutions for achieving low-cost housing.

Keywords: construction, cost, housing, optimization, shelter

Procedia PDF Downloads 445
47731 Platform Urbanism: Planning towards Hyper-Personalisation

Authors: Provides Ng

Abstract:

The platform economy is a peer-to-peer model of distributing resources facilitated by community-based digital platforms. In recent years, digital platforms have been rapidly reconfiguring the public realm using hyper-personalisation techniques. This paper investigates how urban planning can leapfrog into the digital age to help relieve the rising tension of the global issue of labour flow; it discusses the means to transfer techniques of hyper-personalisation into urban planning for plasticity using platform technologies. The research first outlines the limitations of the current system of urban residency, in which the system maintains itself on the circulation of documents, which are data on paper. It then tabulates how some institutions around the world, both public and private, digitise data and streamline communications between networks of systems and citizens using platform technologies. Subsequently, the paper proposes ways in which hyper-personalisation can be utilised to form a digital planning platform, and it concludes by reviewing how the proposed strategy may help to open up new ways of thinking about how we affiliate ourselves with cities.

Keywords: platform urbanism, hyper-personalisation, digital inventory, urban accessibility

Procedia PDF Downloads 114
47730 A Survey of Feature Selection and Feature Extraction Techniques in Machine Learning

Authors: Samina Khalid, Shamila Nasreen

Abstract:

Dimensionality reduction, as a preprocessing step in machine learning, is effective in removing irrelevant and redundant data, increasing learning accuracy, and improving result comprehensibility. However, the recent increase in the dimensionality of data poses a severe challenge to many existing feature selection and feature extraction methods with respect to efficiency and effectiveness. In the fields of machine learning and pattern recognition, dimensionality reduction is an important area in which many approaches have been proposed. In this paper, some widely used feature selection and feature extraction techniques are analyzed with the aim of showing how effectively they can be used to achieve high performance of learning algorithms, ultimately improving the predictive accuracy of classifiers. A brief analysis of dimensionality reduction techniques is presented in order to investigate the strengths and weaknesses of some widely used methods.
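
The distinction the survey draws between selecting original features and extracting new ones can be shown in a few lines; scikit-learn and its bundled breast-cancer dataset are used purely as an illustration:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = load_breast_cancer(return_X_y=True)

# Feature selection: keep the 10 original features most informative about y
X_selected = SelectKBest(mutual_info_classif, k=10).fit_transform(X, y)

# Feature extraction: project all features onto 10 new PCA components
X_extracted = PCA(n_components=10).fit_transform(X)

print(X_selected.shape, X_extracted.shape)   # both are (n_samples, 10)
```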

Keywords: age related macular degeneration, feature selection, feature subset selection, feature extraction/transformation, FSA's, relief, correlation based method, PCA, ICA

Procedia PDF Downloads 496