Search results for: Knowledge based system
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16920


11880 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

In this paper, we consider and apply parametric modeling to experimental data from dynamical systems. We investigate different distributions of the output measurements of these systems. By processing the variance of the experimental data, we obtain the regions of nonlinearity in the data, and identification of the output section is then applied under different situations and data distributions. Finally, the effect of the spread of the measurements, such as their variance, on identification, and the limitations of this approach, are explained.
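
As an illustration of the variance-processing step described above, the sketch below slides a window over an output signal and flags regions whose local variance departs from the baseline; the window length, threshold factor, and synthetic signal are assumptions for the example, not the paper's settings or data.

```python
# Illustrative sketch: flagging candidate nonlinearity regions by rolling variance.
# Window length and threshold factor are assumptions, not the paper's settings.
import numpy as np

def high_variance_regions(y, window=50, factor=3.0):
    """Indices where local variance exceeds `factor` times the median local variance."""
    local_var = np.array([np.var(y[i:i + window]) for i in range(len(y) - window)])
    threshold = factor * np.median(local_var)
    return np.flatnonzero(local_var > threshold)

t = np.linspace(0, 10, 2000)
y = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
y[1200:1400] += 0.5 * np.sign(np.sin(20 * t[1200:1400]))   # locally distorted segment
print(high_variance_regions(y)[:5])                         # indices near the distorted region
```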

Keywords: Gaussian process, Nonlinearity distribution, Particle filter.

11879 Application of Geo-Informatic Technology in Studying of Land Tenure and Land Use for Cultivation of Cash Crops by Local Communities in the Local Administration Organizations of Phailuang and Maepoon in Lublae District, Uttaradit Province

Authors: Kunchit Pirapake

Abstract:

This study applies Geo-Informatic technology to land tenure and land use in a cash crop area in order to support sustainable land management, access to the area, and sustainable food production for the people of the community. The research objectives are to 1) apply Geo-Informatic technology to land ownership and agricultural land use (cash crops) in the research area, 2) create a GIS database on land ownership and land use, and 3) create a database for an online Geo-Informatic system on land tenure and land use. The results reveal that, first, the study area lies on steep slopes, mountains, and valleys. The land is mainly in the forest zone covered by the Forest Act of 1941 and the National Reserved Forest Act of 1964. Residents gained the right to exploit the land passed down from their ancestors, a practice recognized by their communities. The land is suitable for cultivating a wide variety of economic crops, which provide the main income of the families, and at present the local residents keep expanding the area planted with cash crops. Second, the geographic information system database consists of the area boundaries, announcements from the Interior Ministry, interpreted satellite images, transportation routes, waterways, and plots of land with title deeds available at the provincial land office; most plots without a title deed are located in forest and national reserve areas. Data were collected from a field study, with land zones determined by GPS. Last, the online Geo-Informatic system can display land tenure and land use information for each economic crop, together with high-resolution satellite data that can be updated and checked online simultaneously.

Keywords: Geo-Informatic Technology, Land Tenure, Online Geo-Informatic System, Land Use of cash crops.

11878 An AI-Based Dynamical Resource Allocation Calculation Algorithm for Unmanned Aerial Vehicle

Authors: Zhou Luchen, Wu Yubing, Burra Venkata Durga Kumar

Abstract:

As the scale of networks becomes larger and more complex than before, the density of user devices is also increasing. Unmanned Aerial Vehicle (UAV) networks are able to collect and transform data in an efficient way by using software-defined networking (SDN) technology. This paper proposes a three-layer distributed and dynamic cluster architecture to manage UAVs, using an AI-based resource allocation calculation algorithm to address the network overloading problem. By separating the services of each UAV, the UAV hierarchical cluster system performs the main function of reducing the network load and transferring user requests, with three sub-tasks: data collection, communication channel organization, and data relaying. In each cluster, a head node and a vice head node UAV are selected considering the CPU, RAM, and ROM memory of the devices, as well as battery charge and capacity. The vice head node acts as a backup that stores all the data of the head node. The k-means clustering algorithm is used to detect high-load regions and form the layered UAV clusters. The whole process of detecting high-load areas, forming and selecting UAV clusters, and moving the selected UAV cluster to that area is proposed as the traffic offloading algorithm.
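
A minimal sketch of the clustering and head-node selection steps is shown below; the synthetic device data, the number of clusters, and the resource-scoring weights are assumptions for illustration rather than the paper's parameters.

```python
# Sketch: k-means to find high-load regions, then pick head / vice-head UAVs by a
# simple resource score. Weights and synthetic data are assumptions, not the paper's.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
positions = rng.uniform(0, 1000, size=(30, 2))               # user-device coordinates (m)
load = rng.uniform(0, 10, size=30)                           # traffic demand per device

kmeans = KMeans(n_clusters=4, n_init=10, random_state=1).fit(positions, sample_weight=load)
cluster_load = np.array([load[kmeans.labels_ == c].sum() for c in range(4)])
hot = int(np.argmax(cluster_load))                           # highest-load region

# Candidate UAVs: score combines CPU, RAM, ROM and battery (assumed weights).
uavs = rng.uniform(0, 1, size=(6, 4))                        # columns: cpu, ram, rom, battery
score = uavs @ np.array([0.3, 0.2, 0.1, 0.4])
head, vice = np.argsort(score)[-1], np.argsort(score)[-2]    # vice head mirrors the head's data
print("hot cluster:", hot, "head UAV:", int(head), "vice head UAV:", int(vice))
```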

Keywords: k-means, resource allocation, SDN, UAV network, unmanned aerial vehicles.

11877 On-Road Text Detection Platform for Driver Assistance Systems

Authors: Guezouli Larbi, Belkacem Soundes

Abstract:

Automating the text detection process can assist the driver in the driving task. It can help drivers obtain more information about their environment by facilitating the reading of road signs such as directional signs, events, stores, etc. In this paper, a system consisting of two stages is proposed. In the first stage, we use pseudo-Zernike moments to pinpoint areas of the image that may contain text. The architecture of this part is based on three main steps: region of interest (ROI) detection, text localization, and non-text region filtering. In the second stage, we present a convolutional neural network architecture (On-Road Text Detection Network, ORTDN) that serves as the classification phase. The results show that the proposed framework achieves ≈ 35 fps and an mAP of ≈ 90%, i.e., a low computational time with competitive accuracy.
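
A minimal sketch of the first stage's flow is given below: candidate regions are scored with moment-based features and non-text regions are filtered out before classification; the pseudo_zernike_features helper, the window parameters, and the threshold are placeholders, since the actual pseudo-Zernike computation and the ORTDN network are not reproduced here.

```python
# Sketch of the two-stage flow: candidate ROIs are scored with placeholder features and
# non-text regions are filtered out; the surviving ROIs would then go to the CNN classifier.
import numpy as np

def pseudo_zernike_features(patch):
    """Placeholder for the pseudo-Zernike moment vector of an image patch."""
    return np.array([patch.mean(), patch.std(), np.abs(np.diff(patch, axis=1)).mean()])

def detect_text_candidates(image, win=32, stride=16, threshold=0.15):
    """Slide a window over the image and keep ROIs whose feature score suggests text."""
    candidates = []
    for y in range(0, image.shape[0] - win, stride):
        for x in range(0, image.shape[1] - win, stride):
            f = pseudo_zernike_features(image[y:y + win, x:x + win])
            if f[2] > threshold:             # crude "high local contrast" filter
                candidates.append((y, x))
    return candidates

img = np.zeros((128, 256))
img[40:72, 64:160] = np.random.default_rng(0).random((32, 96))  # textured (text-like) band
print(detect_text_candidates(img))           # ROIs cluster around the textured band
```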

Keywords: Text detection, CNN, PZM, deep learning.

11876 Biogas Production from Waste using Biofilm Reactor: Factor Analysis in Two Stages System

Authors: N. Zainol, J. Salihon, R. Abdul-Rahman

Abstract:

Factor analysis was applied to two-stage biogas production from banana stem waste, allowing a screening of the experimental variables: second-stage temperature (T), organic loading rate (OLR), and hydraulic retention time (HRT). Biogas production was found to be strongly influenced by all of these experimental variables. Results from the factorial analysis show that all three variables (HRT, OLR, and T) have a significant effect on biogas production, and increases in HRT and OLR increased the biogas yield. The performance was tested under various T (35 °C-60 °C), OLR (0.3-1.9 g TS/L·d), and HRT (3-15 d). The conditions for temperature, OLR, and HRT in this study were based on the best ranges reported in the literature.
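
A minimal sketch of such a factorial screening is shown below, fitting a main-effects ordinary least-squares model over the stated ranges of T, OLR, and HRT; the response values are placeholders for illustration, not the measured biogas yields.

```python
# Illustrative 2-level factorial screening of T, OLR and HRT (placeholder responses,
# not the paper's measurements).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "T":      [35, 60, 35, 60, 35, 60, 35, 60],             # second-stage temperature, °C
    "OLR":    [0.3, 0.3, 1.9, 1.9, 0.3, 0.3, 1.9, 1.9],     # g TS/L·d
    "HRT":    [3, 3, 3, 3, 15, 15, 15, 15],                 # d
    "biogas": [0.8, 1.0, 1.1, 1.4, 1.2, 1.5, 1.6, 2.0],     # placeholder yield
})
fit = smf.ols("biogas ~ T + OLR + HRT", data=df).fit()      # main-effects model
print(fit.params)                                           # sign and size of each effect
```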

Keywords: Biogas, factor analysis, banana stem waste

11875 An Adequate Choice of Initial Sample Size for Selection Approach

Authors: Mohammad H. Almomani, Rosmanjawati Abdul Rahman

Abstract:

In this paper, we consider the effect of the initial sample size on the performance of a sequential approach that is used to select a good enough simulated system when the number of alternatives is very large. We implement the sequential approach on an M/M/1 queuing system under several parameter settings, with different choices of the initial sample size, to explore its impact on the performance of the approach. The results show that the choice of the initial sample size does affect the performance of our selection approach.
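
The sketch below illustrates, under assumed parameters, how the initial (first-stage) sample size n0 enters a two-stage selection procedure on an M/M/1 queue; the Rinott-style constant and the simulation settings are placeholders, not the paper's experimental design.

```python
# Minimal sketch of how the first-stage sample size n0 affects a two-stage
# ranking-and-selection procedure on an M/M/1 queue. The constant h is a placeholder.
import math
import random

def mm1_waiting_time(lam, mu, n_customers, rng):
    """Average waiting time in queue via the Lindley recursion."""
    w, total = 0.0, 0.0
    for _ in range(n_customers):
        a = rng.expovariate(lam)   # inter-arrival time
        s = rng.expovariate(mu)    # service time
        w = max(0.0, w + s - a)    # waiting time of the next customer
        total += w
    return total / n_customers

def first_stage(lam, mu, n0, h=2.0, delta=0.1, seed=1):
    rng = random.Random(seed)
    x = [mm1_waiting_time(lam, mu, 1000, rng) for _ in range(n0)]
    mean = sum(x) / n0
    var = sum((v - mean) ** 2 for v in x) / (n0 - 1)
    # Rinott-style total sample size: a larger first-stage variance demands more replications.
    n_total = max(n0, math.ceil(h * h * var / (delta * delta)))
    return round(mean, 3), round(var, 4), n_total

for n0 in (5, 10, 20, 40):
    print(n0, first_stage(lam=0.8, mu=1.0, n0=n0))
```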

Keywords: Ranking and Selection, Ordinal Optimization, Optimal Computing Budget Allocation, Subset Selection, Indifference-Zone, Initial Sample Size.

11874 Fuzzy Based Environmental System Approach for Impact Assessment - Case Studies

Authors: Marius Pislaru, Alexandru F. Trandabat

Abstract:

Environmental studies have expanded dramatically all over the world in the past few years. Nowadays, businesses interact with society and the environment in ways that leave their mark on both. Efforts to improve human living standards through the control of nature and the development of new products have also resulted in contamination of the environment. Consequently, companies play an important role in the environmental sustainability of a region or country, and a company's sustainable development is strictly dependent on the environment. This article presents a fuzzy model to evaluate a company's environmental impact and illustrates it with an example from the automotive industry in order to demonstrate the usefulness of such a model.
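
A minimal sketch of a fuzzy impact evaluation is given below; the indicators, membership functions, and weights are assumptions chosen for illustration, not the rule base of the paper's model.

```python
# Illustrative sketch of a fuzzy environmental-impact score. Membership functions,
# indicator names and weights are assumptions, not the paper's model.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def impact_score(emissions, waste, energy_use):
    # Degree to which each normalized indicator (0..1 scale) is "high".
    high = {
        "emissions":  tri(emissions,  0.4, 1.0, 1.6),
        "waste":      tri(waste,      0.4, 1.0, 1.6),
        "energy_use": tri(energy_use, 0.4, 1.0, 1.6),
    }
    weights = {"emissions": 0.5, "waste": 0.3, "energy_use": 0.2}   # assumed weights
    # Weighted aggregation of the "high impact" memberships (simple Sugeno-style output).
    return sum(weights[k] * high[k] for k in high)

print(impact_score(emissions=0.9, waste=0.5, energy_use=0.7))
```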

Keywords: fuzzy approach, environmental impact assessment, sustainability

11873 An Improved Limited Tolerance Rough Set Model

Authors: Chen Wu, Komal Narejo, Dandan Li

Abstract:

Some extended rough set models for incomplete information systems cannot distinguish two objects that have few known attributes and many unknown attributes, and some cannot make a flexible and accurate discrimination. To solve this problem, this paper proposes an improved limited tolerance rough set model that uses two thresholds to control when two objects are related under the limited tolerance relation and to classify objects. Our practical case study shows that the model produces fine-grained and reasonable decision results.
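
The sketch below shows one way such a two-threshold limited tolerance relation could look for an incomplete information system; since the abstract does not give the exact formula, the thresholds and the relation used here are assumptions for illustration.

```python
# Illustrative sketch of a two-threshold limited tolerance relation over an incomplete
# information system (None marks an unknown value). The exact rule used in the paper
# is not given in the abstract, so this is only an assumed variant.
def limited_tolerance(x, y, alpha=0.5, beta=0.5):
    """x, y: attribute-value tuples; None means the value is unknown."""
    both_known = [(a, b) for a, b in zip(x, y) if a is not None and b is not None]
    if not both_known:                       # no jointly known attribute at all
        return False
    agree = sum(1 for a, b in both_known if a == b)
    # Threshold 1 (alpha): agreement ratio over jointly known attributes.
    # Threshold 2 (beta): coverage ratio of jointly known attributes over all attributes.
    return (agree / len(both_known) >= alpha) and (len(both_known) / len(x) >= beta)

objects = {
    "x1": (1, None, 2, None),
    "x2": (1, 0, 2, None),
    "x3": (None, None, None, 3),
}
print(limited_tolerance(objects["x1"], objects["x2"]))  # True under these thresholds
print(limited_tolerance(objects["x1"], objects["x3"]))  # False: nothing jointly known
```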

Keywords: Decision rule, incomplete information system, limited tolerance relation, rough set model.

11872 Management of Cultural Heritage: Bologna Gates

Authors: A. Ippolito, C. Bartolomei

Abstract:

There is a growing demand today for realistic 3D models that enable the understanding and popularization of historical and artistic heritage. The evaluation and preservation of cultural heritage are inextricably connected with innovative processes for gaining, managing, and using knowledge. The development and refinement of techniques for acquiring and processing photorealistic 3D models have made them pivotal elements for popularizing information about objects at the scale of architectural structures.

Keywords: Cultural heritage, databases, non-contact survey, 2D-3D models.

11871 The Application of FSI Techniques in Modeling of Realist Pulmonary Systems

Authors: Abdurrahim Bolukbasi, Hassan Athari, Dogan Ciloglu

Abstract:

Modeling the lung respiratory system, with its complex anatomy and biophysics, presents several challenges, including tissue-driven flow patterns and wall motion. Moreover, because the lungs stretch and recoil with each breath, their walls and structures are not static. The direct relationship between airflow and tissue motion in the lung structures naturally calls for a fluid-structure interaction (FSI) simulation technique; developing a coupled FSI computational model is therefore an important step toward the realistic simulation of pulmonary breathing mechanics. A simple but physiologically relevant three-dimensional deep lung geometry is designed, and an FSI coupling technique is used to simulate the deformation of the lung parenchyma tissue that produces the airflow fields. The behavior of the respiratory tissue system as a complex phenomenon is investigated with respect to respiratory patterns, fluid dynamics, tissue viscoelasticity, and the tidal breathing period.

Keywords: Lung deformation and mechanics, tissue mechanics, viscoelasticity, fluid-structure interactions, ANSYS.

11870 A Virtual Simulation Environment for a Design and Verification of a GPGPU

Authors: Kwang Y. Lee, Tae R. Park, Jae C. Kwak, Yong S. Koo

Abstract:

When a small hardware IP is designed, an appropriate verification environment can be developed by observing the simulated signal waveforms or by using serial test vectors with fixed expected outputs. For the design and verification of a massively parallel processor with multiple IPs, it is difficult to build a verification system with an existing common verification environment and to verify each partial IP. A TestDrive verification environment can provide an easy and reliable verification system that produces highly intuitive results by combining ModelSim and SystemVerilog's DPI. It offers many advantages; for example, a high-level design of a GPGPU processor can be migrated to an FPGA board immediately.

Keywords: Virtual Simulation, Verification, IP Design, GPGPU

11869 Efficient Aggregate Signature Algorithm and Its Application in MANET

Authors: Daxing Wang, Jikai Teng

Abstract:

An aggregate signature scheme can aggregate n signatures on n distinct messages from n distinct signers into a single signature, so n verification equations can be reduced to one. Aggregate signatures are therefore well suited to Mobile Ad hoc Networks (MANETs). In this paper, we propose an efficient ID-based aggregate signature scheme with a constant number of pairing computations. Compared with existing ID-based aggregate signature schemes, this scheme greatly improves the efficiency of signature communication and verification. In addition, we apply our ID-based aggregate signature to an authenticated routing protocol to present a secure routing scheme. Our scheme not only provides sound authentication and a secure routing protocol in ad hoc networks, but also suits the nature of MANETs.
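
The paper's ID-based pairing construction is not reproduced here; as a stand-in, the sketch below uses the BLS aggregate signatures provided by the py_ecc library to illustrate how n signatures on n distinct messages collapse into a single verification.

```python
# Stand-in illustration of signature aggregation using py_ecc's BLS (not the paper's
# ID-based scheme): three routing messages, three signers, one aggregate verification.
from py_ecc.bls import G2Basic as bls

messages = [b"route-request-1", b"route-request-2", b"route-request-3"]
secret_keys = [bls.KeyGen(bytes([i]) * 32) for i in range(1, 4)]
public_keys = [bls.SkToPk(sk) for sk in secret_keys]

signatures = [bls.Sign(sk, msg) for sk, msg in zip(secret_keys, messages)]
aggregate = bls.Aggregate(signatures)          # one signature covering all three messages

# One pairing-based check replaces three individual verifications.
assert bls.AggregateVerify(public_keys, messages, aggregate)
print("aggregate signature verified")
```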

Keywords: Identity-based cryptography, Aggregate signature, Bilinear pairings, Authenticated routing scheme.

11868 A Dynamic Composition of an Adaptive Course

Authors: S. Chiali, Z.Eberrichi, M.Malki

Abstract:

The number of frameworks designed for e-learning is constantly increasing. Unfortunately, the creators of learning materials and the educational institutions engaged in e-learning adopt a "proprietary" approach, in which the developed products (courses, activities, exercises, etc.) can be exploited only in the framework in which they were conceived; using them in other learning environments requires a costly adaptation in terms of time and effort. Each framework proposes courses whose organization, content, modes of interaction, and presentation are the same for all learners; unfortunately, learners are heterogeneous and are not interested in the same information, but only in services or documents adapted to their needs. The current trend for e-learning frameworks is the interoperability of learning materials. Several standards exist (DCMI (Dublin Core Metadata Initiative) [2], LOM (Learning Object Metadata) [1], SCORM (Shareable Content Object Reference Model) [6][7][8], ARIADNE (Alliance of Remote Instructional Authoring and Distribution Networks for Europe) [9], CANCORE (Canadian Core Learning Resource Metadata Application Profiles) [3]); they all converge on the idea of learning objects and are also concerned with adapting the learning materials to the learner's profile. This article proposes an approach for composing courses adapted to the various profiles (knowledge, preferences, objectives) of learners, based on two ontologies (the domain to teach and an educational ontology) and on learning objects.

Keywords: Adaptive educational hypermedia systems (AEHS), E-learning, Learner's model, Learning objects, Metadata, Ontology.

11867 Analysis and Research of Two-Level Scheduling Profile for Open Real-Time System

Authors: Yongxian Jin, Jingzhou Huang

Abstract:

In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system scheduling mechanism with new requirements and challenges. A two-level scheduling scheme for open real-time systems is introduced, and we point out that when hard and soft real-time applications are scheduled indiscriminately as the same type of real-time application, Quality of Service (QoS) cannot be guaranteed. This scheme has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is, it neglects the characteristic differences between them, so it does not suit a more complex real-time environment. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to spend large amounts of resources ensuring that no soft real-time application misses its deadline, and doing so may waste resources. To solve this problem, a novel two-level real-time scheduling mechanism (including a scheduling profile and a scheduling algorithm) that adds a process for handling soft real-time applications is proposed. Finally, the scheduling mechanism is verified both theoretically and experimentally. The results indicate that our scheduling mechanism achieves the following objectives: (1) it reflects the difference in priority when scheduling hard and soft real-time applications; (2) it ensures the schedulability of hard real-time applications, i.e., their deadline-miss rate is 0; (3) the overall deadline-miss rate of soft real-time applications is less than 1; and (4) although the deadline of a non-real-time application is not set, the scheduling algorithm used by server S0 avoids the starvation of jobs and increases QoS. As a result, our scheduling mechanism is more compatible with different types of applications and can be applied more widely.

Keywords: Hard real-time, two-level scheduling profile, open real-time system, non-distinctive schedule, soft real-time

11866 Comparative Analysis of the Third Generation of Research Data for Evaluation of Solar Energy Potential

Authors: Claudineia Brazil, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Rafael Haag

Abstract:

Renewable energy sources are dependent on climatic variability, so adequate energy planning requires observations of the meteorological variables, preferably as long-period series. Despite the scientific and technological advances that meteorological measurement systems have undergone in recent decades, there is still a considerable lack of meteorological observations forming long-period series. Reanalysis is a data assimilation system built on general atmospheric circulation models that combines data collected at surface stations, ocean buoys, satellites, and radiosondes, allowing the production of long-period data for a wide range of variables. The third generation of reanalysis data emerged in 2010; among them is the Climate Forecast System Reanalysis (CFSR) developed by the National Centers for Environmental Prediction (NCEP), whose data have a spatial resolution of 0.5° x 0.5°. To overcome these difficulties, this study aims to evaluate the performance of solar radiation estimates from alternative databases, such as reanalysis data and meteorological satellite data, which can satisfactorily compensate for the absence of solar radiation observations at the global and/or regional level. The analysis of the solar radiation data indicated that the CFSR reanalysis data performed well with respect to the observed data, with a coefficient of determination around 0.90. Therefore, it is concluded that these data have the potential to be used as an alternative source for locations without measuring stations or long series of solar radiation observations, which is important for the evaluation of solar energy potential.
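
A minimal sketch of the comparison step, computing the coefficient of determination between reanalysis and observed series, is shown below; the arrays are placeholders, not the CFSR or station data used in the study.

```python
# Minimal sketch: comparing reanalysis solar radiation with station observations via
# the coefficient of determination (R^2). The arrays below are placeholders.
import numpy as np

observed   = np.array([18.2, 21.5, 24.1, 19.8, 15.3, 12.9])   # MJ/m^2/day (placeholder)
reanalysis = np.array([17.5, 22.0, 23.4, 20.6, 14.8, 13.5])   # CFSR grid point (placeholder)

r = np.corrcoef(observed, reanalysis)[0, 1]   # Pearson correlation
print("R^2 =", r ** 2)
```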

Keywords: Climate, reanalysis, renewable energy, solar radiation.

11865 Predicting the Three Major Dimensions of the Learner's Emotions from Brainwaves

Authors: Alicia Heraz, Claude Frasson

Abstract:

This paper investigates how machine learning techniques can significantly predict the three major dimensions of a learner's emotions (pleasure, arousal, and dominance) from brainwaves. The study adopted an experiment in which participants were exposed to a set of pictures from the International Affective Picture System (IAPS) while their electrical brain activity was recorded with an electroencephalogram (EEG). The pictures had already been rated in a previous study via the Self-Assessment Manikin (SAM) affective rating system to assess the three dimensions of pleasure, arousal, and dominance. For each picture, we took the mean of these values over all subjects used in that previous study and associated them with the recorded brainwaves of the participants in our study. Correlation and regression analyses confirmed the hypothesis that brainwave measures can significantly predict the emotional dimensions. This can be very useful in the case of impassive, taciturn, or disabled learners. Standard classification techniques were used to assess the reliability of the automatic detection of the learners' three major dimensions from the brainwaves. We discuss the results and the pertinence of such a method for assessing learners' emotions and integrating it into a brainwave-sensing Intelligent Tutoring System.
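
A minimal sketch of the regression step is shown below, predicting the three dimensions from EEG band-power features with a multi-output linear model; the feature names and synthetic data are assumptions, not the IAPS/SAM dataset used in the study.

```python
# Minimal sketch: regressing pleasure, arousal and dominance on EEG band-power features.
# The synthetic data below only illustrates the pipeline shape.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                        # e.g. theta, alpha, beta, gamma powers
true_w = rng.normal(size=(4, 3))
Y = X @ true_w + 0.1 * rng.normal(size=(200, 3))     # columns: pleasure, arousal, dominance

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)
model = LinearRegression().fit(X_tr, Y_tr)           # one linear model, three outputs
print("held-out R^2:", model.score(X_te, Y_te))
```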

Keywords: Algorithms, brainwaves, emotional dimensions, performance.

11864 Optical and Structural Properties of a ZnS Buffer Layer Fabricated with Deposition Temperature of RF Magnetron Sputtering System

Authors: Won Song, Bo-Ra Koo, Seok Eui Choi, Yong-Taeg Oh, Dong-Chan Shin

Abstract:

The optical properties of sputter-deposited ZnS thin films were investigated as potential replacements for CBD (chemical bath deposition) CdS buffer layers in CIGS solar cells. ZnS thin films were fabricated on glass substrates at room temperature (RT), 150 °C, 200 °C, and 250 °C with 50 sccm of Ar gas using an RF magnetron sputtering system. The crystal structure of the thin films is found to be the zinc blende (cubic) structure. The in-plane lattice parameter of ZnS is slightly larger than that of CdS and is thus better matched with that of CIGS. Within the 400-800 nm wavelength region, the average transmittance was larger than 75%. When the deposition temperature of the thin film was increased, the blue-shift phenomenon was enhanced, and the band gap energy of the ZnS thin film tended to increase. ZnS thin film is a promising material for the CIGS buffer layer in terms of ease of processing, low cost, environmental friendliness, high transparency, and electrical properties.

Keywords: ZnS thin film, Buffer layer, CIGS, Solar cell.

11863 VFAST TCP: A delay-based enhanced version of FAST TCP

Authors: Salem Belhaj, Moncef Tagina

Abstract:

This paper describes a delay-based end-to-end (e2e) congestion control algorithm, called Very FAST TCP (VFAST), which is an enhanced version of FAST TCP. The main idea behind this enhancement is to smoothly estimate the round-trip time (RTT) using a nonlinear filter, which eliminates throughput and queue oscillation when the RTT fluctuates. In this context, an evaluation of the suggested scheme through simulation is introduced, comparing our VFAST prototype with FAST in terms of throughput, queue behavior, fairness, stability, RTT, and adaptivity to network changes. The simulation results indicate that the suggested protocol offers better performance than FAST TCP in terms of RTT estimation and throughput.
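
The abstract does not specify the exact nonlinear filter, so the sketch below is only an assumed variant: a median pre-filter combined with a clamped EWMA, intended to suppress the oscillation that raw RTT spikes would otherwise cause.

```python
# Illustrative sketch of smoothing RTT samples with a nonlinear filter before feeding
# them to a delay-based window update (assumed filter, not the VFAST specification).
from collections import deque

class SmoothedRTT:
    def __init__(self, window=5, gain=0.125, max_step=0.010):
        self.samples = deque(maxlen=window)  # recent raw RTT samples (seconds)
        self.gain = gain                     # EWMA gain
        self.max_step = max_step             # clamp per-update change (the nonlinearity)
        self.estimate = None

    def update(self, rtt_sample):
        self.samples.append(rtt_sample)
        median = sorted(self.samples)[len(self.samples) // 2]      # reject spikes
        if self.estimate is None:
            self.estimate = median
        else:
            step = self.gain * (median - self.estimate)
            step = max(-self.max_step, min(self.max_step, step))   # clamp the correction
            self.estimate += step
        return self.estimate

srtt = SmoothedRTT()
for s in (0.050, 0.052, 0.180, 0.051, 0.049):   # one spurious RTT spike
    print(round(srtt.update(s), 4))
```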

Keywords: FAST TCP, RTT, delay estimation, delay-based congestion control, high-speed TCP, large bandwidth-delay product.

11862 Merging and Comparing Ontologies Generically

Authors: Xiuzhan Guo, Arthur Berrill, Ajinkya Kulkarni, Kostya Belezko, Min Luo

Abstract:

Ontology operations, e.g., aligning and merging, have been studied and implemented extensively in different settings, such as categorical operations, relation algebras, and typed graph grammars, with different concerns. However, the aligning and merging operations in these settings share some generic properties, e.g., idempotence, commutativity, associativity, and representativity, which are defined on an ontology merging system, given by a nonempty set of the ontologies concerned, a binary relation on the set of ontologies modeling ontology aligning, and a partial binary operation on the set of ontologies modeling ontology merging. Given an ontology repository, a finite subset of the set of ontologies, its merging closure is the smallest subset of the set of ontologies that contains the repository and is closed with respect to merging. If the idempotence, commutativity, associativity, and representativity properties are satisfied, then both the set of ontologies and the merging closure of the ontology repository are partially ordered naturally by merging, and the merging closure of the ontology repository is finite and can be computed, compared, and sorted efficiently, including sorting, selecting, and querying specific elements, e.g., maximal and minimal ontologies. An ontology V-alignment pair is a pair of ontology homomorphisms with a common domain. We also show that the ontology merging system given by ontology V-alignment pairs and pushouts satisfies the idempotence, commutativity, associativity, and representativity properties, so that the merging system is partially ordered and the merging closure of a given repository with respect to pushouts can be computed efficiently.
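
A minimal sketch of the merging-closure construction is given below, with ontologies modeled simply as sets of axiom identifiers and merging as union of aligned pairs; this illustrates the closure computation only, not the pushout-based merge of the paper.

```python
# Minimal sketch: merging closure of a repository under an aligning relation and a
# partial merge. Ontologies are frozensets of axiom identifiers for illustration only.
def merging_closure(repository, aligned, merge):
    closure = set(repository)
    changed = True
    while changed:                       # keep merging until no new ontology appears
        changed = False
        for a in list(closure):
            for b in list(closure):
                if aligned(a, b):
                    m = merge(a, b)
                    if m is not None and m not in closure:
                        closure.add(m)
                        changed = True
    return closure

# Toy instance: two ontologies are aligned when they share an axiom; merge is union.
aligned = lambda a, b: bool(a & b)
merge = lambda a, b: a | b

repo = {frozenset({"A", "B"}), frozenset({"B", "C"}), frozenset({"C", "D"})}
for onto in sorted(merging_closure(repo, aligned, merge), key=len):
    print(sorted(onto))
```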

Keywords: Ontology aligning, ontology merging, merging system, poset, merging closure, ontology V-alignment pair, ontology homomorphism, ontology V-alignment pair homomorphism, pushout.

11861 A Novel Digital Watermarking Technique Based on ISB (Intermediate Significant Bit)

Authors: Akram M. Zeki, Azizah A. Manaf

Abstract:

The Least Significant Bit (LSB) technique is the earliest technique developed in watermarking, and it is also the simplest, most direct, and most common one. It essentially involves embedding the watermark by replacing the least significant bit of the image data with a bit of the watermark data. The disadvantage of LSB is that it is not robust against attacks. In this study, the intermediate significant bit (ISB) is used in order to improve the robustness of the watermarking system. The aim of this model is to replace the watermarked image pixels with new pixels that can protect the watermark data against attacks while keeping the new pixels very close to the original pixels in order to preserve the quality of the watermarked image. The technique is based on testing the value of the watermark pixel according to the range of each bit-plane.
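
A minimal sketch of ISB embedding is shown below: a watermark bit is written into bit-plane k of a pixel by moving the pixel to the nearest value carrying that bit, which bounds the distortion by 2^k; the choice of k and the nearest-value rule are assumptions illustrating the idea, not the paper's exact range test.

```python
# Sketch: embed one watermark bit into an intermediate bit-plane k of an 8-bit pixel,
# nudging the pixel to the nearest value whose k-th bit matches the watermark bit.
def embed_isb(pixel, bit, k=3):
    """Embed `bit` (0/1) into bit-plane k of an 8-bit pixel with minimal distortion."""
    if (pixel >> k) & 1 == bit:
        return pixel                          # pixel already carries the watermark bit
    # Search outward for the nearest 8-bit value whose k-th bit equals `bit`
    # (distortion is bounded by 2**k, which is why an intermediate plane is chosen).
    for d in range(1, 256):
        for candidate in (pixel - d, pixel + d):
            if 0 <= candidate <= 255 and (candidate >> k) & 1 == bit:
                return candidate

p = 150                                       # binary 10010110: bit-plane 3 holds 0
print(embed_isb(p, 1), embed_isb(p, 0))       # -> 152 150
```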

Keywords: Watermarking, LSB, ISB, Robustness.

11860 Security of Internet of Things: Challenges, Requirements and Future Directions

Authors: Amjad F. Alharbi, Bashayer A. Alotaibi, Fahd S. Alotaibi

Abstract:

The emergence of Internet of Things (IoT) technology provides capabilities for a huge number of smart devices, services, and people to communicate with each other and exchange data and information over existing networks. As IoT progresses, it provides many opportunities for new ways of communication, but it also introduces many security and privacy threats and challenges that need to be considered for the future of IoT development. In this survey paper, IoT security issues, threats, and current challenges are summarized. The security architecture for IoT is presented in terms of four main layers. Based on these layers, the IoT security requirements are presented to ensure security of the whole system. Furthermore, some research initiatives related to IoT security are discussed, and future directions for IoT security are highlighted.

Keywords: Internet of Things, IoT, IoT security challenges, IoT security requirements, IoT security architecture.

11859 Dynamics of Mini Hydraulic Backhoe Excavator: A Lagrange-Euler (L-E) Approach

Authors: Bhaveshkumar P. Patel, J. M. Prajapati

Abstract:

Excavators are high-power machines used in the mining, agricultural, and construction industries, whose principal functions are digging (material removal), ground leveling, and material transport. During the digging task, certain unknown forces are exerted by the bucket on the soil, and the digging operation is repetitive in nature. The digging task can be automated by an automatically controlled excavator system that not only controls the forces but also follows the planned digging trajectories. To develop such a controller for automated excavation, a dynamic model is required that describes the behavior of the system during the digging operation and the motion of the excavator over time. The presented work describes a dynamic model needed for controller design, derived by applying the Lagrange-Euler approach. The developed dynamic model is intended for further development of an automated excavation control system for light-duty construction work and can be applied to heavy-duty or all types of backhoe excavators.
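
For reference, the Lagrange-Euler formulation that underlies such a model takes the standard form below (a generic statement for an n-link mechanism, not the paper's specific equations of motion for the boom, arm, and bucket):

```latex
% Generic Lagrange-Euler equations; q_i are the joint variables of the linkage,
% L = T - V is the Lagrangian, and \tau_i the generalized (actuator) forces.
\frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}_i}\right)
  - \frac{\partial L}{\partial q_i} = \tau_i , \qquad i = 1,\dots,n
% Collecting terms gives the usual manipulator form
% M(q)\ddot{q} + C(q,\dot{q})\dot{q} + G(q) = \tau .
```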

Keywords: Backhoe excavator, controller, digging, excavation, trajectory.

11858 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation

Authors: Aicha Majda, Abdelhamid El Hassani

Abstract:

Lung CT image segmentation is a prerequisite for lung CT image analysis. Most conventional methods need post-processing to deal with abnormal lung CT scans, such as those containing lung nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm consists of directly comparing the pixel values of the two neighboring regions, which is not accurate because this kind of metric is extremely sensitive to minor perturbations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric. The boundary penalty term in the graph cut algorithm is defined using a patch-based similarity measurement instead of the simple intensity measurement of the standard method. The weights between each pixel and its neighboring pixels are based on the obtained new term, and the graph is then created using these weights between its nodes. Finally, the segmentation is completed with the min-cut/max-flow algorithm. Experimental results show that the proposed method is very accurate and efficient and can directly provide explicit lung regions without any post-processing operations, in contrast to the standard method.
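
A minimal sketch of the patch-based boundary term is given below: the weight of the edge between two neighboring pixels is computed from the distance between the patches centered on them rather than from the two intensities alone; the patch radius and sigma are assumptions for illustration.

```python
# Sketch of a patch-based boundary weight for graph cuts: similar patches give a high
# weight (cutting is discouraged), dissimilar patches a low weight.
import numpy as np

def patch(img, y, x, r=2):
    """(2r+1)x(2r+1) patch around (y, x), with edge padding."""
    padded = np.pad(img, r, mode="edge")
    return padded[y:y + 2 * r + 1, x:x + 2 * r + 1]

def boundary_weight(img, p, q, sigma=10.0, r=2):
    """Penalty for cutting the edge between neighboring pixels p and q."""
    d2 = np.sum((patch(img, *p, r) - patch(img, *q, r)) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

img = np.random.default_rng(0).integers(0, 256, size=(64, 64)).astype(float)
print(boundary_weight(img, (10, 10), (10, 11)))   # weight of one neighboring-pixel edge
```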

Keywords: Graph cuts, lung CT scan, lung parenchyma segmentation, patch based similarity metric.

11857 Continuous Flow Experimental Set-Up for Fouling Deposit Study

Authors: A. L. Ho, N. Ab. Aziz, F. S. Taip, M. N. Ibrahim

Abstract:

The study of the fouling deposition of pink guava juice (PGJ) is relatively new compared to that of milk fouling deposits. In this work, a new experimental set-up was developed to imitate fouling formation in a heat exchanger, namely a continuous-flow experimental heat exchanger set-up. The new set-up was operated at the industrial pasteurization temperature of PGJ, 93 °C. The flow rate and pasteurization period were based on the experimental capacity: 0.5 and 1 liter/min for the flow rate, with the pasteurization period set to 1 hour. The fouling deposit was characterized using various methods. The microstructure of the deposits was examined using ESEM. Proximate analyses were performed to determine the moisture, fat, protein, fiber, ash, and carbohydrate content. The hardness and stickiness of the fouling deposit were studied using a texture analyzer, and the presence of seedstones in the pink guava juice was analyzed using a particle analyzer. The findings show that the seedstones in pink guava juice range from 168 to 200 μm and that carbohydrate is the major component (47.7% of the fouling deposit consists of carbohydrate). Comparison of the hardness and stickiness of the deposits at the two flow rates showed that the fouling deposits were harder and denser at the higher flow rate. The findings from this work provide basic knowledge for further study on the fouling and cleaning of PGJ.

Keywords: Pink guava juice, fouling deposit, heat exchanger.

11856 Attacks and Counter Measures in BST Overlay Structure of Peer-To-Peer System

Authors: Guruprasad Khataniar, Hitesh Tahbildar, Prakriti Prava Das

Abstract:

There are various overlay structures that provide efficient and scalable solutions for point and range queries in a peer-to-peer network. An overlay structure based on the m-Binary Search Tree (BST) is one such popular technique. It deals with the division of the key space into different key intervals and the assignment of the key intervals to a BST. The popularity of the BST makes this overlay structure vulnerable to different kinds of attacks. Here we present four such possible attacks, namely the index poisoning attack, the eclipse attack, the pollution attack, and the SYN flooding attack, and show how the functionality of the BST is affected by them. We also provide different security techniques that can be applied against these attacks.

Keywords: BST, eclipse attack, index poisoning attack, pollution attack, syn flooding attack.

11855 Decision-Making Strategies on Smart Dairy Farms: A Review

Authors: L. Krpalkova, N. O' Mahony, A. Carvalho, S. Campbell, G. Corkery, E. Broderick, J. Walsh

Abstract:

Farm management and operations will change drastically due to access to real-time data, real-time forecasting, and tracking of physical items, in combination with Internet of Things (IoT) developments that further automate farm operations. Dairy farms have embraced technological innovations and procured vast amounts of permanent data streams during the past decade; however, the integration of this information to improve the whole-farm decision-making process does not yet exist. It is now imperative to develop a system that can collect, integrate, manage, and analyze on-farm and off-farm data in real time for practical and relevant environmental and economic actions. The developed systems, based on machine learning and artificial intelligence, need to be connected to provide useful output, a better understanding of the whole farming issue, and its environmental impact. Evolutionary Computing (EC) can be very effective in finding the optimal combination of sets of objects and, finally, in strategy determination. The system of the future should be able to manage the dairy farm as well as an experienced dairy farm manager with a team of the best agricultural advisors. All these changes should bring resilience and sustainability to dairy farming, as well as improving and maintaining animal welfare and the quality of dairy products. This review aims to provide an insight into the state of the art of big data applications and EC in relation to smart dairy farming and to identify the most important research and development challenges to be addressed in the future. Smart dairy farming influences every area of management, and its uptake has become a continuing trend.

Keywords: Big data, evolutionary computing, cloud, precision technologies

11854 A Proposed Hybrid Color Image Compression Based on Fractal Coding with Quadtree and Discrete Cosine Transform

Authors: Shimal Das, Dibyendu Ghoshal

Abstract:

Fractal-based digital image compression is a specific technique in the field of color image compression. The method is best suited to images with irregular shapes, such as snow blobs, clouds, flames, and tree leaves, and relies on the fact that parts of an image often resemble other parts of the same image. This technique has drawn much attention in recent years because of the very high compression ratio that can be achieved. Hybrid schemes incorporating fractal compression and speed-up techniques have achieved high compression ratios compared to pure fractal compression. Fractal image compression is a lossy compression method in which the self-similarity of an image is exploited. It provides a high compression ratio, reduced encoding time, and a fast decoding process. In this paper, fractal compression with a quadtree and the DCT is proposed to compress color images. The proposed hybrid scheme requires four phases. First, the image is segmented and the Discrete Cosine Transform is applied to each block of the segmented image. Second, the block values are scanned in a zigzag manner so that the zero coefficients are grouped together. Third, the resulting image is partitioned into fractals by the quadtree approach. Fourth, the image is compressed using the run-length encoding technique.
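
A minimal sketch of two of the phases listed above, zigzag scanning of a DCT block and run-length encoding of the resulting sequence, is shown below; the block contents are placeholders, not data from the paper.

```python
# Sketch: zigzag scan of an 8x8 coefficient block followed by run-length encoding.
import numpy as np

def zigzag(block):
    """Return the elements of a square block in zigzag (anti-diagonal) order."""
    n = block.shape[0]
    order = sorted(((i, j) for i in range(n) for j in range(n)),
                   key=lambda p: (p[0] + p[1], p[0] if (p[0] + p[1]) % 2 else -p[0]))
    return [int(block[i, j]) for i, j in order]

def run_length_encode(seq):
    """(value, run length) pairs for consecutive repeats."""
    out = []
    for v in seq:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return out

block = np.zeros((8, 8), dtype=int)
block[0, 0], block[0, 1], block[1, 0] = 52, -3, 2   # a few nonzero DCT coefficients (placeholder)
print(run_length_encode(zigzag(block)))             # long zero run collapses to one pair
```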

Keywords: Fractal coding, Discrete Cosine Transform, Iterated Function System (IFS), Affine Transformation, Run length encoding.

11853 A New Recognition Scheme for Machine-Printed Arabic Texts Based on Neural Networks

Authors: Z. Shaaban

Abstract:

This paper presents a new approach to the problem of recognizing machine-printed Arabic texts. Because of the difficulty of recognizing cursive Arabic words, the text has to be normalized and segmented before the recognition stage. The new scheme for recognizing Arabic characters relies on a classifier composed of multiple parallel neural networks. The classifier has two phases: the first phase categorizes the input character into one of eight groups, and the second phase classifies the character into one of the Arabic character classes within that group. The system achieved a high recognition rate.
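
A minimal sketch of the two-phase classifier is given below: one network assigns a character to one of eight groups, then a per-group network selects the class within that group; the synthetic features, group structure, and network sizes are assumptions, not the paper's design.

```python
# Sketch of the two-phase classification idea with placeholder data: group classifier
# first, then a class classifier specialized to the predicted group.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_groups, classes_per_group = 8, 4
X = rng.normal(size=(1600, 20))                        # character feature vectors (placeholder)
labels = rng.integers(0, n_groups * classes_per_group, size=1600)
groups = labels // classes_per_group                   # which of the 8 groups each class belongs to

stage1 = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300).fit(X, groups)
stage2 = {g: MLPClassifier(hidden_layer_sizes=(32,), max_iter=300).fit(X[groups == g],
                                                                       labels[groups == g])
          for g in range(n_groups)}

def recognize(x):
    g = int(stage1.predict(x.reshape(1, -1))[0])         # phase 1: pick the group
    return int(stage2[g].predict(x.reshape(1, -1))[0])   # phase 2: class within the group

print(recognize(X[0]))
```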

Keywords: Neural Networks, character recognition, feature extraction, multiple networks, Arabic text.

11852 Mining Network Data for Intrusion Detection through Naïve Bayesian with Clustering

Authors: Dewan Md. Farid, Nouria Harbi, Suman Ahmmed, Md. Zahidur Rahman, Chowdhury Mofizur Rahman

Abstract:

Network security attacks are violations of the information security policy that have received much attention from the computational intelligence community in recent decades. Data mining has become a very useful technique for detecting network intrusions by extracting useful knowledge from large volumes of network data or logs. The naïve Bayesian classifier is one of the most popular data mining algorithms for classification, providing an optimal way to predict the class of an unknown example. It has been shown, however, that a single set of probabilities derived from the data is not good enough to achieve a good classification rate. In this paper, we propose a new learning algorithm for mining network logs to detect network intrusions with a naïve Bayesian classifier: the algorithm first clusters the network logs into several groups based on the similarity of the logs and then calculates the prior and conditional probabilities for each group. To classify a new log, the algorithm checks which cluster the log belongs to and then uses that cluster's probability set to classify it. We tested the performance of the proposed algorithm on the KDD99 benchmark network intrusion detection dataset, and the experimental results show that it improves detection rates and reduces false positives for different types of network intrusions.
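
A minimal sketch of the cluster-then-classify idea is given below: logs are grouped with k-means, a separate Gaussian naïve Bayes model is fitted per cluster, and a new log is classified by the model of the cluster it falls into; synthetic features stand in for the KDD99 data.

```python
# Sketch: per-cluster naïve Bayes. Synthetic log features replace KDD99 here.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 5))                        # log feature vectors (placeholder)
y = (X[:, 0] + X[:, 1] > 0).astype(int)              # 1 = intrusion, 0 = normal (placeholder)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
models = {}
for c in range(3):
    mask = kmeans.labels_ == c
    models[c] = GaussianNB().fit(X[mask], y[mask])   # per-cluster prior/conditional probabilities

def classify(log):
    c = int(kmeans.predict(log.reshape(1, -1))[0])   # find the log's cluster
    return int(models[c].predict(log.reshape(1, -1))[0])

print(classify(rng.normal(size=5)))
```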

Keywords: Clustering, detection rate, false positive, naïve Bayesian classifier, network intrusion detection.

11851 Financial Regulations in the Process of Global Financial Crisis and Macroeconomics Impact of Basel III

Authors: M. Okan Tasar

Abstract:

Basel III (or the Third Basel Accord) is a global regulatory standard on bank capital adequacy, stress testing, and market liquidity risk, agreed upon by the members of the Basel Committee on Banking Supervision in 2010-2011 and scheduled to be introduced from 2013 until 2018. Basel III is a comprehensive set of reform measures. These measures aim to (1) improve the banking sector's ability to absorb shocks arising from financial and economic stress, whatever the source, (2) improve risk management and governance, and (3) strengthen banks' transparency and disclosures. Similarly, the reforms target (1) bank-level, or micro-prudential, regulation, which will help raise the resilience of individual banking institutions in periods of stress, and (2) macro-prudential regulation of system-wide risks that can build up across the banking sector, as well as the pro-cyclical amplification of these risks over time. These two approaches to supervision are complementary, as greater resilience at the individual bank level reduces the risk of system-wide shocks. Regarding the macroeconomic impact of Basel III, the OECD estimates that the medium-term impact of Basel III implementation on GDP growth is in the range of -0.05 to -0.15 percent per year. Economic output is mainly affected by an increase in bank lending spreads, as banks pass a rise in funding costs, due to higher capital requirements, on to their customers; consequently, the estimated effects on GDP growth assume no active response from monetary policy. The impact of Basel III on economic output could be offset by a reduction (or delayed increase) in monetary policy rates of about 30 to 80 basis points. The aim of this paper is to create a framework based on the recent regulations in order to prevent financial crises; the experience of overcoming the global financial crisis should thus help in dealing with financial crises that may occur in future periods. The first part of the paper examines the effects of the global crisis on the banking system and the concept of financial regulation. The second part analyzes financial regulations, and Basel III in particular. The last section explores the possible consequences of the macroeconomic impacts of Basel III.

Keywords: Banking Systems, Basel III, Financial regulation, Global Financial Crisis.
