Search results for: algorithm integration
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5932

4252 Development of an Efficient Algorithm for Cessna Citation X Speed Optimization in Cruise

Authors: Georges Ghazi, Marc-Henry Devillers, Ruxandra M. Botez

Abstract:

Aircraft flight trajectory optimization has been identified as a promising solution for reducing both airline costs and the aviation net carbon footprint. Nowadays, this role has been mainly attributed to the flight management system, an onboard multi-purpose computer responsible for providing the crew members with the optimized flight plan from one destination to the next. To accomplish this function, the flight management system uses a variety of look-up tables to compute the optimal speed and altitude for each flight regime instantly. Because cruise is the longest segment of a typical flight, the proposed algorithm focuses on minimizing fuel consumption for this flight phase. In this paper, a complete methodology to estimate the aircraft performance and subsequently compute the optimal cruise speed is presented. Results showed that the obtained performance database was accurate enough to predict the flight costs associated with the cruise phase.
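
The keywords point to a golden section search over cruise speed. As an illustration only, the sketch below applies a textbook golden section search to a hypothetical cruise-cost function; the cost model, cost index and Mach bounds are placeholders, not the authors' performance database.

```python
# Hedged sketch: golden section search over cruise speed.
# cruise_cost() is a hypothetical stand-in for the cost derived from the
# aircraft performance database (fuel flow plus time cost via the cost index).
import math

def cruise_cost(mach, cost_index=30.0):
    # Placeholder cost model: fuel burn rises with speed, time cost falls.
    fuel_flow = 1000.0 + 4000.0 * (mach - 0.60) ** 2   # kg/h (illustrative)
    ground_speed = 570.0 * mach / 0.80                  # kt (illustrative)
    time_cost = cost_index * 60.0 / ground_speed        # kg-equivalent per nm
    return fuel_flow / ground_speed + time_cost         # cost per nautical mile

def golden_section_minimize(f, lo, hi, tol=1e-4):
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2.0

best_mach = golden_section_minimize(cruise_cost, 0.60, 0.82)
print(f"cost-minimizing cruise Mach (toy model): {best_mach:.3f}")
```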

Keywords: Cessna Citation X, cruise speed optimization, flight cost, cost index, golden section search

Procedia PDF Downloads 275
4251 Application of Rapidly Exploring Random Tree Star-Smart and G2 Quintic Pythagorean Hodograph Curves to the UAV Path Planning Problem

Authors: Luiz G. Véras, Felipe L. Medeiros, Lamartine F. Guimarães

Abstract:

This work approaches the automatic planning of paths for Unmanned Aerial Vehicles (UAVs) through the application of the Rapidly Exploring Random Tree Star-Smart (RRT*-Smart) algorithm. RRT*-Smart samples positions of a navigation environment through a tree-type graph. The algorithm randomly expands a tree from an initial position (root node) until one of its branches reaches the final position of the path to be planned. The algorithm converges to the shortest path as the number of iterations tends to infinity. When a new node is inserted into the tree, each neighbor of the new node is connected to it if, and only if, the path from the root node to that neighbor through the new connection is shorter than the neighbor's current path. RRT*-Smart uses an intelligent sampling strategy to plan less extensive routes while spending a smaller number of iterations. This strategy is based on the creation of samples/nodes near the convex vertices of the obstacles in the navigation environment. The planned paths are smoothed through the application of quintic Pythagorean hodograph curves. The smoothing process converts a route into a dynamically viable one based on the kinematic constraints of the vehicle. This smoothing method models the hodograph components of a curve with polynomials that obey the Pythagorean Theorem. Its advantage is that the obtained structure allows the curve length to be computed exactly, without the need for numerical quadrature to evaluate integrals.
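
The choose-parent and rewire rule described above (reconnect a neighbor through the new node only when this shortens its path from the root) is the core of the RRT* step. A compact, simplified sketch is given below; the node structure, neighbor radius and random sampling are illustrative assumptions, and obstacle collision checks plus RRT*-Smart's biased sampling near obstacle vertices are omitted.

```python
# Hedged sketch of the RRT* insertion step (choose parent, then rewire).
import math, random

class Node:
    def __init__(self, x, y, parent=None, cost=0.0):
        self.x, self.y, self.parent, self.cost = x, y, parent, cost

def dist(a, b):
    return math.hypot(a.x - b.x, a.y - b.y)

def insert(tree, sample, radius=2.0):
    nearest = min(tree, key=lambda n: dist(n, sample))
    new = Node(sample.x, sample.y)
    # Choose parent: the neighbor giving the cheapest path from the root.
    neighbors = [n for n in tree if dist(n, new) <= radius]
    parent = min(neighbors or [nearest], key=lambda n: n.cost + dist(n, new))
    new.parent, new.cost = parent, parent.cost + dist(parent, new)
    tree.append(new)
    # Rewire: reconnect a neighbor through the new node only if that
    # shortens its current path from the root node.
    for n in neighbors:
        via_new = new.cost + dist(new, n)
        if via_new < n.cost:
            n.parent, n.cost = new, via_new
    return new

tree = [Node(0.0, 0.0)]
for _ in range(200):
    insert(tree, Node(random.uniform(0, 10), random.uniform(0, 10)))
```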

Keywords: path planning, path smoothing, Pythagorean hodograph curve, RRT*-Smart

Procedia PDF Downloads 158
4250 Model for Introducing Products to New Customers through Decision Tree Using Algorithm C4.5 (J-48)

Authors: Komol Phaisarn, Anuphan Suttimarn, Vitchanan Keawtong, Kittisak Thongyoun, Chaiyos Jamsawang

Abstract:

This article analyzes insurance information that captures customer decisions when purchasing a life insurance pay package. The data were analyzed in order to present new customers with the Life Insurance Perfect Pay package that meets their needs as closely as possible. The basic data of the insurance pay package were collected for data mining, thus reducing the scattering of information. The data were then classified in order to obtain a decision model, or decision tree, using Algorithm C4.5 (J-48). For the classification, the WEKA toolkit was used to build the model, and testing datasets were used to assess the accuracy of the decision tree. Validation showed that the model predicted correctly in 68.43% of cases, while 31.25% were errors. The same data set was then tested with other models, i.e., Naive Bayes and ZeroR. The results showed that the J-48 method predicted more accurately. The researchers therefore applied the decision tree in writing the program used to introduce the product to new customers and support their decision to purchase the insurance package that best meets their needs.
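
The workflow above (train a decision tree, compare it with Naive Bayes and ZeroR) can be reproduced in Python with scikit-learn as a rough stand-in for the WEKA pipeline. Note that scikit-learn's DecisionTreeClassifier implements CART rather than C4.5/J-48, and the file name and feature columns below are hypothetical.

```python
# Hedged sketch: decision-tree vs. baseline classifiers on insurance data.
# scikit-learn's tree is CART, used here only as a stand-in for WEKA's J-48 (C4.5).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.dummy import DummyClassifier          # analogous to WEKA's ZeroR
from sklearn.metrics import accuracy_score

# Hypothetical columns; the real insurance dataset is not public.
df = pd.read_csv("insurance_customers.csv")
X = df[["age", "income", "dependents", "occupation_code"]]
y = df["buys_perfect_pay_package"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("decision tree", DecisionTreeClassifier(random_state=0)),
                  ("naive bayes", GaussianNB()),
                  ("zero rule", DummyClassifier(strategy="most_frequent"))]:
    clf.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```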

Keywords: decision tree, data mining, customers, life insurance pay package

Procedia PDF Downloads 415
4249 Automatic Detection of Proliferative Cells in Immunohistochemically Images of Meningioma Using Fuzzy C-Means Clustering and HSV Color Space

Authors: Vahid Anari, Mina Bakhshi

Abstract:

Visual search and identification of immunohistochemically stained meningioma tissue is performed manually in pathology laboratories to detect and diagnose this type of cancer. This task is very tedious and time-consuming. Moreover, because of the cells' complex nature, it remains a challenging task to segment cells from the background and analyze them automatically. In this paper, we develop and test a computerized scheme that can automatically identify cells in microscopic images of meningioma and classify them into positive (proliferative) and negative (normal) cells. A dataset of 150 images is used to test the scheme. The scheme uses the Fuzzy C-Means algorithm as a color clustering method based on the perceptually uniform hue, saturation, value (HSV) color space. Since the cells are distinguishable by the human eye, the accuracy and stability of the algorithm are quantitatively compared through application to a wide variety of real images.
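
As a rough illustration of the color-clustering step, the sketch below converts an RGB image to HSV and runs a plain NumPy implementation of Fuzzy C-Means on the pixel colors. The file name, number of clusters and fuzzifier are assumptions, and the positive/negative cell labeling and post-processing used in the paper are not reproduced.

```python
# Hedged sketch: Fuzzy C-Means clustering of pixels in HSV color space.
import numpy as np
from skimage import io, color

def fuzzy_c_means(X, c=3, m=2.0, iters=100, tol=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0)                       # fuzzy memberships, columns sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2.0 / (m - 1.0)))
        U_new /= U_new.sum(axis=0)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

img = io.imread("meningioma_ihc.png")        # hypothetical file name
hsv = color.rgb2hsv(img[..., :3])
pixels = hsv.reshape(-1, 3)
centers, U = fuzzy_c_means(pixels, c=3)
labels = U.argmax(axis=0).reshape(hsv.shape[:2])   # hard label per pixel
```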

Keywords: positive cell, color segmentation, HSV color space, immunohistochemistry, meningioma, thresholding, fuzzy c-means

Procedia PDF Downloads 193
4248 Impact on the Results of Sub-Group Analysis on Performance of Recommender Systems

Authors: Ho Yeon Park, Kyoung-Jae Kim

Abstract:

The purpose of this study is to investigate whether friendship in social media can be an important factor in recommender systems, through a social scientific analysis of friendship in popular social media such as Facebook and Twitter. For this purpose, this study analyzes data on friendship in real social media using component analysis and clique analysis, two sub-group analyses in social network analysis. We propose an algorithm to reflect the results of sub-group analysis in the recommender system. The key to this algorithm is to ensure that recommendations from users who are in a friendship with the target user are more likely to be reflected in the recommendations made to that user. As a result of this study, outcomes of various sub-group analyses were derived, and it was confirmed that the results differed from those of the existing recommender system. Therefore, the results of the sub-group analysis are considered to affect the recommendation performance of the system. Future research will attempt to generalize these results through further analysis of various social data.

Keywords: sub-group analysis, social media, social network analysis, recommender systems

Procedia PDF Downloads 347
4247 Digital Watermarking Using Fractional Transform and (k,n) Halftone Visual Cryptography (HVC)

Authors: R. Rama Kishore, Sunesh Malik

Abstract:

The growing use of the internet for different purposes in recent times poses a serious threat to the copyright protection of digital images, and digital watermarking is an effective way to address this problem. This paper presents a detailed review of different watermarking techniques and the latest trends in the field, categorized as spatial and transform domain, blind and non-blind methods, visible and invisible techniques, etc. It also discusses the different optimization techniques used in the field of watermarking in order to improve the robustness and imperceptibility of the methods. Different measures for evaluating the performance of a watermarking algorithm are discussed. Finally, this paper proposes a watermarking algorithm using (k,n) shares of halftone visual cryptography (HVC) instead of (2,2) share cryptography. (k,n) share visual cryptography improves the security of the watermark. As halftoning is a reprographic technique, it helps improve the visual quality of the watermark image. The proposed method uses a fractional transform to improve the robustness of the copyright protection.

Keywords: digital watermarking, fractional transform, halftone, visual cryptography

Procedia PDF Downloads 341
4246 Mastering Test Automation: Bridging Gaps for Seamless QA

Authors: Rohit Khankhoje

Abstract:

The rapid evolution of software development practices has given rise to an increasing demand for efficient and effective test automation. The paper titled "Mastering Test Automation: Bridging Gaps for Seamless QA" delves into the crucial aspects of test automation, addressing the obstacles faced by organizations in achieving flawless quality assurance. The paper highlights the importance of bridging knowledge gaps within organizations, emphasizing the necessity for management to acquire a deeper comprehension of test automation scenarios, coverage, report trends, and the importance of communication. To tackle these challenges, this paper introduces innovative solutions, including the development of an automation framework that seamlessly integrates with test cases and with reporting tools like TestRail and Jira. This integration facilitates the automatic recording of bugs in Jira, enhancing bug reporting and communication between manual QA and automation teams, and ensures that TestRail reflects all newly added automated test cases as soon as they become part of the automation suite. The paper demonstrates how this framework empowers management by providing clear insights into ongoing automation activities, bug origins, trend analysis, and test case specifics. "Mastering Test Automation" serves as a comprehensive guide for organizations aiming to enhance their quality assurance processes through effective test automation. It not only identifies the common pitfalls and challenges but also offers practical solutions to bridge the gaps, resulting in a more streamlined and efficient QA process.
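
The Jira side of the integration described above, automatically recording a bug when an automated test fails, can be illustrated with the public Jira REST API. The sketch below posts a new bug issue with requests; the base URL, project key, credentials and payload values are placeholders, and the framework's actual reporting hooks and TestRail calls are not shown.

```python
# Hedged sketch: creating a Jira bug from an automated test failure.
# URL, project key and credentials are placeholders, not real endpoints.
import requests

JIRA_URL = "https://your-company.atlassian.net"      # placeholder
AUTH = ("automation-bot@example.com", "API_TOKEN")   # placeholder credentials

def report_bug(test_name, error_message, project_key="QA"):
    payload = {
        "fields": {
            "project": {"key": project_key},
            "summary": f"Automated test failure: {test_name}",
            "description": error_message,
            "issuetype": {"name": "Bug"},
        }
    }
    resp = requests.post(f"{JIRA_URL}/rest/api/2/issue",
                         json=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()["key"]          # e.g. "QA-123"

# Typically called from a test-runner hook when an automated check fails.
```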

Keywords: automation framework, API integration, test automation, test management tools

Procedia PDF Downloads 56
4245 Transformer Design Optimization Using Artificial Intelligence Techniques

Authors: Zakir Husain

Abstract:

The main objective of the power transformer design optimization problem is to minimize the total overall cost and/or mass of the winding and core material while satisfying all constraints imposed by the standards and the transformer user's requirements. The constraints include appropriate limits on winding fill factor, temperature rise, efficiency, no-load current and voltage regulation. The design optimization task is thus a constrained minimum-cost and/or minimum-mass problem solved by optimally setting the parameters, geometry and required magnetic properties of the transformer. In this paper, the above design problem is formulated and solved using a genetic algorithm (GA) and simulated annealing (SA) on the MATLAB platform. The importance of the presented approach stems from two main features. First, the proposed technique provides a reliable and efficient solution to the design optimization problem with several variables. Second, the obtained solution is guaranteed to be the global optimum. The paper also includes a demonstration of the application of the genetic programming (GP) technique to transformer design.
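
As a generic illustration of the GA formulation, the sketch below minimizes a penalized cost function over a few design variables. The variable bounds, cost coefficients and constraint limit are invented placeholders and do not correspond to the paper's actual transformer model or MATLAB implementation.

```python
# Hedged sketch: a small real-coded genetic algorithm with a constraint penalty.
# The cost model and limits are illustrative placeholders only.
import numpy as np

rng = np.random.default_rng(1)
BOUNDS = np.array([[0.01, 0.05],    # core cross-section area, m^2 (placeholder)
                   [100, 1000],     # number of turns (placeholder)
                   [1.0, 2.0]])     # flux density, T (placeholder)

def cost(x):
    area, turns, b_max = x
    material_cost = 5e4 * area + 12.0 * turns            # invented coefficients
    temperature_rise = 40.0 + 30.0 * b_max - 200.0 * area
    penalty = 1e4 * max(0.0, temperature_rise - 65.0)    # constraint: rise <= 65 K
    return material_cost + penalty

def ga(pop_size=60, generations=200, mutation=0.1):
    pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, 3))
    for _ in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]   # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.random(3)
            child = w * a + (1 - w) * b                        # blend crossover
            child += mutation * rng.normal(size=3) * (BOUNDS[:, 1] - BOUNDS[:, 0]) * 0.05
            children.append(np.clip(child, BOUNDS[:, 0], BOUNDS[:, 1]))
        pop = np.vstack([parents, children])
    best = min(pop, key=cost)
    return best, cost(best)

print(ga())
```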

Keywords: optimization, power transformer, genetic algorithm (GA), simulated annealing technique (SA)

Procedia PDF Downloads 564
4244 An Effective Noise Resistant Frequency Modulation Continuous-Wave Radar Vital Sign Signal Detection Method

Authors: Lu Yang, Meiyang Song, Xiang Yu, Wenhao Zhou, Chuntao Feng

Abstract:

To address the problem that human vital sign signals extracted by frequency-modulated continuous-wave (FMCW) radar are susceptible to noise interference and suffer from low reconstruction accuracy, a new detection scheme for vital sign signals is proposed. First, an improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN) algorithm is applied to decompose the radar-extracted thoracic signals into several intrinsic mode functions (IMFs) with different spatial scales, and then the IMF components are optimized by a BP neural network improved by an immune genetic algorithm (IGA). The simulation results show that this scheme can effectively separate the noise, accurately extract the respiratory and heartbeat signals, and improve the reconstruction accuracy and signal-to-noise ratio of the vital sign signals.
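
As an illustration of the decomposition stage only, the sketch below applies CEEMDAN from the open-source PyEMD package to a synthetic chest-displacement signal (respiration plus heartbeat plus noise). PyEMD provides standard CEEMDAN rather than the improved ICEEMDAN variant used in the paper, and the BP network and IGA optimization stages are not shown; the sampling rate and signal parameters are assumptions.

```python
# Hedged sketch: CEEMDAN decomposition of a synthetic vital-sign signal.
# PyEMD's CEEMDAN is used here as a stand-in for the paper's ICEEMDAN step.
import numpy as np
from PyEMD import CEEMDAN

fs = 20.0                                    # sampling rate, Hz (assumed)
t = np.arange(0, 30, 1.0 / fs)
respiration = 4.0 * np.sin(2 * np.pi * 0.3 * t)     # ~0.3 Hz breathing component
heartbeat = 0.3 * np.sin(2 * np.pi * 1.2 * t)       # ~1.2 Hz heartbeat component
chest = respiration + heartbeat + 0.2 * np.random.randn(t.size)

imfs = CEEMDAN()(chest)                      # intrinsic mode functions, one per row
print("number of IMFs:", imfs.shape[0])
# The IMFs whose dominant frequencies fall in the breathing and heartbeat
# bands would then be selected (the paper optimizes this step with a BP network).
```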

Keywords: frequency modulated continuous wave radar, ICEEMDAN, BP neural network, vital signs signal

Procedia PDF Downloads 146
4243 Bridge Members Segmentation Algorithm of Terrestrial Laser Scanner Point Clouds Using Fuzzy Clustering Method

Authors: Donghwan Lee, Gichun Cha, Jooyoung Park, Junkyeong Kim, Seunghee Park

Abstract:

3D shape models of existing structures are required for many purposes, such as safety and operation management. Traditional 3D modeling methods are based on manual or semi-automatic reconstruction from close-range images, which is expensive and time-consuming. Terrestrial laser scanning (TLS) is a common survey technique for measuring a 3D shape model quickly and accurately, and is used for construction sites and cultural heritage management. However, there are many limits to processing a TLS point cloud, because the raw point cloud is a massive volume of data, so the capability of carrying out useful analyses on unstructured 3D points is also limited. Thus, segmentation becomes an essential step whenever grouping of points with common attributes is required. In this paper, a member segmentation algorithm is presented to separate a raw point cloud that contains only 3D coordinates. This paper presents a clustering approach based on a fuzzy method for this objective. The Fuzzy C-Means (FCM) algorithm is reviewed and used in combination with a similarity-driven cluster merging method. It is applied to the point cloud acquired with a Leica ScanStation C10/C5 at the test bed. The test bed was a pedestrian bridge, about 32 m long and 2 m wide, connecting the 1st and 2nd engineering buildings at Sungkyunkwan University in Korea. The 3D point cloud of the test bed was constructed from the TLS measurements and divided by the segmentation algorithm into individual members. Experimental analyses of the results from the proposed unsupervised segmentation process are promising. Because the point cloud is segmented, the configuration of each member can be managed.

Keywords: fuzzy c-means (FCM), point cloud, segmentation, terrestrial laser scanner (TLS)

Procedia PDF Downloads 220
4242 Optimization of Solar Rankine Cycle by Exergy Analysis and Genetic Algorithm

Authors: R. Akbari, M. A. Ehyaei, R. Shahi Shavvon

Abstract:

Nowadays, solar energy is used for energy purposes such as the provision of thermal energy for domestic, industrial and power applications, as well as the conversion of sunlight into electricity by photovoltaic cells. In this study, the thermodynamic simulation of the solar Rankine cycle with a phase change material (paraffin) was first studied, and then energy and exergy analyses were performed. For optimization, single- and multi-objective genetic algorithms were used to maximize thermal and exergy efficiency. The parameters discussed in this paper include the effects of turbine inlet pressure, turbine inlet mass flow, the surface area of the converters, and the collector angle on thermal and exergy efficiency. In the organic Rankine cycle, where solar energy is used as the input energy, fluid selection is a necessary factor for achieving reliable and efficient operation. Therefore, silicone oil is selected as the working fluid for the high-temperature cycle and water for the low-temperature cycle. The results showed that increasing the mass flow to turbines 1 and 2 increases thermal efficiency, while it decreases and increases the exergy efficiency in turbines 1 and 2, respectively. Increasing the inlet pressure to turbine 1 decreases the thermal and exergy efficiency, while increasing the inlet pressure to turbine 2 increases both. Also, increasing the collector angle increased thermal and exergy efficiency. The thermal efficiency of the system was 22.3%, which improves to 33.2% and 27.2% in single-objective and multi-objective optimization, respectively. Also, the exergy efficiency of the system was 1.33%, which improves to 1.719% and 1.529% in single-objective and multi-objective optimization, respectively. These results show that the thermal and exergy efficiency in single-objective optimization is greater than in multi-objective optimization.

Keywords: exergy analysis, genetic algorithm, Rankine cycle, single and multi-objective function

Procedia PDF Downloads 136
4241 Machine Vision System for Measuring the Quality of Bulk Sun-dried Organic Raisins

Authors: Navab Karimi, Tohid Alizadeh

Abstract:

An intelligent vision-based system was designed to measure the quality and purity of raisins. A machine vision setup was utilized to capture images of bulk raisins with 5-50% mixtures of pure and impure berries. The textural features of the bulk raisins were extracted using grey-level histograms, the co-occurrence matrix, and local binary patterns (a total of 108 features). A genetic algorithm and neural network regression were used for selecting and ranking the best features (21 features). The GLCM feature set was found to have the highest accuracy (92.4%) among the sets. Subsequently, multiple feature combinations from the previous stage were fed into a second, linear regression to increase accuracy, and a combination of 16 features was found to be the optimum. Finally, a Support Vector Machine (SVM) classifier was used to differentiate the mixtures, producing the best efficiency and accuracy of 96.2% and 97.35%, respectively.
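
To illustrate the GLCM texture stage and the final SVM classifier, the sketch below extracts co-occurrence features with scikit-image and trains an SVM with scikit-learn. The image file names, labels and parameter choices are placeholders, and the GA/ANN feature-ranking and linear-regression stages are omitted. (In older scikit-image releases the functions are named greycomatrix/greycoprops.)

```python
# Hedged sketch: GLCM texture features of raisin images fed to an SVM.
import numpy as np
from skimage import io, color, img_as_ubyte
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def glcm_features(path):
    gray = img_as_ubyte(color.rgb2gray(io.imread(path)))
    glcm = graycomatrix(gray, distances=[1, 3], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Hypothetical image list and purity labels (1 = pure, 0 = impure mixture).
paths = ["raisin_pure_01.png", "raisin_mixed_01.png"]
labels = [1, 0]
X = np.array([glcm_features(p) for p in paths])
y = np.array(labels)

svm = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, y)
print(svm.predict(X))
```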

Keywords: sun-dried organic raisin, genetic algorithm, feature extraction, ANN regression, linear regression, support vector machine, South Azerbaijan

Procedia PDF Downloads 59
4240 Artificial Bee Colony Based Modified Energy Efficient Predictive Routing in MANET

Authors: Akhil Dubey, Rajnesh Singh

Abstract:

In recent years, the field of ad hoc networks has undergone many rapid modifications, which have brought revolutionary changes to routing. Predictive energy-efficient routing is inspired by the swarm intelligence of bee behavior. Predictive routing improves routing efficiency from the energy point of view: its main aims are minimum energy consumption during communication and maximized remaining battery power of intermediate nodes. This routing is based on the food-searching behavior of bees and uses two types of bees, scout bees for the exploration phase and forager bees for the exploitation phase. The routing algorithm computes the energy consumption, fitness ratio and goodness of the path. In this paper, we review the literature related to predictive routing, present a modified routing scheme, and report simulation results comparing this algorithm with artificial bee colony based routing schemes in MANETs in terms of path fitness and fitness probability.

Keywords: mobile ad hoc network, artificial bee colony, PEEBR, modified predictive routing

Procedia PDF Downloads 404
4239 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods

Authors: A. Senthil Kumar, V. Murali Bhaskaran

Abstract:

In the information technology domain, people use various tools and software for official and personal purposes. Nowadays, people are concerned about choosing data access and extraction tools when buying and selling their products, and they also worry about various quality factors such as price, durability, color, size, and availability of the product. The main purpose of this research study is to find solutions to these unsolved existing problems. The proposed algorithm is a Multidirectional Rank Prediction (MDRP) decision-making algorithm designed to support effective strategic decisions at all levels of data extraction; it uses a real-time textile dataset, and the results are analyzed. Finally, the results are obtained and compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of the existing rank prediction methods.
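
Since the proposed MDRP algorithm is compared against Pearson's Correlation Coefficient (PCC) based prediction, the sketch below shows the standard user-based collaborative-filtering rating prediction with PCC similarity. The small rating matrix is an invented example, and the MDRP algorithm itself is not reproduced here.

```python
# Hedged sketch: user-based rating prediction with Pearson correlation (PCC).
import numpy as np

R = np.array([[5, 3, 0, 4],      # toy user-item rating matrix, 0 = unrated
              [4, 0, 4, 5],
              [1, 1, 5, 2],
              [0, 3, 4, 4]], dtype=float)

def pcc(u, v):
    mask = (u > 0) & (v > 0)                 # co-rated items only
    if mask.sum() < 2:
        return 0.0
    return np.corrcoef(u[mask], v[mask])[0, 1]

def predict(R, user, item, k=2):
    sims = np.array([pcc(R[user], R[v]) if v != user else 0.0
                     for v in range(R.shape[0])])
    rated = [v for v in range(R.shape[0]) if R[v, item] > 0 and v != user]
    neigh = sorted(rated, key=lambda v: -abs(sims[v]))[:k]
    mean_u = R[user][R[user] > 0].mean()
    if not neigh:
        return mean_u
    num = sum(sims[v] * (R[v, item] - R[v][R[v] > 0].mean()) for v in neigh)
    den = sum(abs(sims[v]) for v in neigh) + 1e-12
    return mean_u + num / den

print(predict(R, user=0, item=2))            # predicted rating for an unrated item
```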

Keywords: Knowledge Discovery in Databases (KDD), Multidirectional Rank Prediction (MDRP), Pearson's Correlation Coefficient (PCC), Vector Space Similarity (VSS)

Procedia PDF Downloads 273
4238 Assessment of On-Site Solar and Wind Energy at a Manufacturing Facility in Ireland

Authors: A. Sgobba, C. Meskell

Abstract:

The feasibility of on-site electricity production from solar and wind and the resulting load management for a specific manufacturing plant in Ireland are assessed. The industry sector accounts directly and indirectly for a high percentage of electricity consumption and global greenhouse gas emissions; therefore, it will play a key role in emission reduction and control. Manufacturing plants, in particular, are often located in non-residential areas since they require open spaces for production machinery, parking facilities for the employees, appropriate routes for supply and delivery, special connections to the national grid, and because of other environmental impacts. Since they have larger spaces compared to commercial sites in urban areas, they represent an appropriate case study for evaluating the technical and economic viability of energy system integration with low power density technologies, such as solar and wind, for on-site electricity generation. The available open space surrounding the analysed manufacturing plant can be efficiently used to produce a discrete quantity of energy, instantaneously and locally consumed, so transmission and distribution losses can be reduced. Storage is not required due to the high and almost constant electricity consumption profile. The energy load of the plant is identified through the analysis of gas and electricity consumption, both internally monitored and reported on the bills. These data are not often recorded and available to third parties, since manufacturing companies usually keep track only of the overall energy expenditures. The solar potential is modelled for a period of 21 years based on global horizontal irradiation data; the hourly direct and diffuse radiation and the energy produced by the system at the optimum pitch angle are calculated. The model is validated using the PVWatts and SAM tools. Wind speed data are available for the same period at a one-hour time step at a height of 10 m. Since the hub of a typical wind turbine reaches a higher altitude, complementary data for a different location at 50 m have been compared, and a model for estimating the wind speed at the required height at the right location is defined. The Weibull statistical distribution is used to evaluate the wind energy potential of the site. The results show that solar and wind energy are, as expected, generally decoupled. Based on the real case study, the percentage of load covered every hour by on-site generation (Level of Autonomy, LA) and the resulting electricity bought from the grid (Expected Energy Not Supplied, EENS) are calculated. The economic viability of the project is assessed through the Net Present Value (NPV), and the influence of the main technical and economic parameters on the NPV is presented. Since the results show that the analysed renewable sources cannot provide enough electricity, integration with a cogeneration technology is studied. Finally, the benefit of integrating wind, solar and a cogeneration technology into the energy system is evaluated and discussed.
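
The wind-resource step described above (fitting a Weibull distribution to measured wind speeds and estimating the energy potential) can be sketched with SciPy as below. The file name, roughness length and hub height are placeholder assumptions, and the solar model, LA/EENS and NPV calculations are not reproduced.

```python
# Hedged sketch: Weibull fit of wind-speed data and mean wind power density.
import numpy as np
from scipy import stats
from scipy.special import gamma

speeds_10m = np.loadtxt("wind_speed_10m.csv")          # hourly speeds, m/s (placeholder file)

# Extrapolate 10 m speeds to hub height with a log wind profile (assumed roughness).
z0, hub = 0.03, 50.0                                    # roughness length and hub height (assumptions)
speeds_hub = speeds_10m * np.log(hub / z0) / np.log(10.0 / z0)

# Fit a two-parameter Weibull (location fixed at zero).
k, _, c = stats.weibull_min.fit(speeds_hub, floc=0)    # shape k, scale c

rho = 1.225                                             # air density, kg/m^3
mean_v3 = c ** 3 * gamma(1.0 + 3.0 / k)                 # E[v^3] for a Weibull distribution
power_density = 0.5 * rho * mean_v3                     # W per m^2 of rotor area
print(f"k = {k:.2f}, c = {c:.2f} m/s, mean power density = {power_density:.0f} W/m^2")
```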

Keywords: demand, energy system integration, load, manufacturing, national grid, renewable energy sources

Procedia PDF Downloads 119
4237 Bitplanes Gray-Level Image Encryption Approach Using Arnold Transform

Authors: Ali Abdrhman M. Ukasha

Abstract:

Data security is needed in data transmission, storage, and communication. The single-step parallel contour extraction (SSPCE) method is used to create an edge map as a key image from the gray-level/binary image. The XOR operation is performed between the key image and each bit plane of the original image in order to change the image pixel values. The Arnold transform is used to change the locations of image pixels as an image scrambling process. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D gray-level image, which can then be completely reconstructed without any distortion. It is also shown that the analyzed algorithm has very high security against attacks such as salt-and-pepper noise and JPEG compression, proving that the gray-level image can be protected with a higher security level. The presented method has an easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
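
The two core operations, XOR-ing each bit plane with an edge-map key and scrambling pixel positions with the Arnold transform, can be sketched in a few lines of NumPy as below. The edge map here is produced with a generic Sobel detector rather than the SSPCE method, and the input image name and threshold are assumptions, so this only stands in for the paper's key-generation step.

```python
# Hedged sketch: per-bit-plane XOR with an edge-map key, then Arnold scrambling.
# A Sobel edge map stands in for the SSPCE-generated key image.
import numpy as np
from skimage import io, color, filters, img_as_ubyte

def arnold(img, iterations=1):
    n = img.shape[0]                          # requires a square N x N image
    out = img.copy()
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        out = out[(x + y) % n, (x + 2 * y) % n]
    return out

gray = img_as_ubyte(color.rgb2gray(io.imread("lena.png")))   # placeholder image
key = (filters.sobel(gray) > 0.05).astype(np.uint8)          # binary edge-map key

# XOR every bit plane with the binary key, then recombine the planes and
# scramble pixel positions with the Arnold transform.
planes = [((gray >> b) & 1) ^ key for b in range(8)]
encrypted = sum(np.left_shift(p.astype(np.uint8), b) for p in planes).astype(np.uint8)
encrypted = arnold(encrypted, iterations=5)
# Decryption applies the inverse Arnold mapping and XORs the same key again.
```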

Keywords: SSPCE method, image compression, salt and pepper attacks, bitplanes decomposition, Arnold transform, lossless image encryption

Procedia PDF Downloads 420
4236 Margin-Based Feed-Forward Neural Network Classifiers

Authors: Xiaohan Bookman, Xiaoyan Zhu

Abstract:

The margin-based principle was proposed a long time ago, and it has been proved that this principle can reduce the structural risk and improve performance in both theoretical and practical aspects. Meanwhile, the feed-forward neural network is a traditional classifier that is currently very popular with deeper architectures. However, the training algorithm of feed-forward neural networks is derived from the Widrow-Hoff principle, which aims to minimize the squared error. In this paper, we propose a new training algorithm for feed-forward neural networks based on the margin-based principle, which can effectively improve the accuracy and generalization ability of neural network classifiers with fewer labeled samples and a flexible network. We have conducted experiments on four UCI open data sets and achieved good results as expected. In conclusion, our model can handle more sparsely labeled and higher-dimensional data sets with high accuracy, while the modification from the old ANN method to our method is easy and requires almost no extra work.
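
As a minimal illustration of replacing the squared-error (Widrow-Hoff) objective with a margin-based one, the sketch below trains a small feed-forward network with PyTorch's multi-class hinge loss. The architecture, synthetic data and hyperparameters are placeholders, not the authors' model or training procedure.

```python
# Hedged sketch: feed-forward network trained with a multi-class margin (hinge)
# loss instead of squared error. Architecture and data are placeholders.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(512, 20)                      # toy data: 512 samples, 20 features
y = torch.randint(0, 3, (512,))               # 3 classes

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3))
criterion = nn.MultiMarginLoss(margin=1.0)    # sum of max(0, margin - (s_y - s_j))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(100):
    optimizer.zero_grad()
    loss = criterion(model(X), y)
    loss.backward()
    optimizer.step()

accuracy = (model(X).argmax(dim=1) == y).float().mean()
print(f"final hinge loss {loss.item():.3f}, training accuracy {accuracy:.2f}")
```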

Keywords: Max-Margin Principle, Feed-Forward Neural Network, classifier, structural risk

Procedia PDF Downloads 326
4235 Sensor and Actuator Fault Detection in Connected Vehicles under a Packet Dropping Network

Authors: Z. Abdollahi Biron, P. Pisu

Abstract:

Connected vehicles are one of the promising technologies for future Intelligent Transportation Systems (ITS). A connected vehicle system is essentially a set of vehicles communicating through a network to exchange their information with each other and the infrastructure. Although this interconnection of the vehicles can be potentially beneficial in creating an efficient, sustainable, and green transportation system, a set of safety and reliability challenges come with this technology. The first challenge arises from the information loss due to an unreliable communication network, which affects the control/management system of the individual vehicles and the overall system. Such a scenario may lead to degraded or even unsafe operation, which could be potentially catastrophic. Secondly, faulty sensors and actuators can affect the individual vehicle's safe operation and in turn create a potentially unsafe node in the vehicular network. Further, sending faulty sensor information to other vehicles and failures in actuators may significantly affect the safe operation of the overall vehicular network. Therefore, it is of utmost importance to take these issues into consideration while designing the control/management algorithms of the individual vehicles as part of a connected vehicle system. In this paper, we consider a connected vehicle system under Co-operative Adaptive Cruise Control (CACC) and propose a fault diagnosis scheme that deals with these aforementioned challenges. Specifically, the conventional CACC algorithm is modified by adding a Kalman filter-based estimation algorithm to suppress the effect of lost information under an unreliable network. Further, a sliding mode observer-based algorithm is used to improve the sensor reliability under faults. The effectiveness of the overall diagnostic scheme is verified via simulation studies.
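
A minimal sketch of the estimation idea, skipping the Kalman measurement update whenever a packet from the preceding vehicle is lost so the filter coasts on its prediction, is given below. The vehicle model, noise covariances and drop pattern are illustrative assumptions, and the sliding-mode observer for sensor faults is not included.

```python
# Hedged sketch: Kalman filtering of a preceding vehicle's position/velocity
# with dropped packets handled by skipping the measurement update.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity model (assumed)
H = np.array([[1.0, 0.0]])                   # only position is transmitted
Q = np.diag([0.05, 0.1])                     # process noise (assumed)
R = np.array([[0.5]])                        # measurement noise (assumed)

x = np.array([[0.0], [20.0]])                # state estimate [position; speed]
P = np.eye(2)

def kalman_step(x, P, z, received):
    # Prediction
    x = F @ x
    P = F @ P @ F.T + Q
    if received:                             # measurement update only if the packet arrived
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
for k in range(100):
    true_pos = 20.0 * dt * (k + 1)
    received = rng.random() > 0.3            # ~30% packet drop rate (assumed)
    z = np.array([[true_pos + rng.normal(0, 0.7)]])
    x, P = kalman_step(x, P, z, received)
print("estimated position/speed:", x.ravel())
```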

Keywords: fault diagnostics, communication network, connected vehicles, packet drop out, platoon

Procedia PDF Downloads 223
4234 Predicting Daily Patient Hospital Visits Using Machine Learning

Authors: Shreya Goyal

Abstract:

The study aims to build user-friendly software to understand patient arrival patterns and compute the number of potential patients who will visit a particular health facility for a given period by using a machine learning algorithm. The underlying machine learning algorithm used in this study is the Support Vector Machine (SVM). Accurate prediction of patient arrival allows hospitals to operate more effectively, providing timely and efficient care while optimizing resources and improving patient experience. It allows for better allocation of staff, equipment, and other resources. If there is a projected surge in patients, additional staff or resources can be allocated to handle the influx, preventing bottlenecks or delays in care. Understanding patient arrival patterns can also help streamline processes to minimize waiting times and ensure timely access to care for patients in need. Another big advantage of using this software is adherence to strict data protection regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, as the hospital will not have to share the data with any third party or upload it to the cloud because the software can read data locally from the machine. The data needs to be arranged in a particular format, and the software will be able to read the data and provide meaningful output. Using software that operates locally can facilitate compliance with these regulations by minimizing data exposure. Keeping patient data within the hospital's local systems reduces the risk of unauthorized access or breaches associated with transmitting data over networks or storing it in external servers. This can help maintain the confidentiality and integrity of sensitive patient information. Historical patient data is used in this study. The input variables used to train the model include patient age, time of day, day of the week, seasonal variations, and local events. The algorithm uses a supervised learning method to optimize the objective function and find the global minimum. The algorithm stores the values of the local minima after each iteration and at the end compares all the local minima to find the global minimum. The strength of this study is the transfer function used to calculate the number of patients. The model has an output accuracy of >95%. The method proposed in this study could be used for better management planning of personnel and medical resources.

Keywords: machine learning, SVM, HIPAA, data

Procedia PDF Downloads 53
4233 Complex Network Approach to International Trade of Fossil Fuel

Authors: Semanur Soyyigit Kaya, Ercan Eren

Abstract:

Energy plays a prominent role in the development of nations. Countries that have energy resources also have strategic power in the international trade of energy, since energy is essential for all stages of production in the economy. Thus, it is important for countries to analyze the weaknesses and strengths of the system. On the other hand, it is commonly believed that international trade has complex network properties. Complex network analysis is a tool for studying complex systems with heterogeneous agents and interactions between them. A complex network consists of nodes and the interactions between these nodes, and in complex systems the overall properties that emerge from these interactions are distinct from the (more or less) sum of the small parts. Thus, standard approaches to international trade are too superficial to analyze these systems, while network analysis provides a new way to treat international trade as a network. In this network, countries constitute nodes and trade relations (export or import) constitute edges. It then becomes possible to analyze the international trade network in terms of high-level indicators specific to complex systems, such as connectivity, clustering, assortativity/disassortativity, centrality, etc. In this analysis, international trade in crude oil and coal, which are types of fossil fuel, has been analyzed from 2005 to 2014 via network analysis. First, it has been analyzed in terms of topological parameters such as density, transitivity, and clustering. Afterwards, the fit to a Pareto distribution has been analyzed. Finally, the weighted HITS algorithm has been applied to the data as a centrality measure to determine the real prominence of countries in these trade networks. The weighted HITS algorithm is a strong tool for analyzing the network by ranking countries with regard to the prominence of their trade partners. We have calculated both an export centrality and an import centrality by applying the w-HITS algorithm to the data.
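
The weighted HITS (w-HITS) centrality used above can be written as a short power iteration on the weighted trade matrix, where hub scores correspond to export prominence and authority scores to import prominence. The tiny trade matrix below is an invented example, not the actual crude-oil or coal data.

```python
# Hedged sketch: weighted HITS on a directed trade matrix W, where W[i, j] is
# the trade flow from exporter i to importer j (toy numbers only).
import numpy as np

countries = ["A", "B", "C", "D"]
W = np.array([[0, 50, 10, 0],
              [5,  0, 30, 20],
              [0, 10,  0, 40],
              [0,  0,  5,  0]], dtype=float)

hubs = np.ones(len(countries))               # export-side (hub) scores
auths = np.ones(len(countries))              # import-side (authority) scores
for _ in range(100):
    auths = W.T @ hubs                       # authority: weighted sum of pointing hubs
    hubs = W @ auths                         # hub: weighted sum of pointed-to authorities
    auths /= np.linalg.norm(auths)
    hubs /= np.linalg.norm(hubs)

print("export centrality (hubs):", dict(zip(countries, hubs.round(3))))
print("import centrality (auth):", dict(zip(countries, auths.round(3))))
```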

Keywords: complex network approach, fossil fuel, international trade, network theory

Procedia PDF Downloads 318
4232 A Wireless Feedback Control System as a Base of Bio-Inspired Structure System to Mitigate Vibration in Structures

Authors: Gwanghee Heo, Geonhyeok Bang, Chunggil Kim, Chinok Lee

Abstract:

This paper attempts to develop a wireless feedback control system as a first step toward a bio-inspired structural system in which an inanimate structure behaves autonomously like a life form. It is a standalone wireless control system designed to measure externally caused structural responses, analyze the structural state from the acquired data, and take its own action on the basis of that analysis using embedded logic. For an experimental examination of its effectiveness, we applied it to a model of a two-span bridge and performed a wireless control test. Experimental tests were conducted for comparison on both the wireless and the wired system under uncontrolled, passive-off, passive-on, and Lyapunov control conditions. By proving the congruence of the test results of the wireless feedback control system with those of the wired control system, its control performance was shown to be effective. Besides, it was found to be economical in energy consumption and autonomous by means of the command algorithm embedded into it, which demonstrates its basic capacity as a bio-inspired system.

Keywords: structural vibration control, wireless system, MR damper, feedback control, embedded system

Procedia PDF Downloads 200
4231 Detectability of Malfunction in Turboprop Engine

Authors: Tomas Vampola, Michael Valášek

Abstract:

On the basis of simulation-generated failure states of structural elements of a turboprop engine suitable for the busy-jet class of aircraft, an algorithm for early prediction of damage or reduction in functionality of structural elements of the engine is designed and verified with real data obtained at dynamometric testing facilities of aircraft engines. Based on an expanding database of experimentally determined data from temperature and pressure sensors during the operation of turboprop engines, this strategy is constantly modified with the aim of using the minimum number of sensors to detect an inadmissible or deteriorated operating mode of specific structural elements of an aircraft engine. The assembled algorithm for the early prediction of reduced functionality of the aircraft engine significantly contributes to the safety of air traffic and to a large extent, contributes to the economy of operation with positive effects on the reduction of the energy demand of operation and the elimination of adverse effects on the environment.

Keywords: detectability of malfunction, dynamometric testing, prediction of damage, turboprop engine

Procedia PDF Downloads 81
4230 A Location-Allocation-Routing Model for a Home Health Care Supply Chain Problem

Authors: Amir Mohammad Fathollahi Fard, Mostafa Hajiaghaei-Keshteli, Mohammad Mahdi Paydar

Abstract:

With increasing life expectancy in developed countries, the role of home care services is highlighted by both academia and industrial contributors in Home Health Care Supply Chain (HHCSC) companies. The main decisions in such supply chain systems are the location of pharmacies, the allocation of patients to these pharmacies, and the routing and scheduling decisions for nurses to visit their patients. In this study, for the first time, an integrated model is proposed that covers all the preliminary and necessary decisions in these companies, namely a location-allocation-routing model. This model is NP-hard; therefore, an Imperialist Competitive Algorithm (ICA) is utilized to solve it, especially for large instances. Results confirm the efficiency of the developed model for HHCSC companies as well as the performance of the employed ICA.

Keywords: home health care supply chain, location-allocation-routing problem, imperialist competitive algorithm, optimization

Procedia PDF Downloads 388
4229 Linear Semi Active Controller of Magneto-Rheological Damper for Seismic Vibration Attenuation

Authors: Zizouni Khaled, Fali Leyla, Sadek Younes, Bousserhane Ismail Khalil

Abstract:

For structural vibration caused principally by earthquake excitation, the most widely used vibration attenuation system in recent years is semi-active control with a magnetorheological (MR) damper device. This control has been the subject of much research in recent years. The big challenge for researchers in this case is to propose an adequate controller with a robust algorithm for current or voltage adjustment. In this paper, a linear controller is proposed to drive the MR damper in order to reduce the vibrations of a three-story structure exposed to the 1940 El Centro and 2003 Boumerdès earthquakes. In this example, the MR damper is installed in the first floor of the structure. The numerical simulation results of the proposed linear control with a feedback law based on the clipped-optimal algorithm showed the feasibility of semi-active control for protecting civil structures. The comparison of the controlled and uncontrolled structure responses clearly illustrates the performance and effectiveness of the simple proposed approach.
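
The clipped-optimal feedback law mentioned above is commonly written as a bang-bang rule on the command voltage: apply the maximum voltage only when the measured damper force must grow toward the force requested by the linear controller. A minimal sketch of that rule, with a placeholder controller gain, voltage limit and state vector, is shown below.

```python
# Hedged sketch: clipped-optimal command voltage for an MR damper.
# K_lin, V_MAX and the measured signals are illustrative placeholders.
import numpy as np

V_MAX = 5.0                                   # maximum command voltage (assumed)
K_lin = np.array([[-1.2e4, -3.5e3, -800.0]])  # placeholder linear feedback gain

def clipped_optimal_voltage(state, measured_force):
    desired_force = float(K_lin @ state)      # force requested by the linear controller
    # Apply full voltage only when the measured force must increase in
    # magnitude toward the desired force; otherwise command zero voltage.
    return V_MAX if (desired_force - measured_force) * measured_force > 0 else 0.0

state = np.array([0.01, -0.002, 0.08])        # e.g. drift, velocity, acceleration (toy values)
print(clipped_optimal_voltage(state, measured_force=-150.0))
```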

Keywords: MR damper, seismic vibration, semi-active control

Procedia PDF Downloads 271
4228 A Subband BSS Structure with Reduced Complexity and Fast Convergence

Authors: Salah Al-Din I. Badran, Samad Ahmadi, Ismail Shahin

Abstract:

A blind source separation method is proposed that uses a non-uniform filter bank and a novel normalisation. This method provides reduced computational complexity and increased convergence speed compared to the full-band algorithm. Recently, adaptive sub-band schemes have been recommended to solve two problems: reducing the computational complexity and increasing the convergence speed of the adaptive algorithm for correlated input signals. In this work, the reduction in computational complexity is achieved by using adaptive filters of lower order than the full-band adaptive filters, operating at a sampling rate lower than that of the input signal. The signals decomposed by the analysis filter bank are less correlated in each subband than the input signal at full bandwidth, which promotes better rates of convergence.

Keywords: blind source separation, computational complexity, subband, convergence speed, mixture

Procedia PDF Downloads 567
4227 Bitplanes Image Encryption/Decryption Using Edge Map (SSPCE Method) and Arnold Transform

Authors: Ali A. Ukasha

Abstract:

Data security is needed in data transmission, storage, and communication. The single-step parallel contour extraction (SSPCE) method is used to create an edge map as a key image from the gray-level/binary image. The XOR operation is performed between the key image and each bit plane of the original image in order to change the image pixel values. The Arnold transform is used to change the locations of image pixels as an image scrambling process. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D gray-level image, which can then be completely reconstructed without any distortion. It is also shown that the analyzed algorithm has very high security against attacks such as salt-and-pepper noise and JPEG compression, proving that the gray-level image can be protected with a higher security level. The presented method has an easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.

Keywords: SSPCE method, image compression, salt and peppers attacks, bitplanes decomposition, Arnold transform, lossless image encryption

Procedia PDF Downloads 481
4226 Developing a Deep Understanding of the Immune Response in Hepatitis B Virus Infected Patients Using a Knowledge Driven Approach

Authors: Hanan Begali, Shahi Dost, Annett Ziegler, Markus Cornberg, Maria-Esther Vidal, Anke R. M. Kraft

Abstract:

Chronic hepatitis B virus (HBV) infection can be treated with nucleos(t)ide analogs (NA), for example, which inhibit HBV replication. However, these have hardly any influence on the functional cure of HBV, which is defined by hepatitis B surface antigen (HBsAg) loss. NA needs to be taken life-long, which is not available to all patients worldwide. Additionally, NA-treated patients are still at risk of developing cirrhosis, liver failure, or hepatocellular carcinoma (HCC). Although each patient has the same components of the immune system, immune responses vary between patients. Therefore, a deeper understanding of the immune response against HBV in different patients is necessary to understand the parameters leading to HBV cure and to use this knowledge to optimize HBV therapies. This requires seamless integration of an enormous amount of diverse and fine-grained data on viral markers, e.g., hepatitis B core-related antigen (HBcrAg) and hepatitis B surface antigen (HBsAg). The data integration system relies on the assumption that profiling human immune systems requires the analysis of various variables (e.g., demographic data, treatments, pre-existing conditions, immune cell response, or HLA typing) rather than only one. However, the values of these variables are collected independently. They are presented in a myriad of formats, e.g., Excel files, textual descriptions, lab book notes, and images of flow cytometry dot plots. Additionally, patients can be identified differently in these analyses. This heterogeneity complicates the integration of variables, as data management techniques are needed to create a unified view in which individual formats and identifiers are transparent when profiling the human immune systems. The proposed study (HBsRE) aims at integrating heterogeneous data sets of 87 chronically HBV-infected patients, e.g., clinical data, immune cell response, and HLA typing, with knowledge encoded in biomedical ontologies and open-source databases into a knowledge-driven framework. This new technique enables us to harmonize and standardize heterogeneous datasets within the defined modeling of the data integration system, which will be evaluated in a knowledge graph (KG). KGs are data structures that represent knowledge and data as factual statements using a graph data model. Finally, the analytic data model will be applied on top of the KG in order to develop a deeper understanding of the immune profiles among various patients and to evaluate the factors playing a role in a holistic profile of patients with HBsAg loss. Additionally, our objective is to utilize this unified approach to stratify patients for new effective treatments. This study is developed in the context of the project "Transforming big data into knowledge: for deep immune profiling in vaccination, infectious diseases, and transplantation (ImProVIT)", which involves a multidisciplinary team of computer scientists, infection biologists, and immunologists.
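
To make the knowledge-graph representation concrete, the sketch below encodes a couple of patient facts as RDF triples with rdflib and runs a simple SPARQL query. The namespace, property names and values are invented placeholders, not the project's actual ontology or patient data.

```python
# Hedged sketch: representing patient facts as a small RDF knowledge graph.
# The namespace, properties and values are invented placeholders.
from rdflib import Graph, Namespace, Literal, RDF

EX = Namespace("http://example.org/hbv#")
g = Graph()

patient = EX["patient_001"]
g.add((patient, RDF.type, EX.Patient))
g.add((patient, EX.hbsagLevel, Literal(250.4)))       # IU/mL (toy value)
g.add((patient, EX.treatment, EX.NucleosideAnalog))
g.add((patient, EX.hlaType, Literal("HLA-A*02:01")))

# Query: patients on NA therapy with an HBsAg level below a threshold.
q = """
SELECT ?p ?level WHERE {
  ?p a ex:Patient ;
     ex:treatment ex:NucleosideAnalog ;
     ex:hbsagLevel ?level .
  FILTER(?level < 1000)
}"""
for row in g.query(q, initNs={"ex": EX}):
    print(row.p, row.level.toPython())
```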

Keywords: chronic hepatitis B infection, immune response, knowledge graphs, ontology

Procedia PDF Downloads 98
4225 Investigation of Soil Slopes Stability

Authors: Nima Farshidfar, Navid Daryasafar

Abstract:

In this paper, the seismic stability of reinforced soil slopes is studied using pseudo-dynamic analysis. Equilibrium equations that are applicable to every kind of failure surface are written using the Horizontal Slices Method. In the written equations, the balance of vertical and horizontal forces and moment equilibrium is fully satisfied. The failure surface is assumed to be a log-spiral, and the non-linear equilibrium equations obtained for the system are solved using the Newton-Raphson method. Earthquake effects are applied to the problem as horizontal and vertical pseudo-static coefficients. To solve this problem, a code was developed in MATLAB, and the critical failure surface is found using a genetic algorithm. Finally, the results obtained in this paper are compared, and the effects of various parameters and of using pseudo-dynamic analysis in modeling seismic forces are presented.

Keywords: soil slopes, pseudo-dynamic, genetic algorithm, optimization, limit equilibrium method, log-spiral failure surface

Procedia PDF Downloads 328
4224 Multilabel Classification with Neural Network Ensemble Method

Authors: Sezin Ekşioğlu

Abstract:

Multilabel classification is of huge importance for several applications, and it is also a challenging research topic. It is a kind of supervised learning that involves binary targets; the difference from binary classification is that multilabel problems contain more than one class, so each instance can belong to one class or to many classes. There exists a wide range of applications for multilabel prediction, such as image labeling, text categorization, and gene functionality. Even though instances are classified into many classes, they may not always be properly classified. There are many ensemble methods for classification; however, most researchers have been concerned with better multilabel methods in general, and few focus on both the efficiency of classifiers and pairwise label relationships at the same time in order to implement better multilabel classification. In this paper, we work on modified ensemble methods that benefit from k-Nearest Neighbors and a neural network structure to address these issues in a beneficial way and obtain better multilabel classification results. Publicly available datasets (yeast, emotion, scene and birds) are used to demonstrate the efficiency of the developed algorithm, and the technique is measured by accuracy, F1 score and Hamming loss metrics. Our algorithm improves on the benchmarks for each dataset across the different metrics.
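
A minimal sketch of the idea, combining a k-Nearest Neighbors model and a feed-forward network on multilabel targets and scoring with accuracy, F1 and Hamming loss, is given below using scikit-learn. The synthetic data stands in for the yeast/emotion/scene/birds datasets, and the combination rule is a simple probability average, not the exact method of the paper.

```python
# Hedged sketch: averaging kNN and MLP label probabilities for multilabel data.
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, f1_score, hamming_loss

X, Y = make_multilabel_classification(n_samples=600, n_features=20,
                                      n_classes=5, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)

knn = MultiOutputClassifier(KNeighborsClassifier(n_neighbors=5)).fit(X_tr, Y_tr)
mlp = MultiOutputClassifier(MLPClassifier(hidden_layer_sizes=(64,), max_iter=500,
                                          random_state=0)).fit(X_tr, Y_tr)

def label_probs(model, X):
    # One (n_samples, 2) array per label; keep P(label = 1) for each label.
    return np.column_stack([p[:, -1] for p in model.predict_proba(X)])

avg = (label_probs(knn, X_te) + label_probs(mlp, X_te)) / 2.0
Y_pred = (avg >= 0.5).astype(int)             # threshold the averaged probabilities

print("subset accuracy:", accuracy_score(Y_te, Y_pred))
print("micro F1:", f1_score(Y_te, Y_pred, average="micro"))
print("Hamming loss:", hamming_loss(Y_te, Y_pred))
```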

Keywords: multilabel, classification, neural network, KNN

Procedia PDF Downloads 140
4223 Site-based Internship Experiences: From Research to Implementation and Community Collaboration

Authors: Jamie Sundvall, Lisa Jennings

Abstract:

Site-based field internship learning (SBL) is an educational approach within a Master of Social Work (MSW) university field placement department that promotes a more streamlined approach to the integration of theory and evidence-based practices for social work students. The SBL model is founded on research in the field, consideration of current workforce needs, United States national trends in MSW graduate skill and knowledge deficits, educational trends among students pursuing a master's degree in social work, and current social problems that require unique problem-solving skills. This study explores the use of site-based learning in a hybrid social work program. In this setting, site-based learning pairs online education courses and social work field education to create training opportunities for social work students within their own community and cultural context. Students engage in coursework in an online setting with both synchronous and asynchronous features that facilitate development of core competencies for MSW students. Through the SBL model, students are then partnered with faculty in a virtual course room and a university-vetted site within their community. The study explores how this model of learning creates community partnerships, through which students engage in a learning loop to develop social work skills, while preparing students to address current community, social, and global issues with the engagement of technology. The goal of SBL is to more effectively equip social work students for practice according to current workforce demands, provide access to education and care to populations who have limited access, and create self-sustainable partnerships. Further, the model helps students learn to integrate evidence-based practices and helps instructors more effectively teach the integration of ethics into practice. The study found that the SBL model increases the influence and professional relevance of the social work profession and ultimately facilitates stronger approaches to integrating theory into practice. Current implementation of the practice in the United States is presented in the study. Additionally, future research conceptualization of SBL models is presented, in order to collaborate on advancing the best approaches for translating theory into practice according to the current needs of the profession and of social work students.

Keywords: collaboration, fieldwork, research, site-based learning, technology

Procedia PDF Downloads 116