Search results for: Scheduling rules selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1769


1109 Secure Resource Selection in Computational Grid Based on Quantitative Execution Trust

Authors: G. Kavitha, V. Sankaranarayanan

Abstract:

Grid computing provides a virtual framework for the controlled sharing of resources across institutional boundaries. Recently, trust has been recognised as an important factor in selecting optimal resources in a grid. We introduce a new method that provides a quantitative trust value based on past interactions and present environment characteristics. This quantitative trust value is used to select a suitable resource for a job and eliminates run-time failures arising from incompatible user-resource pairs. The proposed work acts as a tool to calculate the trust values of the various components of the grid and thereby improves the success rate of jobs submitted to resources on the grid. Access to a resource depends not only on the identity and behaviour of the resource but also on its transaction context, transaction time, connectivity bandwidth, availability, and load. The quality of a recommender is also evaluated based on the accuracy of the feedback it provides about a resource. Jobs are submitted for execution to the selected resource after its overall trust value has been determined. The overall trust value is computed with respect to both subjective and objective parameters.
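A minimal sketch of the idea of combining subjective parameters (past interactions, feedback) and objective parameters (bandwidth, availability, load) into one overall trust score used for resource selection. The specific parameters, weights, and the linear aggregation below are illustrative assumptions, not the paper's formula.

```python
# Sketch: aggregate subjective and objective trust indicators into one value
# and pick the grid resource with the highest overall trust. Weights and the
# linear combination are illustrative assumptions.

def overall_trust(direct_history, feedback, bandwidth, availability, load,
                  w_subjective=0.5, w_objective=0.5):
    """All inputs are normalized to [0, 1]; load is inverted (lower load is better)."""
    subjective = 0.6 * direct_history + 0.4 * feedback
    objective = (bandwidth + availability + (1.0 - load)) / 3.0
    return w_subjective * subjective + w_objective * objective

def select_resource(resources):
    """Pick the resource with the highest overall trust value."""
    return max(resources, key=lambda r: overall_trust(**r["metrics"]))

if __name__ == "__main__":
    resources = [
        {"name": "grid-node-A", "metrics": dict(direct_history=0.9, feedback=0.8,
                                                bandwidth=0.7, availability=0.95, load=0.4)},
        {"name": "grid-node-B", "metrics": dict(direct_history=0.6, feedback=0.9,
                                                bandwidth=0.9, availability=0.80, load=0.2)},
    ]
    print(select_resource(resources)["name"])
```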

Keywords: access control, feedback, grid computing, reputation, security, trust, trust parameter.

1108 Mining Image Features in an Automatic Two-Dimensional Shape Recognition System

Authors: R. A. Salam, M.A. Rodrigues

Abstract:

The number of features required to represent an image can be very large. Using all available features to recognize objects can suffer from the curse of dimensionality. Feature selection and extraction form the pre-processing step of image mining. The main issues in analyzing images are the effective identification of features and their extraction. The mining problem addressed here is the grouping of features for different shapes. Experiments have been conducted using the shape outline as the feature. Shape outline readings are put through normalization and a dimensionality reduction process using an eigenvector-based method to produce a new set of readings. After this pre-processing step, the data are grouped by shape. Through statistical analysis of these readings together with peak measures, a robust classification and recognition process is achieved. Tests showed that the suggested methods are able to automatically recognize objects through their shapes. Finally, experiments also demonstrate the system's invariance to rotation, translation, scale, reflection, and a small degree of distortion.
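A minimal sketch of the eigenvector-based reduction step described above: shape-outline readings are normalized and projected onto the leading eigenvectors of their covariance matrix (i.e., PCA). The dimensions, data, and number of retained components are illustrative assumptions.

```python
# Sketch: normalize outline readings and reduce their dimensionality with an
# eigenvector-based (PCA) projection. Data and sizes are invented for illustration.
import numpy as np

def reduce_outlines(outlines, n_components=8):
    X = np.asarray(outlines, dtype=float)
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)   # normalization
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)                # eigen-decomposition
    order = np.argsort(eigvals)[::-1][:n_components]      # largest eigenvalues first
    return X @ eigvecs[:, order]                          # new, lower-dimensional readings

# Example: 100 shapes, each described by 64 outline samples, reduced to 8 features.
readings = np.random.rand(100, 64)
features = reduce_outlines(readings)
print(features.shape)  # (100, 8)
```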

Keywords: Image mining, feature selection, shape recognition, peak measures.

1107 The Problem of Reconciling the Principle of Confidentiality in Foreign Investment Arbitration with the Public Interest

Authors: Bárbara Magalhães Bravo, Cláudia Figueiras

Abstract:

Economic globalization, through the liberalization of markets and capital, has boosted the economic development of nations and the need to settle disputes arising from foreign investment. Arbitration, with its inherent advantages such as swiftness, the arbitrators' specialised skills, and impartiality, is a pacifying tool for the interests at stake. While safeguarding the public interest, we face the problem of confidentiality in arbitration, and the development of compelling mechanisms concerning transparency and the guarantee and protection of the interests at stake is urgent. Through a bibliographic review, we condense the state of the art, going through the several solutions proposed and pointing out the most suitable. Through jurisprudential analysis, we point out a solution to the confidentiality/public-interest conflict. Transparency, inextricable from the public interest, requires that the arbitration process be open to all citizens. Transparency rules have been considered at UNCITRAL in an attempt to reconcile the need for publicity with the public interest, but they remain insufficient. Arbitration of foreign investment carries consequences for the citizens of the State, so mechanisms articulating the secrecy of arbitral procedures with the public interest should be adopted. Arbitration of foreign investment, being a tertium genus between international arbitration and administrative arbitration, would call for its own regulation in each State, in which the confidentiality rules and their exceptions could be identified. One should enquire where the limit of the protection of citizens' individual rights lies and where the public interest should give way to the principle of transparency.

Keywords: Arbitration, foreign investment, transparency, confidentiality, International Centre for Settlement of Investment Disputes, UNCITRAL.

1106 Pharmacology Applied Learning Program in Preclinical Years – Student Perspectives

Authors: Amudha Kadirvelu, Sunil Gurtu, Sivalal Sadasivan

Abstract:

Pharmacology curriculum plays an integral role in medical education. Learning pharmacology well enough to choose and prescribe drugs is a major challenge encountered by students. We developed pharmacology applied learning activities for first-year medical students that included realistic clinical situations with escalating complications, which required the students to analyze the situation and think critically to choose a safe drug. Tutor feedback was provided at the end of each session. An evaluation was done to assess the students' level of interest and the usefulness of the sessions in the rational selection of drugs. The majority (98%) of students agreed that the session was an extremely useful learning exercise and that similar sessions would help in the rational selection of drugs. Applied learning sessions in the early years of a medical program may promote deep learning and bridge the gap between pharmacology theory and clinical practice. Besides, they may also enhance safe prescribing skills.

Keywords: Medical education, pharmacology curriculum, applied learning, safe prescribing.

1105 Performing Diagnosis in Building with Partially Valid Heterogeneous Tests

Authors: Houda Najeh, Mahendra Pratap Singh, Stéphane Ploix, Antoine Caucheteux, Karim Chabir, Mohamed Naceur Abdelkrim

Abstract:

A building system is highly vulnerable to different kinds of faults and human misbehaviors. Energy efficiency and user comfort are directly affected by abnormalities in building operation. The available fault diagnosis tools and methodologies particularly rely on rules or pure model-based approaches, and it is assumed that a model- or rule-based test can be applied to any situation without taking the actual testing context into account. Contextual tests with a validity domain could greatly reduce the design effort for detection tests. The main objective of this paper is to take test validity into account when validating the test model, considering non-modeled events such as occupancy, weather conditions, and door and window openings, and integrating the expert's knowledge of the state of the system. The concept of heterogeneous tests is combined with test validity to generate fault diagnoses. A combination of rule-, range- and model-based tests, known as heterogeneous tests, is proposed to reduce the modeling complexity. The calculation of logical diagnoses, drawn from artificial intelligence, provides a global explanation consistent with the test results. An application example, an office setting at the Grenoble Institute of Technology, shows the efficiency of the proposed technique.
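A minimal sketch of context-dependent ("partially valid") tests: each test is applied only when its validity domain matches the current context, and the component sets of failed tests form candidate suspects. Test names, thresholds, and the context fields are illustrative assumptions, not the paper's test definitions.

```python
# Sketch: heterogeneous tests with validity domains. A test contributes a
# symptom only if its validity condition holds in the current context.

def temperature_model_test(obs):
    return abs(obs["temp_measured"] - obs["temp_predicted"]) < 1.5

def co2_range_test(obs):
    return 350 <= obs["co2_ppm"] <= 1500

TESTS = [
    # (name, components covered, validity domain over the context, test function)
    ("temp-model", {"heater", "temp_sensor"}, lambda ctx: not ctx["window_open"], temperature_model_test),
    ("co2-range",  {"co2_sensor"},            lambda ctx: ctx["occupied"],        co2_range_test),
]

def diagnose(context, observation):
    symptoms = []
    for name, components, valid_in, run in TESTS:
        if not valid_in(context):          # skip tests outside their validity domain
            continue
        if not run(observation):
            symptoms.append((name, components))
    return symptoms

ctx = {"window_open": False, "occupied": True}
obs = {"temp_measured": 24.9, "temp_predicted": 21.0, "co2_ppm": 900}
print(diagnose(ctx, obs))   # temp-model fails -> suspect heater or temp_sensor
```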

Keywords: Heterogeneous tests, validity, building system, sensor grids, sensor fault, diagnosis, fault detection and isolation.

1104 Integrating Computational Intelligence Techniques and Assessment Agents in E-Learning Environments

Authors: Konstantinos C. Giotopoulos, Christos E. Alexakos, Grigorios N. Beligiannis, Spiridon D. Likothanassis

Abstract:

In this contribution an innovative platform is presented that integrates intelligent agents and evolutionary computation techniques in legacy e-learning environments. It introduces the design and development of a scalable and interoperable integration platform supporting: I) various assessment agents for e-learning environments, II) a specific resource retrieval agent for the provision of additional information from Internet sources matching the needs and profile of the specific user, and III) a genetic algorithm designed to extract efficient information (classifying rules) based on the students' answering input data. The agents are implemented to provide intelligent assessment services based on computational intelligence techniques such as Bayesian Networks and Genetic Algorithms. The idea of using a Genetic Algorithm (GA) to fulfil this difficult task came from the fact that GAs have been widely used in applications involving the classification of unknown data. The utilization of new and emerging technologies such as web services allows the provided services to be integrated into any web-based legacy e-learning environment.

Keywords: Bayesian Networks, Computational Intelligence techniques, E-learning legacy systems, Service Oriented Integration, Intelligent Agents, Genetic Algorithms.

1103 Aircraft Selection Problem Using Decision Uncertainty Distance in Fuzzy Multiple Criteria Decision Making Analysis

Authors: C. Ardil

Abstract:

Aircraft have different capabilities and specifications according to the strategic goals and objectives required in operations. With various types on the market, each with different characteristics, it becomes difficult to select a suitable aircraft for certain operations and requirements. The entropy weighting method (EWM) is a useful, highly consistent, and reliable method for obtaining the weights of the criteria, and it is worth integrating with the decision uncertainty distance (DUD) method, which is more applicable and requires less computation than other methods. An illustrative example is presented to demonstrate the validity and usability of the proposed methodology. The ranking results match those of a distance-based approach, the technique for order preference by similarity to ideal solution (TOPSIS), which shows the robustness of the entropy-DUD hybrid method. Validity analysis shows that the proposed hybrid multiple criteria decision-making analysis (MCDMA) methodology is quantitatively stable and reliable.
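A minimal sketch of the entropy weighting method (EWM) mentioned above: criterion weights are derived from the information entropy of the normalized decision matrix. The decision matrix is invented, and the DUD ranking step is not shown here.

```python
# Sketch: entropy weighting method for criterion weights on a small
# alternatives-by-criteria matrix (values are invented for illustration).
import numpy as np

X = np.array([[9.0, 650.0, 7.5],   # alternatives x criteria (e.g., range, payload, cost index)
              [8.0, 700.0, 6.0],
              [7.0, 620.0, 8.5]])

P = X / X.sum(axis=0)                          # proportion of each alternative per criterion
k = 1.0 / np.log(X.shape[0])
E = -k * (P * np.log(P)).sum(axis=0)           # entropy of each criterion
d = 1.0 - E                                    # degree of diversification
weights = d / d.sum()                          # entropy weights, summing to 1
print(weights.round(3))
```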

Keywords: aircraft selection, decision uncertainty distance (DUD), multiple criteria decision making analysis, MCDMA, TOPSIS

1102 Centre Of Mass Selection Operator Based Meta-Heuristic For Unbounded Knapsack Problem

Authors: D. Venkatesan, K. Kannan, S. Raja Balachandar

Abstract:

In this paper a new Genetic Algorithm based on a heuristic operator and a Centre of Mass selection operator (CMGA) is designed for the unbounded knapsack problem (UKP), which is an NP-hard combinatorial optimization problem. The proposed genetic algorithm is based on a heuristic operator that utilizes problem-specific knowledge. This centre of mass operator, when combined with other genetic operators, forms an algorithm competitive with existing ones. Computational results show that the proposed algorithm is capable of obtaining high-quality solutions for standard randomly generated knapsack instances. A comparative study of CMGA with a simple GA on unbounded knapsack instances of size up to 200 shows the superiority of CMGA. Thus CMGA is an efficient tool for solving the UKP, and the algorithm is also competitive with other Genetic Algorithms.
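A compact GA sketch for the unbounded knapsack problem. The random repair, uniform crossover, and sorting-based selection below are generic illustrations; the paper's centre-of-mass selection operator and heuristic operator are not reproduced.

```python
# Sketch: simple GA for the unbounded knapsack problem (UKP). Operators are
# generic placeholders, not the paper's CMGA operators.
import random

values  = [60, 100, 120]
weights = [10, 20, 30]
capacity = 50

def random_solution():
    return [random.randint(0, capacity // w) for w in weights]

def repair(sol):
    # Drop random items until the weight constraint is satisfied.
    while sum(c * w for c, w in zip(sol, weights)) > capacity:
        i = random.choice([i for i, c in enumerate(sol) if c > 0])
        sol[i] -= 1
    return sol

def fitness(sol):
    return sum(c * v for c, v in zip(sol, values))

def crossover(a, b):
    return repair([random.choice(pair) for pair in zip(a, b)])

def mutate(sol, rate=0.2):
    for i in range(len(sol)):
        if random.random() < rate:
            sol[i] = max(0, sol[i] + random.choice([-1, 1]))
    return repair(sol)

def ga(pop_size=30, generations=200):
    pop = [repair(random_solution()) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
print(best, fitness(best))   # for this toy instance the optimum value is 300
```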

Keywords: Genetic Algorithm, Unbounded Knapsack Problem, Combinatorial Optimization, Meta-Heuristic, Center of Mass

1101 Unmanned Combat Aircraft Selection using Fuzzy Proximity Measure Method in Multiple Criteria Group Decision Making

Authors: C. Ardil

Abstract:

The decision to select an unmanned combat aircraft is complicated, since several options and conflicting criteria must be considered simultaneously. When making a multiple criteria decision, it is important to consider the selected evaluation criteria, including priceability, payloadability, stealthability, speedability, and survivability. The fundamental goal of the study is to select the best unmanned combat aircraft by taking these evaluation criteria into account. The optimal aircraft was chosen using the fuzzy proximity measure method, which enables decision-makers to express preferences as standard fuzzy set numbers during the multiple criteria decision-making process. To assess the applicability of the proposed approach, a numerical example is provided. Finally, by comparing the evaluated unmanned combat aircraft, the proposed method produced a successful application, and the best aircraft was selected.

Keywords: standard fuzzy sets (SFS), unmanned combat aircraft selection, multiple criteria decision making (MCDM), multiple criteria group decision making (MCGDM), proximity measure method (PMM)

1100 Optimized Preprocessing for Accurate and Efficient Bioassay Prediction with Machine Learning Algorithms

Authors: Jeff Clarine, Chang-Shyh Peng, Daisy Sang

Abstract:

A bioassay is the measurement of the potency of a chemical substance by its effect on living animal or plant tissue. Bioassay data and chemical structures from pharmacokinetic and drug metabolism screening are mined from, and housed in, multiple databases. Bioassay predictions are then calculated to determine further advancement. This paper proposes a four-step preprocessing of datasets for improving bioassay predictions. The first step is instance selection, in which the dataset is categorized into training, testing, and validation sets. The second step is discretization, which partitions the data with accuracy versus precision in mind. The third step is normalization, where data are rescaled between 0 and 1 for subsequent machine learning processing. The fourth step is feature selection, where key chemical properties and attributes are generated. The streamlined results are then analyzed for prediction of effectiveness by various machine learning tools including Pipeline Pilot, R, Weka, and Excel. Experiments and evaluations reveal the effectiveness of various combinations of preprocessing steps and machine learning algorithms in producing more consistent and accurate predictions.
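A sketch of the four preprocessing steps described above (instance split, discretization, 0-1 normalization, feature selection), using plain NumPy. The split ratios, bin counts, and the variance-based selection criterion are illustrative assumptions, not the paper's exact choices.

```python
# Sketch: four-step dataset preprocessing on a toy descriptor matrix.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((300, 12))                       # toy bioassay descriptor matrix

# 1. Instance selection: training / validation / test split.
idx = rng.permutation(len(X))
train, valid, test = np.split(X[idx], [int(0.7 * len(X)), int(0.85 * len(X))])

# 2. Discretization: bin each feature into 10 equal-width intervals.
bins = np.linspace(0, 1, 11)
train_binned = np.digitize(train, bins[1:-1])

# 3. Normalization: rescale every feature of the continuous data to [0, 1].
lo, hi = train.min(axis=0), train.max(axis=0)
train_norm = (train - lo) / (hi - lo + 1e-12)

# 4. Feature selection: keep the features with the highest variance.
keep = np.argsort(train_norm.var(axis=0))[::-1][:6]
train_selected = train_norm[:, keep]
print(train_selected.shape)                      # (210, 6)
```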

Keywords: Bioassay, machine learning, preprocessing, virtual screen.

1099 Dimensionality Reduction in Modal Analysis for Structural Health Monitoring

Authors: Elia Favarelli, Enrico Testi, Andrea Giorgetti

Abstract:

Autonomous structural health monitoring (SHM) of many structures and bridges has become a topic of paramount importance for maintenance purposes and safety reasons. This paper proposes a set of machine learning (ML) tools to perform automatic feature selection and detection of anomalies in a bridge from vibrational data, and compares different feature extraction schemes to increase accuracy and reduce the amount of data collected. As a case study, the Z-24 bridge is considered because of its extensive database of accelerometric data in both standard and damaged conditions. The proposed framework starts from the first four fundamental frequencies extracted through operational modal analysis (OMA) and clustering, followed by time-domain filtering (tracking). The extracted fundamental frequencies are then fed to a dimensionality reduction block implemented through two different approaches: feature selection (an intelligent multiplexer) that tries to estimate the most reliable frequencies based on the evaluation of some statistical features (i.e., entropy, variance, kurtosis), and feature extraction (an auto-associative neural network, ANN) that combines the fundamental frequencies to extract new damage-sensitive features in a low-dimensional feature space. Finally, one-class classification (OCC) algorithms perform anomaly detection, trained with standard-condition points and tested with normal and anomalous ones. In particular, principal component analysis (PCA), kernel principal component analysis (KPCA), and the auto-associative neural network (ANN) are presented and their performances are compared. It is also shown that, by evaluating the correct features, the anomaly can be detected with accuracy and an F1 score greater than 95%.
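A minimal sketch of one of the one-class pipelines discussed above: PCA fitted on standard-condition frequency features, with the reconstruction error used as the anomaly score. The frequency values, noise level, and the 99th-percentile threshold are illustrative assumptions, not the Z-24 data or the paper's tuning.

```python
# Sketch: PCA reconstruction-error anomaly detection on toy frequency features.
import numpy as np

rng = np.random.default_rng(1)
normal = rng.normal([3.9, 5.0, 9.8, 10.3], 0.02, size=(500, 4))   # four tracked frequencies
damaged = normal.copy()
damaged[:, 0] -= 0.15                                             # simulated frequency drop

mean = normal.mean(axis=0)
_, _, Vt = np.linalg.svd(normal - mean, full_matrices=False)
components = Vt[:2]                                               # keep 2 principal components

def anomaly_score(x):
    z = (x - mean) @ components.T                                  # project
    x_hat = z @ components + mean                                  # reconstruct
    return np.linalg.norm(x - x_hat, axis=1)                       # reconstruction error

threshold = np.percentile(anomaly_score(normal), 99)
print((anomaly_score(damaged) > threshold).mean())                 # detection rate on damaged data
```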

Keywords: Anomaly detection, dimensionality reduction, frequencies selection, modal analysis, neural network, structural health monitoring, vibration measurement.

1098 Slovenian Text-to-Speech Synthesis for Speech User Interfaces

Authors: Jerneja Žganec Gros, Aleš Mihelič, Nikola Pavešić, Mario Žganec, Stanislav Gruden

Abstract:

The paper presents the design concept of a unit-selection text-to-speech synthesis system for the Slovenian language. Due to its modular and upgradable architecture, the system can be used in a variety of speech user interface applications, ranging from server carrier-grade voice portal applications and desktop user interfaces to specialized embedded devices. Since memory and processing power requirements are important factors for a possible implementation in embedded devices, the lexica and speech corpora need to be reduced. We describe a simple and efficient implementation of a greedy subset selection algorithm that extracts a compact subset of high-coverage text sentences. An experiment on a reference text corpus showed that the subset selection algorithm produced a compact sentence subset with little redundancy. The adequacy of the spoken output was evaluated by several subjective tests as recommended by the International Telecommunication Union (ITU).
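A sketch of greedy subset selection for corpus design: repeatedly pick the sentence that adds the most not-yet-covered units. Here character bigrams stand in for the phonetic units; the unit definition, stopping rule, and toy corpus are assumptions.

```python
# Sketch: greedy selection of a compact, high-coverage sentence subset.
def units(sentence):
    s = sentence.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}   # character bigrams as stand-in units

def greedy_subset(sentences, target_coverage=0.95):
    all_units = set().union(*(units(s) for s in sentences))
    covered, chosen, remaining = set(), [], list(sentences)
    while remaining and len(covered) < target_coverage * len(all_units):
        best = max(remaining, key=lambda s: len(units(s) - covered))
        if not units(best) - covered:
            break                       # no remaining sentence adds new units
        chosen.append(best)
        covered |= units(best)
        remaining.remove(best)
    return chosen

corpus = ["speech synthesis for embedded devices",
          "a compact subset with high coverage",
          "voice portals need natural prosody",
          "the corpus should stay small"]
print(greedy_subset(corpus))
```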

Keywords: text-to-speech synthesis, prosody modeling, speech user interface.

1097 Per Flow Packet Scheduling Scheme to Improve the End-to-End Fairness in Mobile Ad Hoc Wireless Network

Authors: K. Sasikala, R. S. D Wahidabanu

Abstract:

Various fairness models and criteria proposed by academia and industry for wired networks can be applied to ad hoc wireless networks. End-to-end fairness in an ad hoc wireless network is a challenging task compared to wired networks and has not been addressed effectively. Most of the traffic in an ad hoc network consists of transport layer flows, and thus the fairness of transport layer flows has attracted the interest of researchers. Factors such as the MAC protocol, the routing protocol, the length of a route, buffer size, the active queue management algorithm, and the congestion control algorithms affect the fairness of transport layer flows. In this paper, we have considered the rate of data transmission, queue management, and the packet scheduling technique. The ad hoc network is dynamic in nature, and parameters such as the transmission of control packets, the multihop nature of packet forwarding, changes in source and destination nodes, and changes in the routing path influence the throughput and fairness among concurrent flows. In addition, the interaction between the protocols in the data link and transport layers also plays a role in determining the rate of data transmission. We maintain a queue for each flow, and the delay information of each flow is maintained accordingly. The pre-processing of a flow is done only up to the network layer: the source and destination address information is used for separating flows, and transport layer information is not used. This minimizes the delay in the network. Each flow is attached to a timer and is updated dynamically. A Finite State Machine (FSM) is proposed for the queue and transmission control mechanism. The performance of the proposed approach is evaluated in the ns-2 simulation environment, with throughput and fairness under mobility for different flows used as performance metrics. We have compared the performance of the proposed approach with ATP and MC-MLAS, and the performance of the proposed approach is encouraging.
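A toy sketch of per-flow queuing keyed only on network-layer information (source and destination addresses), served round-robin. The timer handling and the FSM states described in the abstract are omitted; this only illustrates flow separation without transport-layer fields.

```python
# Sketch: per-flow queues keyed on (src, dst), served round-robin.
from collections import OrderedDict, deque

class PerFlowScheduler:
    def __init__(self):
        self.queues = OrderedDict()           # (src, dst) -> deque of packets

    def enqueue(self, packet):
        key = (packet["src"], packet["dst"])  # no transport-layer fields used
        self.queues.setdefault(key, deque()).append(packet)

    def dequeue(self):
        for key in list(self.queues):         # serve the oldest active flow first
            q = self.queues.pop(key)
            pkt = q.popleft()
            if q:
                self.queues[key] = q          # re-append the flow at the tail (round-robin)
            return pkt
        return None

sched = PerFlowScheduler()
sched.enqueue({"src": "10.0.0.1", "dst": "10.0.0.9", "data": "a"})
sched.enqueue({"src": "10.0.0.2", "dst": "10.0.0.9", "data": "b"})
sched.enqueue({"src": "10.0.0.1", "dst": "10.0.0.9", "data": "c"})
print([sched.dequeue()["data"] for _ in range(3)])   # ['a', 'b', 'c'] -> the two flows alternate
```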

Keywords: ATP, End-to-End fairness, FSM, MAC, QoS.

1096 Improving Fake News Detection Using K-means and Support Vector Machine Approaches

Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy

Abstract:

Fake news and false information are big challenges for all types of media, especially social media. There is a lot of false information, fake likes, views, and duplicated accounts, as big social networks such as Facebook and Twitter have admitted. Much of the information appearing on social media is doubtful and in some cases misleading. It needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to obtain a better result in detecting false information with less computation time and complexity, the dimensions need to be reduced. One of the best techniques for reducing data size is the feature selection method. The aim of this technique is to choose a feature subset from the original set to improve classification performance. In this paper, a feature selection method is proposed with the integration of K-means clustering and Support Vector Machine (SVM) approaches, which works in four steps. First, the similarities between all features are calculated. Then, features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed a better classification of false information for our work. The detection performance was improved in two aspects: on the one hand, the detection runtime decreased, and on the other hand, the classification accuracy increased because of the elimination of redundant features and the reduction of dataset dimensions.
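A sketch of the four-step idea on toy data: cluster features by similarity, keep one representative per cluster, then classify with an SVM on the reduced feature set. The "representative = feature closest to the cluster centre" rule and the toy labels are assumptions, not the paper's exact procedure.

```python
# Sketch: K-means-based feature selection followed by SVM classification.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((400, 60))                        # toy article feature matrix
y = (X[:, :5].sum(axis=1) > 2.5).astype(int)     # toy "fake / real" labels

# Steps 1-2: treat each feature as a vector over samples and cluster them.
feature_vectors = X.T
km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(feature_vectors)

# Step 3: final feature set = the feature nearest to each cluster centre.
selected = []
for k, centre in enumerate(km.cluster_centers_):
    members = np.where(km.labels_ == k)[0]
    dists = np.linalg.norm(feature_vectors[members] - centre, axis=1)
    selected.append(int(members[np.argmin(dists)]))

# Step 4: classify with an SVM on the reduced feature subset.
X_train, X_test, y_train, y_test = train_test_split(X[:, selected], y, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print(clf.score(X_test, y_test))
```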

Keywords: Fake news detection, feature selection, support vector machine, K-means clustering, machine learning, social media.

1095 Hierarchical PSO-Adaboost Based Classifiers for Fast and Robust Face Detection

Authors: Hong Pan, Yaping Zhu, Liang Zheng Xia

Abstract:

We propose a fast and robust hierarchical face detection system which finds and localizes face images with a cascade of classifiers. Three modules contribute to the efficiency of our detector. First, heterogeneous feature descriptors are exploited to enrich feature types and feature numbers for face representation. Second, a PSO-Adaboost algorithm is proposed to efficiently select discriminative features from a large pool of available features and reinforce them into the final ensemble classifier. Compared with the standard exhaustive Adaboost for feature selection, the new PSO-Adaboost algorithm reduces the training time by up to 20 times. Finally, a three-stage hierarchical classifier framework is developed for rapid background removal. In particular, candidate face regions are detected more quickly by using a large window in the first stage. Nonlinear SVM classifiers are used instead of decision stump functions in the last stage to remove those remaining complex non-face patterns that cannot be rejected in the previous two stages. Experimental results show our detector achieves superior performance on the CMU+MIT frontal face dataset.

Keywords: Adaboost, Face detection, Feature selection, PSO

1094 Power System Damping Using Hierarchical Fuzzy Multi- Input PSS and Communication Lines Active Power Deviations Input and SVC

Authors: Mohammad Hasan Raouf, Ahmad Rouhani, Mohammad Abedini, Ebrahim Rasooli Anarmarzi

Abstract:

In this paper the application of a hierarchical fuzzy system (HFS) based on a multi-input PSS (MPSS) and an SVC in a multi-machine environment is studied. The effect of the communication lines' active power deviation signal between two regions (ΔPTie-line), as one of the inputs of the hierarchical fuzzy multi-input PSS and SVC (HFMPSS & SVC), on the increase of low-frequency oscillation damping is also examined. In the MPSS, to achieve better efficiency, an auxiliary signal of reactive power deviation (ΔQ) is added to the ΔP + Δω input type PSS. In a classic fuzzy system the number of rules grows exponentially with the number of variables; to reduce the number of rules, the HFS consists of a number of low-dimensional fuzzy systems in a hierarchical structure. A phasor model of the SVC is described and used in this paper. The performances of the MPSS and the ΔPTie-line based HFMPSS, as well as the proposed method, in damping the inter-area mode of oscillation are examined in response to disturbances. The efficiency of the proposed model is examined by simulating a four-machine power system. Results show that the proposed method performs satisfactorily within the whole range of disturbances and reduces the cost of the system.

Keywords: Communication lines active power variance signal, Hierarchical fuzzy system (HFS), Multi-input power system stabilizer (MPSS), Static VAR compensator (SVC).

1093 Visual Attention Analysis on Mutated Brand Name using Eye-Tracking: A Case Study

Authors: Anirban Chowdhury, Sougata Karmakar, Swathi Matta Reddy, Sanjog J., Subrata Ghosh, Debkumar Chakrabarti

Abstract:

Brand name plays a vital role in the in-shop buying behavior of consumers, and mutated brand names may affect the sales of leading branded products. In the Indian market, there are many products with mutated brand names which are either orthographically or phonologically similar. Due to the presence of such products, Indian consumers very often become confused when buying regularly used items. The authors of the present paper have attempted to demonstrate a relationship between reduced attention and false recognition of mutated brand names during a product selection process. To achieve this goal, a visual attention study was conducted on 15 male college students using an eye-tracker against a mutated brand name, and errors in recognition were noted using a questionnaire. Statistical analysis of the acquired data revealed that there was more false recognition of the mutated brand name when less attention was paid during selection of the favorite product. Moreover, it was perceived that eye tracking is an effective tool for analyzing false recognition of brand name mutation.

Keywords: Brand Name Mutation, Consumer Behavior, Visual Attention, Orthography

1092 Transformation of Course Timetabling Problem to RCPSP

Authors: M. Ahmad, M. Gourgand, C. Caux

Abstract:

The Resource-Constrained Project Scheduling Problem (RCPSP) is concerned with single-item or small-batch production where limited resources have to be allocated to dependent activities over time. Over the past few decades, a lot of work has been done on optimal solution procedures for this basic problem type and its extensions. Brucker and Knust [1] discuss how timetabling problems can be modeled as an RCPSP, using high school timetabling and university course timetabling as examples. We have formulated two mathematical formulations of the course timetabling problem in a new way, as prototypes of the single-mode RCPSP. Our focus is to show how the course timetabling problem can be transformed into an RCPSP. We solve this transformed model with a genetic algorithm.
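A toy sketch of the kind of mapping involved: each lecture becomes a unit-duration activity, while teachers, room types, and student groups become renewable resources of capacity one so they cannot be double-booked. The field names, the capacity-one assumption, and the empty precedence list are illustrative; the paper's two formulations are not reproduced.

```python
# Sketch: mapping course-timetabling entities onto single-mode RCPSP terms.
lectures = [
    {"id": "math-L1", "teacher": "T1", "room_type": "lecture_hall", "group": "G1"},
    {"id": "phys-L1", "teacher": "T2", "room_type": "lab",          "group": "G1"},
    {"id": "math-L2", "teacher": "T1", "room_type": "lecture_hall", "group": "G2"},
]

def to_rcpsp(lectures):
    resources = {}                              # renewable resources and their capacities
    activities = []
    for lec in lectures:
        demand = {f"teacher:{lec['teacher']}": 1,
                  f"room:{lec['room_type']}": 1,
                  f"group:{lec['group']}": 1}
        for r in demand:
            resources.setdefault(r, 1)          # capacity 1: no double-booking
        activities.append({"id": lec["id"], "duration": 1, "demand": demand})
    return {"activities": activities, "resources": resources, "precedences": []}

print(to_rcpsp(lectures)["resources"])
```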

Keywords: Course Timetabling, Integer programming, Combinatorial optimizations

1091 Thread Lift: Classification, Technique, and How to Approach to the Patient

Authors: Panprapa Yongtrakul, Punyaphat Sirithanabadeekul, Pakjira Siriphan

Abstract:

Background: The thread lift technique has become popular because it is less invasive, requires a shorter operation and less downtime, and results in fewer postoperative complications. The advantage of the technique is that the thread can be inserted under the skin without the need for long incisions. Currently, there are many thread lift techniques that use specific types of thread on specific areas, such as the mid-face, lower face, or neck area. Objective: To review the thread lift technique for specific areas according to the type of thread and patient selection, and how to match the most appropriate technique to the patient. Materials and Methods: A literature review was conducted by searching PubMed and MEDLINE, then compiled and summarized. Result: We have divided our protocols into two sections: protocols for short suture techniques, and protocols for long suture techniques. We also created 3D pictures for each technique to enhance understanding and application in a clinical setting. Conclusion: There are advantages and disadvantages to both short suture and long suture techniques. The best outcome for each patient depends on appropriate patient selection and determining the most suitable technique for the defect and the area of patient concern.

Keywords: Thread lift, thread lift method, thread lift technique, thread lift procedure, threading.

1090 Operating System Based Virtualization Models in Cloud Computing

Authors: Dev Ras Pandey, Bharat Mishra, S. K. Tripathi

Abstract:

Cloud computing is ready to transform the structure of businesses and learning by supplying real-time applications and providing immediate help for small to medium-sized businesses. The ability to run a hypervisor inside a virtual machine is an important feature of virtualization; it is called nested virtualization. In today's growing field of information technology, many virtualization models are available that provide a convenient approach to implementation, but deciding on a single model is difficult. This paper explains the applications of operating system based virtualization in cloud computing with an appropriate/suitable model, together with their different specifications and users' requirements. In the present paper, the most popular models are selected, and the selection was based on container- and hypervisor-based virtualization. The selected models were compared against a wide range of user requirements, such as number of CPUs, memory size, nested virtualization support, live migration, and commercial support, and we identified the most suitable virtualization model.

Keywords: Virtualization, OS based virtualization, container and hypervisor based virtualization.

1089 A Multiple-State Based Power Control for Multi-Radio Multi-Channel Wireless Mesh Networks

Authors: T. O. Olwal, K. Djouani, B. J. van Wyk, Y. Hamam, P. Siarry, N. Ntlatlapa

Abstract:

Multi-Radio Multi-Channel (MRMC) systems are key to power control problems in wireless mesh networks (WMNs). In this paper, we present asynchronous multiple-state based power control for MRMC WMNs. First, the WMN is represented as a set of disjoint Unified Channel Graphs (UCGs). Second, each network interface card (NIC), or radio, assigned to a unique UCG adjusts its transmission power using predicted multiple interaction state variables (IVs) across UCGs. Depending on the size of the queue loads and the intra- and inter-channel states, each NIC optimizes transmission power locally and asynchronously. A new power selection MRMC unification protocol (PMMUP) is proposed that coordinates interactions among radios. The efficacy of the proposed method is investigated through simulations.

Keywords: Asynchronous convergence, Multi-Radio Multi-Channel (MRMC), Power Selection Multi-Radio Multi-Channel Unification Protocol (PMMUP), Wireless Mesh Networks (WMNs)

1088 A Study of Cooperative Co-evolutionary Genetic Algorithm for Solving Flexible Job Shop Scheduling Problem

Authors: Lee Yih Rou, Hishammuddin Asmuni

Abstract:

The Flexible Job Shop Problem (FJSP) is an extension of the classical Job Shop Problem (JSP). The FJSP extends the routing flexibility of the JSP, i.e., assigning a machine to an operation, which makes it more difficult than the JSP. In this study, a Cooperative Co-evolutionary Genetic Algorithm (CCGA) is presented to solve the FJSP. The makespan (time needed to complete all jobs) is used as the performance evaluation for the CCGA. In order to test the performance and efficiency of our CCGA, benchmark problems are solved. Computational results show that the proposed CCGA is comparable with other approaches.
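A minimal sketch of the fitness evaluation underlying such an approach: decode a candidate solution (a machine assignment plus an operation sequence) into a schedule and compute its makespan. The problem data, chromosome encoding, and decoding rule are simplified illustrations, not the paper's CCGA representation.

```python
# Sketch: decode an FJSP candidate (machine assignment + operation sequence)
# and compute the makespan used as the fitness value.
jobs = {                       # job -> list of operations; each op: {machine: processing time}
    "J1": [{"M1": 3, "M2": 5}, {"M2": 4}],
    "J2": [{"M1": 2, "M3": 3}, {"M1": 4, "M3": 2}],
}

def makespan(assignment, sequence):
    machine_ready = {}
    job_ready = {j: 0 for j in jobs}
    op_index = {j: 0 for j in jobs}
    for job in sequence:                        # each occurrence schedules the job's next operation
        i = op_index[job]
        machine = assignment[(job, i)]
        duration = jobs[job][i][machine]
        start = max(job_ready[job], machine_ready.get(machine, 0))
        finish = start + duration
        job_ready[job] = machine_ready[machine] = finish
        op_index[job] += 1
    return max(job_ready.values())

assignment = {("J1", 0): "M1", ("J1", 1): "M2", ("J2", 0): "M3", ("J2", 1): "M1"}
sequence = ["J1", "J2", "J2", "J1"]
print(makespan(assignment, sequence))           # 7 for this toy instance
```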

Keywords: Co-evolution, Genetic Algorithm (GA), Flexible Job Shop Problem (FJSP)

1087 Comparison of Different Types of Sources of Traffic Using SFQ Scheduling Discipline

Authors: Alejandro Gomez Suarez, H. Srikanth Kamath

Abstract:

In this paper, the SFQ (Start-time Fair Queuing) algorithm is analyzed when applied in computer networks to determine how traffic behaves when different data sources are managed by the scheduler. Using the NS2 software, computer networks were simulated to obtain graphs showing the performance of the scheduler. Different traffic sources were introduced in the scripts, trying to establish a realistic scenario. The results show that, depending on the data source, the traffic can be affected at different levels: when a Constant Bit Rate source is applied, the scheduler ensures a constant level of data sent and received, but in real life it is impossible to guarantee a level that withstands changes in workload.
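A simplified Start-time Fair Queuing sketch: each packet gets a start tag S = max(virtual time, previous finish tag of its flow) and a finish tag F = S + length / weight; packets are served in increasing start-tag order, and the virtual time follows the start tag of the packet in service. Flow names, weights, and packet sizes below are illustrative.

```python
# Sketch: simplified SFQ with two flows of different weights.
import heapq

class SFQ:
    def __init__(self, weights):
        self.weights = weights
        self.finish = {f: 0.0 for f in weights}   # last finish tag per flow
        self.vtime = 0.0
        self.heap = []                            # (start_tag, seq, flow, length)
        self.seq = 0

    def enqueue(self, flow, length):
        start = max(self.vtime, self.finish[flow])
        self.finish[flow] = start + length / self.weights[flow]
        heapq.heappush(self.heap, (start, self.seq, flow, length))
        self.seq += 1

    def dequeue(self):
        start, _, flow, length = heapq.heappop(self.heap)
        self.vtime = start                        # virtual time tracks the served start tag
        return flow, length

q = SFQ(weights={"cbr": 2.0, "ftp": 1.0})
for _ in range(3):
    q.enqueue("cbr", 500)
    q.enqueue("ftp", 1000)
print([q.dequeue()[0] for _ in range(6)])         # the higher-weight flow is served more often early on
```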

Keywords: CBQ, CBR, NAM, NS2.

1086 Achieving Environmentally Sustainable Supply Chain in Textile and Apparel Industries

Authors: Faisal Bin Alam

Abstract:

Most manufacturing entities leave a negative footprint on nature that demands due attention. Textile industries have one of the longest supply chains and are responsible for a significant environmental impact on our planet. Issues of environmental safety, scarcity of energy and resources, and demand for eco-friendly products have driven research to search for safe and suitable alternatives in apparel processing. Consumer awareness, increased pressure from fashion brands, and actions from local legislative authorities have somewhat been able to improve practices. The objective of this paper is to reveal the best selection of raw materials and methods of production, taking environmental sustainability into account. The methodology used in this study is exploratory in nature, based on personal experience, field visits to factories in Bangladesh, and secondary sources. The findings are limited to exploring better alternatives to the conventional operations of ready-made garment manufacturing, from fibre selection to final product delivery, thereby showing some ways of achieving a greener environment in the supply chain of a clothing industry.

Keywords: Textile and apparel, environment, sustainability, supply chain, production, clothing.

1085 Ontology-based Concept Weighting for Text Documents

Authors: Hmway Hmway Tar, Thi Thi Soe Nyaunt

Abstract:

Document clustering has become an essential technology with the popularity of the Internet, which means that fast and high-quality document clustering techniques play a core role. Text clustering, or simply clustering, is about discovering semantically related groups in an unstructured collection of documents. Clustering has been popular for a long time because it provides unique ways of digesting and generalizing large amounts of information. One of the issues in clustering is extracting the proper features (concepts) of a problem domain. Existing clustering technology mainly focuses on term weight calculation. To achieve more accurate document clustering, more informative features, including concept weights, are important. Feature selection is important for the clustering process because irrelevant or redundant features may misguide the clustering results. To counteract this issue, the proposed system presents concept weights for a text clustering system developed on the basis of a k-means algorithm in accordance with the principles of ontology, so that the important words of a cluster can be identified by their weight values. To a certain extent, this has resolved the semantic problem in specific areas.
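A sketch of boosting term weights with ontology-derived concept weights before k-means clustering. The tiny "ontology" dictionary, boost factors, and toy documents are invented for illustration; the paper's ontology and weighting scheme are not reproduced.

```python
# Sketch: concept-weighted term features followed by k-means document clustering.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = ["gene expression in tumor cells", "protein folding and gene regulation",
        "stock market price prediction", "market volatility and price models"]
concept_weights = {"gene": 2.0, "protein": 2.0, "market": 2.0, "price": 1.5}  # toy ontology boost

vec = TfidfVectorizer()
X = vec.fit_transform(docs).toarray()
boost = np.array([concept_weights.get(t, 1.0) for t in vec.get_feature_names_out()])
X_weighted = X * boost                               # concept-weighted features

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_weighted)
print(labels)                                        # biology vs. finance documents
```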

Keywords: Clustering, Concept Weight, Document clustering, Feature Selection, Ontology

1084 Modeling Uncertainty in Multiple Criteria Decision Making Using the Technique for Order Preference by Similarity to Ideal Solution for the Selection of Stealth Combat Aircraft

Authors: C. Ardil

Abstract:

Uncertainty set theory is a generalization of fuzzy set theory and intuitionistic fuzzy set theory. It serves as an effective tool for dealing with inconsistent, imprecise, and vague information. The technique for order preference by similarity to ideal solution (TOPSIS) is a multiple-attribute method used to identify solutions from a finite set of alternatives. It simultaneously minimizes the distance from an ideal point and maximizes the distance from a nadir point. In this paper, an extension of the TOPSIS method for multiple attribute group decision-making (MAGDM) based on uncertainty sets is presented. In uncertainty decision analysis, decision-makers express information about attribute values and weights using uncertainty numbers to select the best stealth combat aircraft.
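A minimal crisp TOPSIS sketch (normalize, weight, measure distances to the ideal and nadir points, rank by relative closeness). The uncertainty-set extension described in the abstract is not reproduced, and the decision matrix and weights are invented.

```python
# Sketch: classical TOPSIS on a small benefit-criteria decision matrix.
import numpy as np

X = np.array([[8.0, 7.0, 9.0],       # alternatives x criteria (all benefit criteria)
              [7.5, 8.5, 8.0],
              [9.0, 6.5, 7.5]])
w = np.array([0.4, 0.3, 0.3])

R = X / np.linalg.norm(X, axis=0)     # vector normalization
V = R * w                             # weighted normalized matrix
ideal, nadir = V.max(axis=0), V.min(axis=0)
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - nadir, axis=1)
closeness = d_minus / (d_plus + d_minus)
print(np.argsort(closeness)[::-1])    # ranking of alternatives, best first
```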

Keywords: Uncertainty set, stealth combat aircraft selection, multiple criteria decision-making analysis, MCDM, uncertainty decision analysis, TOPSIS

1083 Modeling and Optimization of Aggregate Production Planning - A Genetic Algorithm Approach

Authors: B. Fahimnia, L.H.S. Luong, R. M. Marian

Abstract:

The Aggregate Production Plan (APP) is a schedule of an organization's overall operations over a planning horizon to satisfy demand while minimizing costs. It is the baseline for any further planning and for formulating the master production schedule, resources, capacity, and raw material planning. This paper presents a methodology to model the aggregate production planning problem, which is combinatorial in nature, and to optimize it with Genetic Algorithms. This is done considering a multitude of constraints of contradictory nature and an optimization criterion of overall cost, made up of the costs of production, work force, inventory, and subcontracting. A case study of substantial size, used to develop the model, is presented, along with the genetic operators.
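A sketch of the kind of cost function such a GA would minimize: production, subcontracting, inventory holding, and workforce-change costs summed over the horizon. The cost coefficients, demand profile, and the omission of backorder penalties and capacity constraints are illustrative assumptions, not the paper's model.

```python
# Sketch: evaluate the overall cost of one candidate aggregate plan; a GA would
# search over the production, subcontracting, and workforce vectors to minimize it.
demand = [900, 1100, 1000, 1200]

def plan_cost(production, subcontract, workforce,
              c_prod=10, c_sub=14, c_hold=2, c_hire=300, c_fire=400,
              init_wf=20, init_inv=0):
    cost, inventory, prev_wf = 0.0, init_inv, init_wf
    for t, d in enumerate(demand):
        inventory += production[t] + subcontract[t] - d
        delta = workforce[t] - prev_wf
        cost += (c_prod * production[t] + c_sub * subcontract[t]
                 + c_hold * max(inventory, 0)
                 + (c_hire * delta if delta > 0 else -c_fire * delta))
        prev_wf = workforce[t]
    return cost

print(plan_cost(production=[1000, 1000, 1000, 1000],
                subcontract=[0, 100, 0, 200],
                workforce=[20, 20, 22, 22]))
```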

Keywords: Aggregate Production Planning, Costs, and Optimization.

1082 Dimensionality Reduction of PSSM Matrix and its Influence on Secondary Structure and Relative Solvent Accessibility Predictions

Authors: Rafał Adamczak

Abstract:

State-of-the-art methods for secondary structure (Porter, Psi-PRED, SAM-T99sec, Sable) and solvent accessibility (Sable, ACCpro) predictions use evolutionary profiles represented by the position specific scoring matrix (PSSM). It has been demonstrated that evolutionary profiles are the most important features in the feature space for these predictions. Unfortunately, applying the PSSM matrix leads to high-dimensional feature spaces that may create problems with parameter optimization and generalization. Several recently published studies suggested that applying feature extraction to the PSSM matrix may result in improvements in secondary structure predictions. However, none of the top performing methods considered here utilizes dimensionality reduction to improve generalization. In the present study, we used simple and fast methods for feature selection (t-statistics, information gain) that allow us to decrease the dimensionality of the PSSM matrix by 75% and improve generalization in the case of secondary structure prediction compared to the Sable server.
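A sketch of simple filter-style feature ranking with a two-sample t-statistic, one of the fast selection criteria mentioned above. The synthetic "PSSM-derived" features, labels, and the 75% reduction target used here are stand-ins for illustration.

```python
# Sketch: rank features by a two-sample t-statistic and keep the top 25%.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 400))                      # toy PSSM-derived feature windows
y = rng.integers(0, 2, size=200)                     # toy two-class labels (e.g., helix vs. not)
X[y == 1, :40] += 0.8                                # make the first 40 features informative

def t_scores(X, y):
    a, b = X[y == 0], X[y == 1]
    pooled = np.sqrt(a.var(axis=0) / len(a) + b.var(axis=0) / len(b))
    return np.abs(a.mean(axis=0) - b.mean(axis=0)) / (pooled + 1e-12)

keep = np.argsort(t_scores(X, y))[::-1][: X.shape[1] // 4]   # keep the top 25% of features
print(sorted(keep[:10]))                                     # mostly indices below 40
```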

Keywords: Secondary structure prediction, feature selection, position specific scoring matrix.

1081 Performance Analysis of MATLAB Solvers in the Case of a Quadratic Programming Generation Scheduling Optimization Problem

Authors: Dávid Csercsik, Péter Kádár

Abstract:

In the case of the proposed method, the problem is parallelized by considering multiple possible mode of operation profiles, which determine the range in which the generators operate in each period. For each of these profiles, the optimization is carried out independently, and the best resulting dispatch is chosen. For each such profile, the resulting problem is a quadratic programming (QP) problem with a potentially negative definite Q quadratic term, and constraints depending on the actual operation profile. In this paper we analyze the performance of available MATLAB optimization methods and solvers for the corresponding QP.

Keywords: Economic dispatch, optimization, quadratic programming, MATLAB.

1080 Ec-A: A Task Allocation Algorithm for Energy Minimization in Multiprocessor Systems

Authors: Anju S. Pillai, T.B. Isha

Abstract:

With the need for increased processing capacity with lower energy consumption, power-aware multiprocessor systems have gained more attention in recent years. One of the additional challenges to be solved in a multiprocessor system, compared to a uniprocessor system, is job allocation. This paper presents a novel task-dependent job allocation algorithm, Energy centric Allocation (Ec-A), with Rate Monotonic (RM) scheduling to minimize energy consumption in a multiprocessor system. A simulation analysis is carried out to verify the performance increase, with a reduction in energy consumption and in the required number of processors in the system.
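A sketch of first-fit task allocation under the rate-monotonic utilization bound n(2^(1/n) - 1), the kind of schedulability check involved when placing periodic tasks on processors. The Ec-A energy-centric criterion itself is not reproduced; the task set and the first-fit policy are illustrative assumptions.

```python
# Sketch: first-fit allocation of periodic tasks to processors, accepting a task
# on a processor only if the RM utilization bound still holds.
def rm_bound(n):
    return n * (2 ** (1.0 / n) - 1)

def fits(processor_tasks, task):
    tasks = processor_tasks + [task]
    utilization = sum(c / p for c, p in tasks)      # task = (execution time, period)
    return utilization <= rm_bound(len(tasks))

def first_fit_allocate(tasks):
    processors = []
    for task in tasks:
        for proc in processors:
            if fits(proc, task):
                proc.append(task)
                break
        else:
            processors.append([task])               # open a new processor
    return processors

tasks = [(1, 4), (2, 6), (1, 8), (3, 10), (2, 12)]
print(first_fit_allocate(tasks))                    # two processors suffice for this toy set
```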

Keywords: Energy consumption, Job allocation, Multiprocessor systems, Task dependent.
