Search results for: Discontinuous cost function
954 A Two Level Load Balancing Approach for Cloud Environment
Authors: Anurag Jain, Rajneesh Kumar
Abstract:
Cloud computing is the outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation and task migration. Various parameters used to analyze the performance of a load balancing approach are response time, cost, data processing time and throughput. This paper demonstrates a two-level load balancing approach that combines the join idle queue and join shortest queue approaches. The authors have used the Cloud Analyst simulator to test the proposed two-level load balancer. The results are analyzed and compared with existing algorithms; as observed, the proposed work is one step ahead of existing techniques.
Keywords: Cloud Analyst, Cloud Computing, Join Idle Queue, Join Shortest Queue, Load balancing, Task Scheduling.
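As an illustration of the combined policy described in this abstract, the following Python sketch (not the authors' implementation; all names are placeholders) dispatches each task to an idle server when one is available and otherwise to the server with the shortest queue:

class TwoLevelBalancer:
    """Illustrative two-level dispatcher: level 1 is Join-Idle-Queue, level 2
    falls back to Join-Shortest-Queue. Bookkeeping is deliberately simplified;
    this is a sketch, not the authors' Cloud Analyst policy."""

    def __init__(self, n_servers):
        self.queues = [0] * n_servers       # outstanding tasks per server
        self.idle = set(range(n_servers))   # servers that reported themselves idle

    def dispatch(self):
        if self.idle:                                   # level 1: Join-Idle-Queue
            server = self.idle.pop()
        else:                                           # level 2: Join-Shortest-Queue
            server = min(range(len(self.queues)), key=self.queues.__getitem__)
        self.queues[server] += 1
        return server

    def complete(self, server):
        self.queues[server] -= 1
        if self.queues[server] == 0:                    # server reports itself idle again
            self.idle.add(server)

# Example use: balancer = TwoLevelBalancer(4); target = balancer.dispatch()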
953 CompPSA: A Component-Based Pairwise RNA Secondary Structure Alignment Algorithm
Authors: Ghada Badr, Arwa Alturki
Abstract:
The biological function of an RNA molecule depends on its structure. The objective of alignment is to find the homology between two or more RNA secondary structures. Knowing the common functionalities between two RNA structures allows a better understanding and a discovery of other relationships between them. Besides, identifying non-coding RNAs (those that are not translated into a protein) is a popular application in which RNA structural alignment is the first step. A few methods for RNA structure-to-structure alignment have been developed. Most of these methods are partial structure-to-structure, sequence-to-structure, or structure-to-sequence alignments. Less attention is given in the literature to the use of efficient RNA structure representations, and structure-to-structure alignment methods are lacking. In this paper, we introduce an O(N²) Component-based Pairwise RNA Structure Alignment (CompPSA) algorithm, where structures are given in a component-based representation and where N is the maximum number of components in the two structures. The proposed algorithm compares the two RNA secondary structures based on their weighted component features rather than on their base-pair details. Extensive experiments are conducted on different real and simulated datasets, illustrating the efficiency of the CompPSA algorithm compared to other approaches. The CompPSA algorithm shows an accurate similarity measure between components. The algorithm gives the user the flexibility to align the two RNA structures based on their weighted features (position, full length, and/or stem length). Moreover, the algorithm proves scalability and efficiency in time and memory performance.
Keywords: Alignment, RNA secondary structure, pairwise, component-based, data mining.
952 Determination of Poisson’s Ratio and Elastic Modulus of Compression Textile Materials
Authors: Chongyang Ye, Rong Liu
Abstract:
Compression textiles such as compression stockings (CSs) have been extensively applied for the prevention and treatment of chronic venous insufficiency of the lower extremities. The involvement of multiple mechanical factors, such as interface pressure, frictional force, and elastic materials, makes the interactions between the lower limb and CSs complex. Determination of the Poisson’s ratio and elastic moduli of CS materials is critical for constructing finite element (FE) models to numerically simulate the complex interactive system of CS and lower limb. In this study, a mixed approach, including an analytic model based on the orthotropic Hooke’s law and an experimental study (uniaxial tension testing and pure shear testing), has been proposed to determine the Young’s modulus, Poisson’s ratio, and shear modulus of CS fabrics. The results indicated that a linear relationship exists between the stress and strain properties of the studied CS samples under controlled stretch ratios (< 100%). The proposed method and the determined key mechanical properties of elastic orthotropic CS fabrics facilitate FE modeling for in-depth analysis of the effects of compression material design on the resultant biomechanical function in compression therapy.
Keywords: Elastic compression stockings, Young’s modulus, Poisson’s ratio, shear modulus, mechanical analysis.
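For reference, the orthotropic Hooke's law invoked in this abstract is commonly written in plane-stress compliance form (textbook notation; the subscripts w and c for the two principal fabric directions are an assumption, not the paper's own symbols):

\begin{bmatrix} \varepsilon_w \\ \varepsilon_c \\ \gamma_{wc} \end{bmatrix}
=
\begin{bmatrix}
 1/E_w & -\nu_{cw}/E_c & 0 \\
 -\nu_{wc}/E_w & 1/E_c & 0 \\
 0 & 0 & 1/G_{wc}
\end{bmatrix}
\begin{bmatrix} \sigma_w \\ \sigma_c \\ \tau_{wc} \end{bmatrix},
\qquad \frac{\nu_{wc}}{E_w} = \frac{\nu_{cw}}{E_c},

where the reciprocity relation on the right keeps the compliance matrix symmetric.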
951 Using the Combined Model of PROMETHEE and Fuzzy Analytic Network Process for Determining Question Weights in Scientific Exams through Data Mining Approach
Authors: Hassan Haleh, Amin Ghaffari, Parisa Farahpour
Abstract:
The need for an appropriate system for evaluating students' educational development is a key problem in achieving the predefined educational goals. The intensity of related papers in recent years that try to prove or disprove the necessity and adequacy of student assessment corroborates this matter. Some of these studies tried to increase the precision of determining question weights in scientific examinations, but in all of them there has been an attempt to adjust the initial question weights while the accuracy and precision of those initial question weights are still in question. Thus, in order to increase the precision of the assessment of students' educational development, the present study proposes a new method for determining the initial question weights by considering question factors such as difficulty, importance and complexity, and by implementing a combined method of PROMETHEE and fuzzy analytic network process using a data mining approach to improve the model's inputs. The result of the implemented case study proves the improvement of performance and precision of the proposed model.
Keywords: Assessing students, Analytic network process, Clustering, Data mining, Fuzzy sets, Multi-criteria decision making, and Preference function.
950 A Consistency Protocol Multi-Layer for Replicas Management in Large Scale Systems
Authors: Ghalem Belalem, Yahya Slimani
Abstract:
Large scale systems such as computational Grids are distributed computing infrastructures that can provide globally available network resources. The evolution of information processing systems in the Data Grid is characterized by a strong decentralization of data across several sites, whose objective is to ensure the availability and reliability of the data in order to provide fault tolerance and scalability, which is not possible without the use of replication techniques. Unfortunately, the use of these techniques has a high cost, because it is necessary to maintain consistency between the distributed data. Nevertheless, agreeing to live with certain imperfections can improve the performance of the system by improving concurrency. In this paper, we propose a multi-layer protocol combining the pessimistic and optimistic approaches, conceived for data consistency maintenance in large scale systems. Our approach is based on a hierarchical representation model with three layers, whose objective is twofold: it first makes it possible to reduce response times compared to a completely pessimistic approach, and it second improves the quality of service compared to an optimistic approach.
Keywords: Data Grid, replication, consistency, optimistic approach, pessimistic approach.
949 An Improved Particle Swarm Optimization Technique for Combined Economic and Environmental Power Dispatch Including Valve Point Loading Effects
Authors: Badr M. Alshammari, T. Guesmi
Abstract:
In recent years, the combined economic and emission power dispatch has become one of the main problems of electrical power systems. It aims to schedule the power generation of generators in order to minimize production cost and the emission of harmful gases caused by fossil-fueled thermal units, such as CO, CO2, NOx, and SO2. To solve this complicated multi-objective problem, an improved version of the particle swarm optimization technique that includes a non-dominated sorting concept has been proposed. Valve point loading effects and system losses have been considered. The three-unit and ten-unit benchmark systems have been used to show the effectiveness of the suggested optimization technique for solving this kind of nonconvex problem. The simulation results have been compared with those obtained using a genetic algorithm based method. The comparison shows that the proposed approach can provide a higher quality solution with better performance.
Keywords: Power dispatch, valve point loading effects, multiobjective optimization, Pareto solutions.
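For reference, the valve-point loading effect mentioned in this abstract is conventionally modeled by adding a rectified sine term to the quadratic fuel cost, which makes the cost function nonconvex and nonsmooth (textbook form; the coefficients a_i, b_i, c_i, e_i, f_i are the usual symbols, not taken from the paper):

F_i(P_i) = a_i + b_i P_i + c_i P_i^{2}
         + \left| e_i \sin\!\big( f_i \,(P_i^{\min} - P_i) \big) \right|,

where P_i is the output of unit i and P_i^{min} its minimum generation limit.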
948 In vitro and in vivo Assessment of Cholinesterase Inhibitory Activity of the Bark Extracts of Pterocarpus santalinus L. for the Treatment of Alzheimer’s Disease
Authors: K. Biswas, U. H. Armin, S. M. J. Prodhan, J. A. Prithul, S. Sarker, F. Afrin
Abstract:
Alzheimer’s disease (AD), a progressive neurodegenerative disorder, is the most predominant cause of dementia in the elderly. Prolonging the function of acetylcholine by inhibiting both acetylcholinesterase and butyrylcholinesterase is the most effective treatment therapy for AD. Traditionally, Pterocarpus santalinus L. is widely known for its medicinal use. In this study, in vitro acetylcholinesterase inhibitory activity was investigated, and the methanolic extract of the plant showed significant activity. To confirm this activity in vivo, learning and memory enhancing effects were tested in mice. For the test, memory impairment was induced by scopolamine (a cholinergic muscarinic receptor antagonist). The anti-amnesic effect of the extract was investigated by the passive avoidance task in mice. The study also includes brain acetylcholinesterase activity. Results proved that scopolamine-induced cognitive dysfunction was significantly decreased by administration of the extract solution in the passive avoidance task, and brain acetylcholinesterase activity was inhibited. These results suggest that the bark extract of Pterocarpus santalinus can be a better option for further studies on AD via its acetylcholinesterase inhibitory actions.
Keywords: Pterocarpus santalinus, cholinesterase inhibitor, passive avoidance, Alzheimer’s disease.
947 Study of Aero-thermal Effects with Heat Radiation in Optical Side Window
Authors: Chun-Chi Li, Da-Wei Huang, Yin-Chia Su, Liang-Chih Tasi
Abstract:
In hypersonic environments, the aerothermal effect makes it difficult for the optical side windows of optically guided missiles to withstand high heat. This produces cracking or breaking, resulting in an inability to function. This study used computational fluid mechanics to investigate the external cooling jet conditions of optical side windows. The turbulence models k-ε and k-ω were simulated. To be in better accord with actual aerothermal environments, a thermal radiation model was added to examine suitable amounts of external coolant and the optical window problems of aero-thermodynamics. The simulation results indicate that when there are no external cooling jets, because airflow on the optical window and the tail groove produces vortices, the temperatures at these two locations reach a peak of approximately 1600 K. When the external cooling jets worked at 0.15 kg/s, the surface temperature of the optical windows dropped to approximately 280 K. When adding thermal radiation conditions, because heat flux dissipation was faster, the surface temperature of the optical windows fell from 280 K to approximately 260 K. The difference in influence of the turbulence models k-ε and k-ω on optical window surface temperature was not significant.
Keywords: aero-optical side window, aerothermal effect, cooling, hypersonic flow.
946 Assessing Extension of Meeting System Performance in Information Technology in Defense and Aerospace Project
Authors: Hakan Gürkan, Ahmet Denker
Abstract:
The Ministry of Defense (MoD) spends hundreds of millions of dollars on software to support its infrastructure, operate its weapons and provide command, control, communications, computing, intelligence, surveillance, and reconnaissance (C4ISR) functions. These and all other new advanced systems have a common critical component: information technology. The defense and aerospace environment is continuously striving to keep up with increasingly sophisticated Information Technology (IT) in order to remain effective in today's dynamic and unpredictable threat environment. This makes IT one of the largest and fastest growing expenses of defense. Hundreds of millions of dollars are spent each year on IT projects, but too many of those millions are wasted on costly mistakes: systems that do not work properly, new components that are not compatible with old ones, trendy new applications that do not really satisfy defense needs, or money lost through poorly managed contracts. This paper investigates and compiles effective strategies that aim to end exasperation with the low returns and high cost of Information Technology acquisition for defense; it tries to show how to maximize value while reducing time and expenditure.
Keywords: Iterative Process, Acquisition Management, Project management, Software Economics, Requirement analysis.
945 Estimating Marine Tidal Power Potential in Kenya
Authors: Lucy Patricia Onundo, Wilfred Njoroge Mwema
Abstract:
The rapidly diminishing fossil fuel reserves, their exorbitant cost and the increasingly apparent negative effect of fossil fuels on climate change are a wake-up call to explore renewable energy. Wind, bio-fuel and solar power have already become staples of the Kenyan electricity mix. The potential of electric power generation from marine tidal currents is enormous, with oceans covering more than 70% of the earth. However, marine tidal energy in Kenya has yet to be studied thoroughly, despite its promising, cyclic, reliable and predictable nature and the vast energy contained within it. The high load factors resulting from the fluid properties and the predictable resource characteristics make marine currents particularly attractive for power generation and advantageous when compared to other resources. Global-level resource assessments and oceanographic literature and data have been compiled in an analysis of the technology-specific requirements for tidal energy technologies and the physical resources. Temporal variations in resource intensity as well as the differences between small-scale applications are considered.
Keywords: Energy data assessment, environmental legislation, renewable energy, tidal-in-stream turbines.
944 Discrete-time Phase and Delay Locked Loops Analyses in Tracking Mode
Authors: Jiri Sebesta
Abstract:
Phase locked loops (PLL) and delay locked loops (DLL) play an important role in establishing coherent references (phase of the carrier and symbol timing) in digital communication systems. A fully digital receiver, including a digital carrier synchronizer and symbol timing synchronizer, fulfils the conditions for a universal multi-mode communication receiver with the option of setting the symbol rate over several orders of magnitude and long-term stability of the required parameters. It is then necessary to realize the PLL and DLL in the synchronizer in digital form and to approach these subsystems as a discrete representation of an analog template. This paper analyzes the discrete phase locked loop (DPLL) and the discrete delay locked loop (DDLL) and presents a technique to determine their characteristics based on the analog (continuous-time) template. The transmission response and error function are derived for a first-order discrete locked loop, and the resulting equations and graphical representations are given for a second-order one. It is shown that spectrum translation due to sampling takes effect when computing the frequency characteristics for specific values of the loop parameters.
Keywords: Carrier synchronization, coherent demodulation, software defined receiver, symbol timing.
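As background, a common textbook first-order DPLL model (not necessarily the exact loop analyzed in the paper) updates the phase estimate once per sample with loop gain K, which gives the closed-loop and error transfer functions:

\hat{\theta}[n+1] = \hat{\theta}[n] + K\, e[n], \qquad e[n] = \theta[n] - \hat{\theta}[n],

H(z) = \frac{\hat{\Theta}(z)}{\Theta(z)} = \frac{K}{z - 1 + K}, \qquad
\frac{E(z)}{\Theta(z)} = 1 - H(z) = \frac{z - 1}{z - 1 + K},

with the loop stable for 0 < K < 2.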
943 Optimization of Surface Roughness and Vibration in Turning of Aluminum Alloy AA2024 Using Taguchi Technique
Authors: Vladimir Aleksandrovich Rogov, Ghorbani Siamak
Abstract:
Determination of optimal machining conditions is important to reduce the production cost and achieve the desired surface quality. This paper investigates the influence of cutting parameters on surface roughness and natural frequency in turning of aluminum alloy AA2024. The experiments were performed on a lathe machine using two different cutting tools, made of AISI 5140 and a carbide cutting insert coated with TiC. Turning experiments were planned by the Taguchi method with an L9 orthogonal array. Three levels for spindle speed, feed rate, depth of cut and tool overhang were chosen as cutting variables. The obtained experimental data have been analyzed using the signal-to-noise ratio and analysis of variance. The main effects have been discussed, the percentage contributions of the various parameters affecting surface roughness and natural frequency have been determined, and optimal cutting conditions have been identified. Finally, the optimization of the cutting parameters using the Taguchi method was verified by confirmation experiments.
Keywords: Turning, Cutting conditions, Surface roughness, Natural frequency, Taguchi method, ANOVA, S/N ratio.
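For reference, the smaller-the-better signal-to-noise ratio typically used for responses such as surface roughness and vibration is (standard Taguchi form):

S/N = -10 \log_{10}\!\left( \frac{1}{n} \sum_{i=1}^{n} y_i^{2} \right),

where y_i are the n repeated measurements of the response at a given trial; the level combination with the largest S/N is taken as optimal.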
942 Decoy-pulse Protocol for Frequency-coded Quantum Key Distribution
Authors: Sudeshna Bhattacharya, Pratyush Pandey, Pradeep Kumar K
Abstract:
We propose a decoy-pulse protocol for a frequency-coded implementation of the B92 quantum key distribution protocol. A direct extension of the decoy-pulse method to the frequency-coding scheme results in a security loss, as an eavesdropper can distinguish between signal and decoy pulses by measuring the carrier photon number without affecting other statistics. We overcome this problem by optimizing the ratio of the carrier photon number of the decoy pulse to that of the signal pulse to be as close to unity as possible. In our method, the switching between signal and decoy pulses is achieved by changing the amplitude of the RF signal, as opposed to modulating the intensity of the optical signal, thus reducing system cost. We find an improvement by a factor of approximately 100 in the key generation rate using the decoy-state protocol. We also study the effect of source fluctuation on the key rate. Our simulation results show a key generation rate of 1.5×10⁻⁴ per pulse for link lengths up to 70 km. Finally, we discuss the optimum value of the average photon number of the signal pulse for a given key rate while also optimizing the carrier ratio.
Keywords: B92, decoy-pulse, frequency-coding, quantum key distribution.
941 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances
Authors: Violeta Damjanovic-Behrendt
Abstract:
This paper presents an approach for optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluations of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naïve Q-Learning. Naïve Q-Learning belongs to the category of active and model-free Machine Learning (ML) techniques in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game theory (GT) and ML algorithms for discovering optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.
Keywords: Security, internet of things, cloud computing, Stackelberg security game, machine learning, Naïve Q-learning.
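As an illustration of the reinforcement-learning component, the minimal tabular Q-learning sketch below shows the kind of update rule such an agent uses; the SSG/SHARP state and action encodings are not reproduced here, and all names are placeholders:

import random
from collections import defaultdict

class NaiveQLearner:
    """Minimal tabular Q-learning sketch (generic, not the paper's exact agent)."""

    def __init__(self, actions, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.q = defaultdict(float)       # (state, action) -> estimated value
        self.actions = list(actions)
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def choose(self, state):
        if random.random() < self.epsilon:                           # explore
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q[(state, a)])   # exploit

    def update(self, state, action, reward, next_state):
        # Standard temporal-difference update toward the best next-state value.
        best_next = max(self.q[(next_state, a)] for a in self.actions)
        td_target = reward + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (td_target - self.q[(state, action)])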
940 Risk Assessment in Durations and Costs for Construction of Industrial Facilities in Egypt Using Equations and Computer
Authors: M. Kamal Elbokl, Negadi Kheira
Abstract:
Risk evaluation is an important step in protecting your workers and your business, as well as complying with the law. It helps you focus on the risks that really matter in your workplace, the ones with the potential to cause real harm. In this paper we introduce the basics of risk assessment and then mention some ways of evaluating risk by computer, especially Monte Carlo simulation and Microsoft Project.
We use the Program Evaluation and Review Technique (PERT) to deal with risks in industrial facilities and to evaluate and assess those risks. Using the PERT technique in Microsoft Project through the PERT toolbar, and using the PERTMASTER program together with the Primavera program, we evaluate many hazards and perform the corresponding calculations with mathematical equations to support the right decisions. We define and calculate the risk factor and risk severity to rank the types of risk and then deal with them in many ways, such as probability computation, curves, and tables. By introducing variables into the equations implemented in the computer programs, we calculate the risk in time and cost in the general case and then present some examples from the field of industrial facilities.
Keywords: Risk, Industrial Facilities, PERT, Monte Carlo Simulation.
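For reference, the classical PERT three-point estimate behind such toolbar calculations uses the optimistic (O), most likely (M) and pessimistic (P) durations (standard formulas, not specific to this paper):

t_e = \frac{O + 4M + P}{6}, \qquad \sigma = \frac{P - O}{6}, \qquad \sigma^{2} = \left( \frac{P - O}{6} \right)^{2},

where t_e is the expected activity duration and \sigma^{2} its variance.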
939 Disaggregation the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region
Authors: Mohammad Bakhshi, Firas Al Janabi
Abstract:
High resolution rain data are very important as input for hydrological models. Among the models for high-resolution rainfall data generation, temporal disaggregation was chosen for this study. The paper attempts to generate three different rainfall resolutions (4-hourly, hourly and 10-minute) from daily data for an approximately 20-year record period. The process was done with the DiMoN tool, which is based on the random cascade model and the method of fragments. Differences between the observed and simulated rain datasets are evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov test (K-S), usual statistics, and exceedance probability. The tool worked well at preserving the daily rainfall values on wet days; however, the generated data are accumulated in a shorter time period and produce stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passes the K-S test criteria, while for the hourly and 10-minute datasets the P-value should be employed to prove that their differences are reasonable. The results are encouraging considering the overestimation of the generated high-resolution rainfall data.
Keywords: DiMoN tool, disaggregation, exceedance probability, Kolmogorov-Smirnov Test, rainfall.
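As an illustration of the statistical comparison described in this abstract, the Python sketch below applies the two-sample Kolmogorov-Smirnov test with SciPy; the synthetic arrays are placeholders standing in for the observed and DiMoN-generated series:

import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
observed_4h = rng.gamma(shape=0.8, scale=2.0, size=500)    # placeholder rainfall depths
simulated_4h = rng.gamma(shape=0.8, scale=2.2, size=500)   # placeholder disaggregated values

stat, p_value = ks_2samp(observed_4h, simulated_4h)
print(f"K-S statistic = {stat:.3f}, p-value = {p_value:.3f}")
# A large p-value means the difference between the two empirical CDFs is not
# statistically significant at the chosen level.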
938 Minimization Entropic Applied to Rotary Dryers to Reduce the Energy Consumption
Authors: I. O. Nascimento, J. T. Manzi
Abstract:
The drying process is an important operation in the chemical industry, and it is widely used in the food, grain and fertilizer industries. However, since it demands considerable energy consumption, such a process requires a deep energetic analysis in order to reduce operating costs. This paper deals with thermodynamic optimization applied to rotary dryers based on entropy production minimization, aiming to reduce energy consumption. To do this, the mass, energy and entropy balances were used to develop a relationship that represents the rate of entropy production. The use of the Second Law of Thermodynamics is essential because it takes into account the constraints of nature. Once the entropy production rate is minimized, optimal operating conditions can be established and the process can obtain a substantial gain in energy saving. The minimization strategy was carried out using classical methods such as Lagrange multipliers and was implemented on the MATLAB platform. As expected, the preliminary results reveal a significant energy saving by the application of the optimal parameters found by the entropy minimization procedure. It is important to say that this method has shown easy implementation and low cost.
Keywords: Drying, entropy minimization, modeling dryers, thermodynamic optimization.
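As a generic illustration of the optimization step (shown with SciPy rather than the MATLAB/Lagrange-multiplier implementation described above), the entropy-production expression and the bounds below are placeholders only:

import numpy as np
from scipy.optimize import minimize

def sigma_gen(x):
    """Placeholder entropy-production rate as a function of the operating
    variables x = [drying-air inlet temperature (K), air mass flow rate (kg/s)]."""
    t_in, m_air = x
    return 1e-3 * m_air * np.log(t_in / 300.0)   # illustrative expression only

# Bounds stand in for the process constraints in this toy example.
result = minimize(sigma_gen, x0=[400.0, 2.0], bounds=[(350.0, 500.0), (0.5, 5.0)])
print(result.x, result.fun)   # operating point with minimum entropy production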
937 Secure Hashing Algorithm and Advance Encryption Algorithm in Cloud Computing
Authors: Jaimin Patel
Abstract:
Cloud computing is one of the most significant and important movements in various computing technologies. It provides flexibility to users, cost effectiveness, location independence, easy maintenance, multitenancy, drastic performance improvements, and increased productivity. On the other hand, there are also major issues such as security. Being a common server, security for a cloud is a major issue; it is important to provide security to protect users' private data, and it is especially important in e-commerce and social networks. In this paper, encryption algorithms such as the Advanced Encryption Standard, their vulnerabilities, risk of attacks, optimal time and complexity management, and comparison with other algorithms based on software implementation are discussed. Encryption techniques to improve the performance of AES algorithms and to reduce risk are given. Secure Hash Algorithms, their vulnerabilities, software implementations, risk of attacks and comparison with other hashing algorithms, as well as the advantages and disadvantages of hashing techniques versus encryption, are given.
Keywords: Cloud computing, encryption algorithm, secure hashing algorithm, brute force attack, birthday attack, plaintext attack, man-in-the-middle attack.
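As a minimal illustration of the two primitive families discussed in this abstract, the sketch below hashes a record with SHA-256 from the Python standard library and encrypts it with AES-256-GCM from the third-party cryptography package; key management is deliberately simplified:

import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

data = b"user record stored in the cloud"

digest = hashlib.sha256(data).hexdigest()      # integrity fingerprint of the record
print("SHA-256:", digest)

key = AESGCM.generate_key(bit_length=256)      # 32-byte AES key (store securely)
nonce = os.urandom(12)                         # must be unique per message
aesgcm = AESGCM(key)
ciphertext = aesgcm.encrypt(nonce, data, None) # confidentiality plus authentication tag
assert aesgcm.decrypt(nonce, ciphertext, None) == data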
936 Daily Site Risks Associated with Construction Projects and On-spot Corrective Measurements: Case Study of Revamping Projects in Kuwait Oil Company Fields Area
Authors: Yousef S. Al-Othman
Abstract:
The growth and expansion of industrial facilities is proportional to the market's increasing demand for products and services. Furthermore, raw material producers such as oil companies usually undergo massive revamping projects to maintain a synchronized supply. These revamping projects are usually delivered through challenging construction projects associated with daily site risks related to the construction process. Hence, a case study related to these risks and the corresponding on-spot corrective measurements has been made on a certain number of construction project contractors at Kuwait Oil Company (KOC) to derive the benefits and overall effectiveness of on-spot corrective measurements during the construction phase of a project, and to show how they help in avoiding major incidents, ensuring a smooth, cost effective and on-time delivery of the project. Findings of this case study shall add value to the overall risk management process by minimizing the daily site risks that may affect the project lead time, resulting in an undisturbed on-site construction process.
Keywords: Oil and gas, risk management, construction projects, project lead time.
935 On the Exact Solution of Non-Uniform Torsion for Beams with Asymmetric Cross-Section
Authors: A.Campanile, M. Mandarino, V. Piscopo
Abstract:
This paper deals with the problem of non-uniform torsion in thin-walled elastic beams with asymmetric cross-section, removing the basic concept of a fixed center of twist, necessary in Vlasov's and Benscoter's theories to obtain a warping stress field equivalent to zero. In this new torsion/flexure theory, unlike the classical ones, the warping function punctually satisfies the first indefinite equilibrium equation along the beam axis, and it is not necessary to introduce the classical congruence condition to take into account the effect of the beam restraints. The solution, based on the Fourier development of the displacement field, is obtained assuming that the applied external torque is constant along the beam axis and that on both beam ends the unit twist angle and the warping axial displacement functions are totally restrained. Finally, in order to verify the feasibility of the proposed method and to compare it with the classical theories, two applications are carried out. The first one, relative to an open profile, is necessary to test the numerical method adopted to find the solution; the second one, instead, is relative to a simplified containership section, considered as fully restrained in correspondence of two adjacent transverse bulkheads.
Keywords: Non-uniform torsion, Asymmetric cross-section, Fourier series, Helmholtz equation, FE method.
934 The Impact of Semantic Web on E-Commerce
Authors: Karim Heidari
Abstract:
Semantic Web technologies enable machines to interpret data published in a machine-interpretable form on the web. At the present time, only human beings are able to understand the product information published online. The emerging Semantic Web technologies have the potential to deeply influence the further development of the Internet economy. In this paper we propose a scenario-based research approach to predict the effects of these new technologies on electronic markets and on the business models of traders, intermediaries and customers. Over 300 million searches are conducted every day on the Internet by people trying to find what they need. A majority of these searches are in the domain of consumer e-commerce, where a web user is looking for something to buy. This represents a huge cost in terms of person hours and an enormous drain on resources. Agent-enabled semantic search will have a dramatic impact on the precision of these searches. It will reduce and possibly eliminate information asymmetry, where a better informed buyer gets the best value. By impacting this key determinant of market prices, the Semantic Web will foster the evolution of different business and economic models. We submit that there is a need for developing these futuristic models based on our current understanding of e-commerce models and nascent Semantic Web technologies. We believe these business models will encourage mainstream web developers and businesses to join the "semantic web revolution."
Keywords: E-Commerce, E-Business, Semantic Web, XML.
933 MPSO based Model Order Formulation Technique for SISO Continuous Systems
Authors: S. N. Deepa, G. Sugumaran
Abstract:
This paper proposes a new version of Particle Swarm Optimization (PSO), namely Modified PSO (MPSO), for model order formulation of Single Input Single Output (SISO) linear time-invariant continuous systems. In the general PSO, the movement of a particle is governed by three behaviors, namely inertia, cognitive and social. The cognitive behavior helps the particle remember its previously visited best position. The Modified PSO technique splits the cognitive behavior into two sections: the previously visited best position and the previously visited worst position. This modification helps the particle search for the target very effectively. The MPSO approach is proposed to formulate the reduced order model. The method is based on the minimization of the error between the transient responses of the original higher order model and the reduced order model for a unit step input. The results obtained are compared with earlier techniques to validate its ease of computation. The proposed method is illustrated through a numerical example from the literature.
Keywords: Continuous System, Model Order Formulation, Modified Particle Swarm Optimization, Single Input Single Output, Transfer Function Approach.
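As an illustration, one plausible form of the modified velocity update (the exact weighting used by the authors is not reproduced here) keeps the usual inertia, cognitive and social terms and adds a repulsion from the particle's previously visited worst position:

import numpy as np

def mpso_velocity(v, x, pbest, pworst, gbest, w=0.7, c1=1.5, c2=1.5, c3=0.5):
    """Hypothetical MPSO velocity update: attraction to personal and global
    bests plus repulsion from the personal worst position."""
    r1, r2, r3 = np.random.rand(3)
    return (w * v
            + c1 * r1 * (pbest - x)     # cognitive term (previously visited best)
            + c2 * r2 * (gbest - x)     # social term
            + c3 * r3 * (x - pworst))   # added term: move away from previously visited worst

# Position update, as in standard PSO:
# x_new = x + mpso_velocity(v, x, pbest, pworst, gbest)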
932 Functionalization of Polypropylene with Chiral Monomer for Improving Hemocompatibility
Authors: Xiaodong Xu, Dan Zhao, Xiujuan Chang, Chunming Li, Huiyun Zhou, Xin Li, Qiang Shi, Shifang Luan, Jinghua Yin
Abstract:
Polypropylene (PP) is one of the most commonly used plastics because of its low density, outstanding mechanical properties, and low cost. However, its drawbacks such as low surface energy, poor dyeability, lack of chemical functionalities, and poor compatibility with polar polymers and inorganic materials, have restricted the application of PP. To expand its application in biomedical materials, functionalization is considered to be the most effective way. In this study, PP was functionalized with a chiral monomer, (S)-1-acryloylpyrrolidine-2-carboxylic acid ((S)-APCA), by free-radical grafting in the solid phase. The grafting degree of PP-g-APCA was determined by chemical titration method, and the chemical structure of functionalized PP was characterized by FTIR spectroscopy, which confirmed that the chiral monomer (S)-APCA was successfully grafted onto PP. Static water contact angle results suggested that the surface hydrophilicity of PP was significantly improved by solid phase grafting and assistance of surface water treatment. Protein adsorption and platelet adhesion results showed that hemocompatibility of PP was greatly improved by grafting the chiral monomer.
Keywords: Functionalization, polypropylene, chiral monomer, hemocompatibility.
931 Estimation of Buffer Size of Internet Gateway Server via G/M/1 Queuing Model
Authors: Dr. L.K. Singh, Dr. R. M. L, Riktesh Srivastava
Abstract:
How to efficiently assign system resources to route client demand through Gateway servers is a tricky predicament. In this paper, we present an enhanced proposal for the autonomous performance of Gateway servers under highly variable traffic loads. We devise a methodology to calculate queue length and waiting time utilizing Gateway server information to reduce response time variance in the presence of bursty traffic. The most widespread consideration is performance, because Gateway servers must offer cost-effective and high-availability services over the long term; thus they have to be scaled to meet the expected load. Performance measurements can be the basis for performance modeling and prediction. With the help of performance models, the performance metrics (like buffer estimation and waiting time) can be determined during the development process. This paper describes the possible queue models that can be applied in the estimation of queue length to estimate the final value of the memory size. Both simulation and experimental studies using synthesized workloads and analysis of real-world Gateway servers demonstrate the effectiveness of the proposed system.
Keywords: Gateway Server, G/M/1 Queuing Model.
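For reference, the standard G/M/1 results that underlie such a buffer estimate are (textbook form, not specific to this paper), with service rate \mu and interarrival-time Laplace-Stieltjes transform A^{*}:

\sigma = A^{*}\!\big( \mu (1 - \sigma) \big), \qquad 0 < \sigma < 1,
\qquad W_q = \frac{\sigma}{\mu\,(1 - \sigma)},

where an arriving request finds a geometrically distributed number of requests in the system with parameter \sigma (mean \sigma/(1-\sigma)), and W_q is the mean waiting time in the queue; these quantities size the buffer the Gateway server must provision.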
930 Automatic Reusability Appraisal of Software Components using Neuro-fuzzy Approach
Authors: Parvinder S. Sandhu, Hardeep Singh
Abstract:
Automatic reusability appraisal could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components from existing legacy systems; this can save the cost of developing software from scratch. But the issue of how to identify reusable components from existing systems has remained relatively unexplored. In this paper, we propose a two-tier approach that studies the structural attributes as well as the usability or relevancy of the component to a particular domain. Latent semantic analysis is used for the feature vector representation of various software domains. It exploits the fact that feature vector codes can be seen as documents containing terms (the identifiers present in the components), and so text modeling methods that capture co-occurrence information in low-dimensional spaces can be used. Further, we devised a neuro-fuzzy hybrid inference system, which takes structural metric values as input and calculates the reusability of the software component. A decision tree algorithm is used to decide the initial set of fuzzy rules for the neuro-fuzzy system. The results obtained are convincing enough to propose the system for economical identification and retrieval of reusable software components.
Keywords: Clustering, ID3, LSA, Neuro-fuzzy System, SVD.
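As an illustration of the LSA step, the following sketch factors a small term-by-component matrix with a truncated SVD so that each component gets a low-dimensional feature vector; the matrix contents are placeholders:

import numpy as np

# Rows: identifier terms, columns: software components (placeholder counts).
term_component = np.array([[3, 0, 1],
                           [0, 2, 0],
                           [1, 1, 4],
                           [0, 3, 1]], dtype=float)

U, s, Vt = np.linalg.svd(term_component, full_matrices=False)
k = 2                                             # latent dimensions retained
component_vectors = (np.diag(s[:k]) @ Vt[:k]).T   # one row per component

# Cosine similarity between two components in the latent space:
a, b = component_vectors[0], component_vectors[1]
cosine = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
print(component_vectors, cosine)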
929 Effects of Human Factors on Workforce Scheduling
Authors: M. Othman, N. Bhuiyan, G. J. Gouw
Abstract:
In today's competitive market, most companies develop manufacturing systems that can help with cost reduction and maximum quality. Human issues are an important part of manufacturing systems, yet most companies ignore their effects on production performance. This paper aims to develop an integrated workforce planning system that incorporates the human being. Therefore, a multi-objective mixed integer nonlinear programming model is developed to determine the amount of hiring, firing, training and overtime for each worker type. This paper considers a workforce planning model including human aspects such as skills, training, workers' personalities, capacity, motivation, and learning rates. This model helps to minimize the hiring, firing, training and overtime costs, and to maximize the workers' performance. The results indicate that workers' differences should be considered in workforce scheduling to generate realistic plans with minimum costs. This paper also investigates the effects of human learning rates on the performance of production systems.
Keywords: Human Factors, Learning Curves, Workers' Differences, Workforce Scheduling.
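Schematically, the kind of objective described in this abstract can be written as follows (a generic sketch, not the paper's exact formulation), for worker types k and periods t:

\min \sum_{t} \sum_{k} \Big( c^{H}_{k} H_{kt} + c^{F}_{k} F_{kt} + c^{T}_{k} T_{kt} + c^{O}_{k} O_{kt} \Big)
- \lambda \sum_{t} \sum_{k} \pi_{kt},

where H_{kt}, F_{kt}, T_{kt}, O_{kt} are the numbers of hired, fired, trained and overtime workers, the c's their unit costs, and \pi_{kt} a worker-performance measure weighted by \lambda.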
928 Modeling Residential Electricity Consumption Function in Malaysia: Time Series Approach
Authors: L. L. Ivy-Yap, H. A. Bekhet
Abstract:
As Malaysian residential electricity consumption continues to increase rapidly, effective energy policies, which address factors affecting residential electricity consumption, are urgently needed. This study attempts to investigate the relationship between residential electricity consumption (EC), real disposable income (Y), price of electricity (Pe) and population (Po) in Malaysia for the 1978-2011 period. Unlike previous studies on Malaysia, the current study focuses on the residential sector, a sector that is important for the formulation of energy policy. The Phillips-Perron (P-P) unit root test is employed to infer the stationarity of each variable, while the bound test is executed to determine the existence of a co-integration relationship among the variables, modelled in an Autoregressive Distributed Lag (ARDL) framework. The CUSUM and CUSUM of squares tests are applied to ensure the stability of the model. The results suggest the existence of a long-run equilibrium relationship and bidirectional Granger causality between EC and the macroeconomic variables. The empirical findings will help policy makers in Malaysia develop new monitoring standards of energy consumption. As it is a major contributing factor in economic growth and CO2 emissions, there is a need for more proper planning in Malaysia to attain future targets in order to cut emissions.
Keywords: Co-integration, Elasticity, Granger causality, Malaysia, Residential electricity consumption.
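As an illustration of the unit-root step, the sketch below runs a Phillips-Perron test with the third-party arch package on a synthetic series standing in for the consumption data:

import numpy as np
from arch.unitroot import PhillipsPerron

# Placeholder series (a random walk), standing in for the annual consumption data.
series = np.random.default_rng(0).normal(size=34).cumsum()

pp = PhillipsPerron(series)
print(pp.summary())   # test statistic, p-value and critical values
# If the unit-root null cannot be rejected, the series is differenced before
# entering the ARDL bound-testing framework.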
927 Ab initio Study of Co2ZrGe and Co2NbB Full Heusler Compounds
Authors: Abada Ahmed, Hiadsi Said, Ouahrani Tarik, Amrani Bouhalouane, Amara Kadda
Abstract:
Using the first-principles full-potential linearized augmented plane wave plus local orbital (FP-LAPW+lo) method based on density functional theory (DFT), we have investigated the electronic structure and magnetism of the full Heusler alloys Co2ZrGe and Co2NbB. These compounds are predicted to be half-metallic ferromagnets (HMFs) with a total magnetic moment of 2.000 μB per formula unit, well consistent with the Slater-Pauling rule. Calculations show that both alloys have an indirect band gap in the minority-spin channel of the density of states (DOS), with values of 0.58 eV and 0.47 eV for Co2ZrGe and Co2NbB, respectively. Analysis of the DOS and magnetic moments indicates that their magnetism is mainly related to the d-d hybridization between the Co and Zr (or Nb) atoms. The half-metallicity is found to be relatively robust against volume changes. In addition, the atoms-in-molecules (AIM) formalism and the electron localization function (ELF) were also adopted to study the bonding properties of these compounds, building a bridge between their electronic and bonding behavior. As they have good crystallographic compatibility with the lattices of industrially used semiconductors and negative calculated cohesive energies with considerable absolute values, these two alloys could be promising magnetic materials in the spintronic field.
Keywords: Electronic properties, full Heusler alloys, half-metallic ferromagnets, magnetic properties.
926 Development of Numerical Model to Compute Water Hammer Transients in Pipe Flow
Authors: Jae-Young Lee, Woo-Young Jung, Myeong-Jun Nam
Abstract:
Water hammer is a hydraulic transient problem which is commonly encountered in the penstocks of hydropower plants. A numerical model was developed to estimate the transient behavior of pressure waves in pipe systems. The computational algorithm was proposed to model the water hammer phenomenon in a pipe system with a pump shutdown at midstream and a sudden valve closure downstream. To predict the pressure head and flow velocity as a function of time resulting from rapidly closing the valve and shutting down the pump, the two boundary conditions at the ends, accounting for pump operation and valve control, can be implemented as specified equations of the pressure head and flow velocity based on the method of characteristics. It was shown that the effects of transient flow determine the need for protection devices, such as surge tanks, surge relief valves, or air valves, at various points in the system against overpressure and low pressure. The proposed transient model produced reasonably good performance for pipeline systems. The proposed numerical model can be used as an efficient tool for the safety assessment of hydropower plants against water hammer.
Keywords: Water hammer, hydraulic transient, pipe systems, characteristics method.
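For reference, the method of characteristics referred to in this abstract is commonly written with the following compatibility relations along dx/dt = +a and dx/dt = -a (textbook form, not the paper's exact notation):

C^{+}: \; H_P = H_A - B\,(Q_P - Q_A) - R\, Q_A \lvert Q_A \rvert, \qquad
C^{-}: \; H_P = H_B + B\,(Q_P - Q_B) + R\, Q_B \lvert Q_B \rvert,

with B = a/(gA) and R = f\,\Delta x / (2 g D A^{2}); the pump and valve boundary conditions then close the system at the two ends of the pipeline.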
925 Preconcentration and Determination of Cyproheptadine in Biological Samples by Hollow Fiber Liquid Phase Microextraction Coupled with High Performance Liquid Chromatography
Authors: Najari Moghadam Sh., Qomi M., Raofie F., Khadiv J.
Abstract:
In this study, a liquid phase microextraction by hollow fiber (HF-LPME) combined with high performance liquid chromatography with a UV detector was applied to preconcentrate and determine trace levels of Cyproheptadine in human urine and plasma samples. Cyproheptadine was extracted from 10 mL of alkaline aqueous solution (pH 9.81) into an organic solvent (n-octanol) immobilized in the wall pores of a hollow fiber. It was then back-extracted into an acidified aqueous solution (pH 2.59) located inside the lumen of the hollow fiber. This method is simple, efficient and cost-effective. It is based on the pH gradient and the differences between the two aqueous phases. In order to optimize the HF-LPME, some affecting parameters, including the pH of the donor and acceptor phases, the type of organic solvent, ionic strength, stirring rate, extraction time and temperature, were studied and optimized. Under optimal conditions, the enrichment factor, limit of detection (LOD) and relative standard deviation (RSD (%), n=3) were up to 112, 15 μg.L−1 and 2.7, respectively.
Keywords: Biological samples, Cyproheptadine, hollow fiber, liquid phase microextraction.
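For reference, the enrichment factor reported in this abstract is conventionally defined as (standard HF-LPME relation, not a formula quoted from the paper):

EF = \frac{C_{a,\mathrm{final}}}{C_{d,\mathrm{initial}}} = \frac{V_d}{V_a} \cdot \frac{R}{100},

where C_{a,final} and C_{d,initial} are the analyte concentrations in the acceptor and donor phases, V_a and V_d the phase volumes, and R the extraction recovery in percent.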