Search results for: generating sets
1386 Design Improvement of Worm Gearing for Better Energy Utilization
Authors: Ahmed Elkholy
Abstract:
Most power transmission applications use gearing in general, and worm gearing in particular, for energy utilization. Designing gears for minimum weight and maximum power transmission is therefore the main target of this study. To this end, a new approach has been developed to estimate the load share and stress distribution of worm gear sets. The approach is based on the instantaneous tooth meshing stiffness: the worm gear drive is modelled as a series of spur gear slices, and each slice is analyzed separately using a well-established criterion. By combining the results obtained for all slices, the loading and stressing of the entire worm gear set is determined. The geometric modelling method presented allows tooth elastic deformation and tooth root stresses of worm gear drives to be investigated under different load conditions. On the basis of the method introduced in this study, the instantaneous meshing stiffness and load share were obtained. In comparison with existing methods, this approach offers both good analytical accuracy and lower computing time.
Keywords: gear, load/stress distribution, worm, wheel, tooth stiffness, contact line
Procedia PDF Downloads 422
1385 An Automatic Generating Unified Modelling Language Use Case Diagram and Test Cases Based on Classification Tree Method
Authors: Wassana Naiyapo, Atichat Sangtong
Abstract:
Software development processes under the Object Oriented methodology have many stages, which take time and incur high cost. An undetected error in the system analysis process propagates into the design and implementation processes, and the unexpected output then forces a revision of the earlier process; each such rollback adds expense and delay. With a good test process from the early phases, the implemented software is efficient, reliable, and meets the user's requirements. Unified Modelling Language (UML) is a tool that uses symbols to describe the work process in Object Oriented Analysis (OOA). This paper presents an approach for automatically generating UML use case diagrams and test cases. The use case diagram is generated from the event table, while test cases are generated from use case specifications and Graphical User Interfaces (GUI). Test cases are derived using the Classification Tree Method (CTM), which classifies data into nodes arranged in a hierarchical structure. The paper also describes the program that generates the use case diagram and test cases. As a result, the approach can reduce working time and increase efficiency.
Keywords: classification tree method, test case, UML use case diagram, use case specification
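The core of the Classification Tree Method is to partition each input aspect (a "classification") into disjoint classes and then combine one class per classification into test cases. A minimal sketch of that combination step, using a hypothetical login-GUI tree (the classifications and classes below are illustrative, not from the paper, and real CTM tools additionally prune invalid combinations):

```python
import itertools

# Hypothetical classification tree for a login GUI: each classification
# (top-level key) is partitioned into disjoint classes (leaf values).
classification_tree = {
    "username": ["valid", "unknown", "empty"],
    "password": ["correct", "wrong", "empty"],
    "remember_me": ["checked", "unchecked"],
}

def generate_test_cases(tree):
    """Combine one class from every classification: full combination
    coverage over the tree's leaves."""
    names = list(tree)
    return [dict(zip(names, combo))
            for combo in itertools.product(*(tree[n] for n in names))]

cases = generate_test_cases(classification_tree)
print(len(cases))   # 3 * 3 * 2 = 18 combinations
print(cases[0])
```

In practice a coverage criterion (e.g., pairwise instead of full combination) keeps the generated suite tractable as classifications multiply.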
Procedia PDF Downloads 161
1384 A Novel Software Model for Enhancement of System Performance and Security through an Optimal Placement of PMU and FACTS
Authors: R. Kiran, B. R. Lakshmikantha, R. V. Parimala
Abstract:
Secure operation of power systems requires monitoring of the system operating conditions. Phasor measurement units (PMUs) are devices that use synchronized signals from GPS satellites to provide phasor information for voltages and currents at a given substation. The optimal locations for the PMUs must be determined in order to avoid redundant use of PMUs. The objective of this paper is to make the system observable using a minimum number of PMUs, and to implement stability software at the 220 kV grid for on-line estimation of the power system transfer capability, based on voltage and thermal limitations, and for security monitoring. This software utilizes State Estimator (SE) and synchrophasor PMU data sets to determine the power system operational margin under normal and contingency conditions. It improves the security of the transmission system by continuously monitoring the operational margin, expressed in MW or in bus voltage angles, and alarms the operator if the margin violates a pre-defined threshold.
Keywords: state estimator (SE), flexible ac transmission systems (FACTS), optimal location, phasor measurement units (PMU)
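The minimum-PMU observability problem is usually formulated as an integer program; as a rough illustration of the idea (not the paper's method), a greedy set-cover heuristic on a hypothetical 7-bus topology, where a PMU at a bus observes that bus and all directly connected buses:

```python
# Toy 7-bus network as an adjacency list (hypothetical topology, not the
# 220 kV grid studied in the paper).
network = {
    1: [2, 5], 2: [1, 3, 5], 3: [2, 4], 4: [3, 5, 7],
    5: [1, 2, 4, 6], 6: [5], 7: [4],
}

def greedy_pmu_placement(adj):
    """Greedy set cover: repeatedly place a PMU at the bus that observes
    the most still-unobserved buses (itself plus its neighbours)."""
    unobserved = set(adj)
    placement = []
    while unobserved:
        best = max(adj, key=lambda b: len(({b} | set(adj[b])) & unobserved))
        placement.append(best)
        unobserved -= {best} | set(adj[best])
    return placement

pmus = greedy_pmu_placement(network)
print(sorted(pmus))   # two PMUs suffice for this toy network
```

The greedy choice is not guaranteed minimal in general, which is why exact integer-programming formulations are preferred for real grids.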
Procedia PDF Downloads 409
1383 Theoretical Studies on the Formation Constant, Geometry, Vibrational Frequencies and Electronic Properties of Dinuclear Molybdenum Complexes
Authors: Mahboobeh Mohadeszadeh, Behzad Padidaran Moghaddam
Abstract:
In order to measure the formation constant of dinuclear molybdenum complexes, the reactants and products were first optimized separately, and their frequencies were then computed. Next, using Hartree-Fock (HF) and density functional theory (DFT) methods, theoretical studies of the geometrical parameters, electronic properties, and vibrational frequencies of the dinuclear molybdenum complex [C40H44Mo2N2O20] were carried out. These calculations were performed with the B3LYP, BPV86, B3PW91, and HF methods, using the LANL2DZ (for Mo) + 6-311G (for the other atoms) basis sets. To estimate the deviation between theoretical and experimental data, R-square, S-error, and RMS values were computed; according to these measures, the DFT methods agree more closely with the experimental data than the HF method. In addition, from the electronic structure of the compounds, the percentage contribution of atomic orbitals to the molecular orbitals, the atomic charges, the stabilization energies, and the HOMO and LUMO orbital energies were obtained.
Keywords: geometrical parameters, hydrogen bonding, electronic properties, vibrational frequencies
Procedia PDF Downloads 274
1382 Solubility of Water in CO2 Mixtures at Pipeline Operation Conditions
Authors: Mohammad Ahmad, Sander Gersen, Erwin Wilbers
Abstract:
Carbon capture, transport, and underground storage have become a major solution to reduce CO2 emissions from power plants and other large CO2 sources. A large part of this captured CO2 stream is transported at high-pressure dense phase conditions and stored in offshore underground depleted oil and gas fields. CO2 is also transported in offshore pipelines for use in enhanced oil and gas recovery. The captured CO2 stream with impurities may contain water, which causes severe corrosion problems and flow assurance failures and may damage valves and instrumentation. Thus, free water formation must be strictly prevented. The purpose of this work is to study the solubility of water in pure CO2 and in CO2 mixtures under real pipeline pressures (90-150 bar) and operating temperatures (5-35°C). A setup was constructed to generate experimental data. The results show that the solubility of water in CO2 mixtures increases with increasing temperature and/or pressure, and that a drop in water solubility is observed in the presence of impurities. The data generated were then used to assess the capabilities of two mixture models: the GERG-2008 model and the EOS-CG model. By generating these solubility data, this study contributes to determining the maximum allowable water content in CO2 pipelines.
Keywords: carbon capture and storage, water solubility, equations of state, fluids engineering
Procedia PDF Downloads 298
1381 Artificial Intelligence and Distributed System Computing: Application and Practice in Real Life
Authors: Lai Junzhe, Wang Lihao, Burra Venkata Durga Kumar
Abstract:
In recent years, owing to global technological advances, big data and artificial intelligence technologies have been widely used in various industries and fields, playing an important role in reducing costs and increasing efficiency. Through its continuous progress and the work of computing practitioners, artificial intelligence has derived a further branch: distributed artificial intelligence computing systems. Distributed AI is a method for solving complex learning, decision-making, and planning problems, characterized by its ability to exploit large-scale computation and the spatial distribution of resources; accordingly, it can handle problems with large data sets. Distributed AI is now widely used in military, medical, and everyday applications, bringing great convenience and efficient operation to daily life. In this paper, we discuss three areas of distributed AI computing systems, namely vision processing, blockchain, and the smart home, to introduce the performance of distributed systems and the role of AI in them.
Keywords: distributed system, artificial intelligence, blockchain, IoT, visual information processing, smart home
Procedia PDF Downloads 111
1380 Investigations of Protein Aggregation Using Sequence and Structure Based Features
Authors: M. Michael Gromiha, A. Mary Thangakani, Sandeep Kumar, D. Velmurugan
Abstract:
The main cause of several neurodegenerative diseases, such as Alzheimer's disease, Parkinson's disease, and the spongiform encephalopathies, is the formation of amyloid fibrils and plaques from proteins. We have analyzed different sets of proteins and peptides to understand the influence of sequence-based features on the protein aggregation process. The comparison of 373 pairs of homologous mesophilic and thermophilic proteins showed that aggregation-prone regions (APRs) are present in both; however, the thermophilic protein monomers show a greater ability to 'stow away' the APRs in their hydrophobic cores and protect them from solvent exposure. The comparison of amyloid-forming and amorphous β-aggregating hexapeptides suggested distinct preferences for specific residues at the six positions, as well as for all possible combinations of nine residue pairs. The compositions of residues at different positions, and of residue pairs, have been converted into energy potentials and utilized for distinguishing between amyloid-forming and amorphous β-aggregating peptides. Our method could correctly identify the amyloid-forming peptides with an accuracy of 95-100% on different datasets of peptides.
Keywords: aggregation, amyloids, thermophilic proteins, amino acid residues, machine learning techniques
Procedia PDF Downloads 613
1379 Comparison of Petrophysical Relationship for Soil Water Content Estimation at Peat Soil Area Using GPR Common-Offset Measurements
Authors: Nurul Izzati Abd Karim, Samira Albati Kamaruddin, Rozaimi Che Hasan
Abstract:
An appropriate petrophysical relationship is needed for soil water content (SWC) estimation, especially when using Ground Penetrating Radar (GPR). Ground penetrating radar is a geophysical tool that provides the SWC parameter indirectly. This paper examines the performance of several published petrophysical relationships in obtaining SWC estimates from in-situ GPR common-offset survey measurements, compared against gravimetric measurements, at a peat soil area. Gravimetric measurements were conducted in support of the GPR measurements for the accuracy assessment. Furthermore, GPR at two frequencies (250 MHz and 700 MHz) was used in the survey measurements to obtain the dielectric permittivity. Three empirical equations (Roth's equation, Schaap's equation, and Idi's equation) were selected for the study and used to compute the soil water content from the dielectric permittivity of the GPR profile. The results indicate that Schaap's equation provides a strong correlation between SWC as measured by the GPR data sets and the gravimetric measurements.
Keywords: common-offset measurements, ground penetrating radar, petrophysical relationship, soil water content
Procedia PDF Downloads 251
1378 Cross Project Software Fault Prediction at Design Phase
Authors: Pradeep Singh, Shrish Verma
Abstract:
Software fault prediction models are created using the source code, processed metrics from the same or a previous version of the code, and the related fault data. Some companies do not store and track all the artifacts required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects can be one potential solution; and the earlier we predict a fault, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can serve as an initial guideline for projects where no previous fault data are available. We analyze seven data sets from the NASA Metrics Data Program, which offer design as well as code metrics. Overall, the results of cross-project prediction are comparable to within-company learning.
Keywords: software metrics, fault prediction, cross project, within project
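The cross-project setup amounts to fitting a classifier on one project's module-level metrics and applying it to another project's modules. A minimal Gaussian Naïve Bayes sketch on hypothetical design metrics (the metric names, values, and labels below are invented for illustration; the paper uses NASA MDP data):

```python
import math
from collections import defaultdict

def fit_gnb(X, y):
    """Gaussian Naive Bayes: per-class prior plus per-feature mean/variance."""
    groups = defaultdict(list)
    for row, label in zip(X, y):
        groups[label].append(row)
    model = {}
    for label, rows in groups.items():
        stats = []
        for col in zip(*rows):
            mu = sum(col) / len(col)
            var = max(sum((v - mu) ** 2 for v in col) / len(col), 1e-9)
            stats.append((mu, var))
        model[label] = (len(rows) / len(X), stats)
    return model

def predict_gnb(model, row):
    """Pick the class with the highest log posterior (up to a constant)."""
    def log_post(prior, stats):
        ll = math.log(prior)
        for x, (mu, var) in zip(row, stats):
            ll += -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
        return ll
    return max(model, key=lambda lab: log_post(*model[lab]))

# Hypothetical design metrics (e.g. fan-in, fan-out, estimated complexity)
# from a "source" project ...
X_train = [[2, 3, 4], [1, 2, 3], [2, 2, 5], [9, 8, 12], [8, 9, 11], [10, 7, 13]]
y_train = [0, 0, 0, 1, 1, 1]   # 0 = fault-free, 1 = fault-prone
model = fit_gnb(X_train, y_train)
# ... applied cross-project to a module of a different "target" project.
print(predict_gnb(model, [9, 9, 12]))
```

A real cross-project study would additionally normalize the metric distributions, since source and target projects rarely share the same scale.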
Procedia PDF Downloads 341
1377 A Linear Programming Approach to Assist Roster Construction Under a Salary Cap
Authors: Alex Contarino
Abstract:
Professional sports leagues often have a "free agency" period, during which teams may sign players with expiring contracts. To promote parity, many leagues operate under a salary cap that limits the amount teams can spend on players' salaries in a given year. Similarly, in fantasy sports leagues, salary cap drafts are a popular method for selecting players. In order to sign a free agent in either setting, teams must bid against one another to buy the player's services while ensuring that the sum of their players' salaries stays below the salary cap. This paper models the bidding process for a free agent as a constrained optimization problem that can be solved using linear programming. The objective is to determine the largest bid that a team should offer the player, subject to the constraint that the value of signing the player must exceed the value of using the salary cap elsewhere. Iteratively solving this optimization problem for each available free agent provides teams with an effective framework for maximizing the talent on their rosters. The utility of this approach is demonstrated for team sport roster construction and fantasy sport drafts, using recent data sets from both settings.
Keywords: linear programming, optimization, roster management, salary cap
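For a single free agent the LP has one decision variable, so it solves in closed form. A sketch under assumed constraints (the value units, the "alternative value per cap dollar" parameter, and the numbers are hypothetical, not the paper's model):

```python
def max_bid(player_value, cap_space, alt_value_per_dollar=1.0):
    """Largest bid b satisfying
         b <= cap_space                         (stay under the cap)
         player_value >= b * alt_value_per_dollar
                              (signing must beat spending b elsewhere).
    With one variable, the linear program reduces to a min of two bounds."""
    b = min(cap_space, player_value / alt_value_per_dollar)
    return max(b, 0.0)

# Hypothetical numbers: a player worth 30 "win units", 25 units of cap
# space, and alternative signings returning 1.5 units per cap unit.
print(max_bid(player_value=30, cap_space=25, alt_value_per_dollar=1.5))  # 20.0
print(max_bid(player_value=45, cap_space=25, alt_value_per_dollar=1.5))  # 25.0
```

With many free agents and roster slots, the same constraints become a genuine multi-variable LP, which is where a solver earns its keep.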
Procedia PDF Downloads 110
1376 Alternatives to the Disposal of Sludge from Water and Wastewater Treatment Plants
Authors: Lima Priscila, Gianotto Raiza, Arruda Leonan, Magalhães Filho Fernando
Abstract:
Industrialization and, especially, the accentuated population growth in developing countries, combined with the lack of drainage, public cleaning, water, and sanitation services, have raised concern about the need to expand water and sewage treatment units. These units, however, generate by-products such as sludge. This paper investigates aspects of the operation and maintenance of sludge from a wastewater treatment plant (WWTP; 90 L.s-1) and two water treatment plants (WTPs; 1.4 m3.s-1 and 0.5 m3.s-1) with a view to proper disposal and reuse, evaluating their qualitative and quantitative characteristics against Brazilian legislation and standards. It was concluded that the sludge from the water treatment plants is directly related to the quality of the raw water collected, and that it is feasible to use it in construction materials and to dispose of it in the sewage system, improving the efficiency of the WWTP with respect to phosphorus precipitation (35% removal). WTP Lageado produced 55,726 kg/month of sludge, more than WTP Guariroba (29,336 kg/month), even though the flow of WTP Guariroba is 1,400 L.s-1 and that of WTP Lageado 500 L.s-1; this is explained by the raw water quality, which influences sludge production more than the flow does. The WWTP sludge has higher concentrations of organic material owing to its origin and could be used to improve soil fertility, crop production, and the recovery of degraded areas. The volume of sludge generated at the WWTP was 1,760 ton/month, with a solids content of 5.6% in the raw sludge, increasing to 23% in the dewatered sludge.
Keywords: disposal, sludge, water treatment, wastewater treatment
Procedia PDF Downloads 318
1375 Effective Governance through Mobile Phones: Cases Supporting the Introduction and Implementation
Authors: Mohd Mudasir Shafi, Zafrul Hasan, Talat Saleem
Abstract:
Information and communication technology (ICT) services have been described as a route to good governance. The introduction of ICT into governance has given rise to the idea of e-governance, which helps enhance transparency, accountability, and responsiveness in the system in order to provide faster, higher-quality service to the citizen. Advances in ICT have enabled governments all over the world to speed up the delivery of information and services to citizens and businesses and to increase their participation in governance. There has been a varying degree of success over the past decade in providing services to citizens using the internet and different web services, and these e-government initiatives have been extensively researched. Our research is aimed at the transition from electronic government to mobile government (m-government) initiatives implementing mobile services, and is concerned with understanding the major factors that aid the adoption and diffusion of these services. Research is needed on the integration process between e-government and m-government, and on the factors that could affect the transition; such factors differ between places, depending on the state of information technology available there. In this paper, we discuss why mobile communication systems can be used for effective e-governance and the areas where m-governance can be implemented. The paper examines some of the reasons for, as well as the main opportunities of, improving effective governance through mobile phones.
Keywords: e-governance, mobile phones, information technology, m-government
Procedia PDF Downloads 442
1374 Organizational Climate being Knowledge Sharing Oriented: A Fuzzy-Set Analysis
Authors: Paulo Lopes Henriques, Carla Curado
Abstract:
According to the literature, knowledge sharing behaviors are influenced by organizational values and structures, namely organizational climate. This manuscript examines the antecedents of a knowledge sharing oriented organizational climate. Following theoretical expectations, the study adopts the following explanatory conditions: knowledge sharing costs, knowledge sharing incentives, perceptions of knowledge sharing contributing to performance, and tenure. The study compares results across two groups of firms: non-digital (firms without an intranet) vs digital (firms with an intranet). The paper applies the fsQCA technique, using the fsQCA 2.5 software (www.fsqca.com), to test several conditional arguments explaining the outcome variable. The main results strengthen claims on the relevance of the contribution of knowledge sharing to performance. Second, the evidence brings tenure, an explanatory condition associated with organizational memory, to the spotlight. The study provides an original contribution not previously addressed in the literature: it identifies the sufficient condition sets for a knowledge sharing oriented organizational climate using fsQCA, which is, to our knowledge, a novel application of the technique.
Keywords: fsQCA, knowledge sharing oriented organizational climate, knowledge sharing costs, knowledge sharing incentives
Procedia PDF Downloads 327
1373 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data
Authors: S. Nickolas, Shobha K.
Abstract:
The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications, and they are an unavoidable problem in big data management and analysis. Numerous techniques, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary algorithms or optimized techniques, and hot deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling in missing values when it is necessary to use all records in the data rather than discard records with missing values. In this paper we propose a novel artificial neural network based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly. It carries out model-based clustering by using competitive learning and a self-steadying mechanism in a dynamic environment, without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing
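The imputation idea — assign each incomplete record to a cluster using only its observed attributes, then fill the gaps from that cluster's prototype — can be sketched without the ART2 machinery. In the sketch below the prototypes are fixed for illustration, whereas ART2 would learn and adapt them via competitive learning; the data are invented:

```python
def impute_by_cluster(rows, prototypes):
    """Assign each row to the nearest prototype using only its observed
    features, then fill each missing value (None) with the prototype's
    value for that feature."""
    def dist(row, proto):
        d = [(x - p) ** 2 for x, p in zip(row, proto) if x is not None]
        return sum(d) / len(d)   # mean squared distance over observed dims
    out = []
    for row in rows:
        proto = min(prototypes, key=lambda p: dist(row, p))
        out.append([p if x is None else x for x, p in zip(row, proto)])
    return out

# Two hypothetical cluster prototypes and three records with gaps.
prototypes = [[1.0, 2.0, 3.0], [10.0, 20.0, 30.0]]
data = [[1.1, None, 2.9], [None, 19.0, 31.0], [9.5, 21.0, None]]
filled = impute_by_cluster(data, prototypes)
print(filled)
```

Averaging the distance over observed dimensions only keeps rows with different missingness patterns comparable.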
Procedia PDF Downloads 274
1372 Urban Resilience and Its Prioritised Components: Analysis of Industrial Township Greater Noida
Authors: N. Mehrotra, V. Ahuja, N. Sridharan
Abstract:
Resilience is an all-hazard and proactive approach that requires multidisciplinary input into the interrelated variables of the city system. This research identifies and operationalizes indicators for assessment in the domains of institutions, infrastructure, and knowledge, all three operating in task-oriented community networks. The paper gives a brief account of the methodology developed for the assessment of urban resilience and its prioritized components for a target population within a newly planned urban complex integrating Surajpur and Kasna villages as nodes. People's perception of urban resilience has been examined through a questionnaire survey among the target population of Greater Noida. As defined by experts, the urban resilience of a place is considered to be both a product and a process of operation to regain normalcy after a disturbance of a certain level. Based on this methodology, six indicators are identified that contribute to the perception of urban resilience, both in the process of evolution and as an outcome, and the relative significance of the six R's has also been identified. The dependency factors of the various resilience indicators are explored in this paper, which helps generate a new perspective for future research in disaster management. The stated methodology can be applied to assess the urban resilience requirements of a well-planned town, which is not an end in itself but calls for new beginnings.
Keywords: disaster, resilience, system, urban
Procedia PDF Downloads 457
1371 Floodplain Modeling of River Jhelum Using HEC-RAS: A Case Study
Authors: Kashif Hassan, M.A. Ahanger
Abstract:
Floods have become more frequent and severe due to the effects of global climate change and human alteration of the natural environment. Flood prediction, forecasting, and control is one of the greatest challenges facing the world today. Floods are forecast using hydraulic models such as HEC-RAS, which are designed to simulate the flow processes of surface water. Extreme flood events on the river Jhelum, lasting from a day to a few days, are a major disaster in the State of Jammu and Kashmir, India. In the present study, the HEC-RAS model was applied to two reaches of the river Jhelum in order to estimate the flood levels corresponding to 25-, 50-, and 100-year return period flood events at important locations, and to deduce the flood vulnerability of important areas and structures. The flow rates for the two reaches were derived from a flood-frequency analysis of 50 years of historic peak flow data. Manning's roughness coefficient n was selected through detailed analysis. Rating curves were also generated to serve as the basis for determining the boundary conditions. Calibration and validation procedures were applied to ensure the reliability of the model, and a sensitivity analysis was performed to ensure the accuracy of Manning's n in generating water surface profiles.
Keywords: flood plain, HEC-RAS, Jhelum, return period
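A common way to derive return-period flows from annual peak statistics (the paper does not state which distribution it fits) is the Gumbel (EV1) frequency factor. A sketch with hypothetical peak-flow statistics, not the Jhelum record:

```python
import math

def gumbel_quantile(mean, std, T):
    """Return-period flood estimate from the Gumbel (EV1) distribution:
        Q_T = mean + K_T * std,
        K_T = -(sqrt(6)/pi) * (0.5772 + ln(ln(T / (T - 1)))).
    `mean` and `std` are the sample mean and standard deviation of the
    annual peak flows; T is the return period in years."""
    k = -(math.sqrt(6) / math.pi) * (0.5772 + math.log(math.log(T / (T - 1.0))))
    return mean + k * std

# Hypothetical annual-peak statistics (m3/s) from a 50-year record.
mean_q, std_q = 900.0, 250.0
for T in (25, 50, 100):
    print(T, round(gumbel_quantile(mean_q, std_q, T), 1))
```

The three resulting discharges would then be run through HEC-RAS as steady-flow profiles to obtain the corresponding water surface elevations.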
Procedia PDF Downloads 425
1370 A Semiotic Approach to the Construction of Classical Identity in Indian Classical Music Videos
Authors: Jayakrishnan Narayanan, Sengamalam Periyasamy Dhanavel
Abstract:
Indian classical (Karnatik) music videos across various media platforms have followed an audio-visual pattern that conforms to the genre's socio-cultural and quasi-religious identity. The present paper analyzes the semiotic variations between 'pure Karnatik music videos' and 'independent/contemporary-collaborative music videos' posted on social media by young professional Karnatik musicians. The paper analyzes these media texts by comparing their structural sememes, namely the title, artists, music, narrative schemata, visuals, lighting, sound, and costumes. It argues that the pure Karnatik music videos are marked by the presence of certain recurring mythological or third-level signifiers, and that these signifiers and codes are marked by their conspicuous absence in the independent music videos produced by the same musicians. While the music and the musical instruments used in both sets of music videos remain similar, the meaning abducted by the beholder in each case is entirely different. The paper also studies the identity conflicts projected through these music videos and the extent to which the cultural connotations of Karnatik music govern the production of its music videos.
Keywords: abduction, identity, media semiotics, music video
Procedia PDF Downloads 219
1369 Influence of Optical Fluence Distribution on Photoacoustic Imaging
Authors: Mohamed K. Metwally, Sherif H. El-Gohary, Kyung Min Byun, Seung Moo Han, Soo Yeol Lee, Min Hyoung Cho, Gon Khang, Jinsung Cho, Tae-Seong Kim
Abstract:
Photoacoustic imaging (PAI) is a non-invasive and non-ionizing imaging modality that combines the absorption contrast of light with ultrasound resolution. A laser is used to deposit optical energy into a target (i.e., optical fluence). Consequently, the target temperature rises and thermal expansion occurs, generating a PA signal. In general, most image reconstruction algorithms for PAI assume uniform fluence within the imaged object; however, the optical fluence distribution within the object is known to be non-uniform, which could affect the reconstruction of PA images. In this study, we have investigated the influence of the optical fluence distribution on PA back-propagation imaging using the finite element method. The uniform fluence was simulated as a triangular waveform within the object of interest, while the non-uniform fluence distribution was estimated by solving light propagation within a tissue model via the Monte Carlo method. The results show that the PA signal in the non-uniform case is 23% wider than in the uniform case, and its frequency spectrum is missing some high-frequency components present in the uniform case. Consequently, the image reconstructed under non-uniform fluence exhibits a strong smoothing effect.
Keywords: finite element method, fluence distribution, Monte Carlo method, photoacoustic imaging
Procedia PDF Downloads 376
1368 Technique and Use of Machine Readable Dictionary: In Special Reference to Hindi-Marathi Machine Translation
Authors: Milind Patil
Abstract:
This paper discusses Hindi-Marathi morphological analysis and the generation of rules for machine translation on the basis of a Machine Readable Dictionary (MRD). Transformational Generative Grammar (TGG) rules were used to design the MRD. Under these rules, the suffix of a particular root word is based on its tense, aspect, modality, and voice, which is why the suffix is central to the word (or root) meaning. Hindi and Marathi both belong to the Indo-Aryan language family; both are derived from Sanskrit, and their script is Devanagari. Nevertheless, there are many differences between them at the semantic and grammatical levels. Marathi has three genders, but Hindi has only two (masculine and feminine); the neuter gender is absent in Hindi. Other grammatical categories likewise differ in their use. For the MRD, the suffixes (or morphemes) of a particular root word for gender, number, and person (GNP) are based on natural phenomena: a particular suffix or morpheme changes according to person, number, and gender, and the design of the MRD follows this format. First, person, number, gender, and tense are the key points, followed by the root words and the suffix for the particular person, number, and gender (PNG). After that, inferences are drawn on the basis of the rule (V.stem) (Pre.T/Past.T) (x) + (Aux-Pre.T) (x) → (V.Stem.) + (SP.TM) (X).
Keywords: MRD, TGG, stem, morph, morpheme, suffix, PNG, TAM&V, root
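The MRD lookup described — root plus a suffix selected by tense/aspect/mood and gender-number-person — can be sketched as a nested table. The table below is a simplified, illustrative fragment (the actual MRD structure and suffix inventory are the paper's, not reproduced here):

```python
# A toy Machine Readable Dictionary fragment: suffixes indexed first by
# tense and then by (gender, number, person), as the paper describes.
# The forms are simplified Marathi present-tense endings.
suffix_table = {
    "present": {
        ("m", "sg", 3): "to",    # e.g. "lihito"  ((he) writes)
        ("f", "sg", 3): "te",    # e.g. "lihite"  ((she) writes)
        ("n", "sg", 3): "te",
        ("m", "pl", 3): "tat",   # e.g. "lihitat" ((they) write)
    },
}

def generate_form(stem, tense, gender, number, person):
    """Root + TAM/PNG-selected suffix -> inflected surface form."""
    suffix = suffix_table[tense][(gender, number, person)]
    return stem + suffix

print(generate_form("lihi", "present", "m", "sg", 3))   # lihito
```

A full MRD would add layers for aspect, modality, and voice, and a reverse lookup for analysis (stripping the suffix to recover the root and its features).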
Procedia PDF Downloads 323
1367 A Genetic Algorithm for the Load Balance of Parallel Computational Fluid Dynamics Computation with Multi-Block Structured Mesh
Authors: Chunye Gong, Ming Tie, Jie Liu, Weimin Bao, Xinbiao Gan, Shengguo Li, Bo Yang, Xuguang Chen, Tiaojie Xiao, Yang Sun
Abstract:
Large-scale CFD simulation relies on high-performance parallel computing, and load balance is the key factor affecting parallel efficiency. This paper focuses on the load-balancing problem of parallel CFD simulation with structured meshes. A mathematical model for this load-balancing problem is presented. The genetic algorithm, the fitness computation, and a two-level coding are designed, together with an optimal selector, a robust operator, and a local optimization operator. The properties of the presented genetic algorithm are discussed in depth, and the effects of the optimal selector, the robust operator, and the local optimization operator are demonstrated by experiments. The experimental results on different test sets, DLR-F4, and aircraft design applications show that the presented load-balancing algorithm is robust, converges quickly, and is useful in real engineering problems.
Keywords: genetic algorithm, load-balancing algorithm, optimal variation, local optimization
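The underlying optimization — assign mesh blocks to processors so that the maximum processor load stays close to the average — can be illustrated with a bare-bones GA. This sketch uses plain tournament selection, one-point crossover, and point mutation on invented block weights; it omits the paper's two-level coding, optimal selector, robust operator, and local optimization operator:

```python
import random

def fitness(assign, weights, nproc):
    """Load imbalance: max processor load over mean load (1.0 is ideal)."""
    loads = [0.0] * nproc
    for block, proc in enumerate(assign):
        loads[proc] += weights[block]
    return max(loads) / (sum(loads) / nproc)

def ga_balance(weights, nproc, pop=30, gens=200, seed=1):
    """Minimal GA: a chromosome maps each mesh block to a processor."""
    rng = random.Random(seed)
    n = len(weights)
    popu = [[rng.randrange(nproc) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            p1 = min(rng.sample(popu, 2), key=lambda c: fitness(c, weights, nproc))
            p2 = min(rng.sample(popu, 2), key=lambda c: fitness(c, weights, nproc))
            cut = rng.randrange(1, n)
            child = p1[:cut] + p2[cut:]       # one-point crossover
            if rng.random() < 0.2:            # point mutation
                child[rng.randrange(n)] = rng.randrange(nproc)
            nxt.append(child)
        popu = nxt
    return min(popu, key=lambda c: fitness(c, weights, nproc))

# Hypothetical per-block costs (cell counts) for a 10-block mesh on 4 ranks.
weights = [8, 3, 5, 7, 2, 9, 4, 6, 1, 5]
best = ga_balance(weights, nproc=4)
print(round(fitness(best, weights, 4), 3))
```

For this instance the ideal mean load is 12.5 per rank, so a fitness just above 1.0 indicates a near-balanced partition.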
Procedia PDF Downloads 181
1366 Transient Analysis and Mitigation of Capacitor Bank Switching on a Standalone Wind Farm
Authors: Ajibola O. Akinrinde, Andrew Swanson, Remy Tiako
Abstract:
Significant losses exist on transmission lines due to distance, as power generating stations can be located far from isolated settlements. Standalone wind farms can be a good alternative source of power generation for settlements far from the grid, whether because of long distances or socio-economic constraints. However, uncompensated wind farms consume reactive power, since wind turbines are induction generators. Capacitor banks are therefore used to compensate reactive power, which in turn improves the voltage profile of the network. Although capacitor banks help improve the voltage profile, they also undergo switching actions as they respond to variations in the loads at the consumer's end. These switching activities can cause transient overvoltages on the network, shortening the service life of other equipment on the system. In this paper, the overvoltage caused by these switching activities is investigated using the IEEE 14-bus network to represent a standalone wind farm, with simulations carried out in the ATP/EMTP software. Scenarios involving a pre-insertion resistor, a pre-insertion inductor, and controlled switching were also examined in order to determine the best mitigation option for reducing the overvoltage.
Keywords: capacitor banks, IEEE bus 14-network, pre-insertion resistor, standalone wind farm
Procedia PDF Downloads 440
1365 The Impact of Technology on Sales Researches and Distribution
Authors: Nady Farag Faragalla Hanna
Abstract:
In the car dealership industry in Japan, the sales specialist is a key factor in the success of the company. I hypothesize that when a company understands the characteristics of sales professionals in its industry, it can recruit and train salespeople more effectively. Lean human resource management ensures the economic success and performance of companies, especially small and medium-sized companies. The purpose of this article is to determine the characteristics of sales specialists at small and medium-sized car dealerships using the chi-square test and the proximate variable model. The results show that career-change experience, learning ability, and product knowledge are important, while a university education, career building through internal transfer, leadership experience, and people development are not important for becoming a sales professional. I also show that the characteristic traits of sales specialists are perseverance, humility, improvisation, and passion for the business.
Keywords: electronics engineering, marketing, sales, e-commerce digitalization, interactive systems, sales process, ARIMA models, sales demand forecasting, time series, R code, traits of sales professionals, variable precision rough sets theory, sales professional, sales professionals
Procedia PDF Downloads 51
1364 Efficient Tuning Parameter Selection by Cross-Validated Score in High Dimensional Models
Authors: Yoonsuh Jung
Abstract:
As DNA microarray data contain a relatively small sample size compared to the number of genes, high dimensional models are often employed. In high dimensional models, the selection of the tuning parameter (or penalty parameter) is often one of the crucial parts of the modeling. Cross-validation is one of the most common methods for tuning parameter selection, which selects the parameter value with the smallest cross-validated score. However, selecting a single value as the "optimal" value for the parameter can be very unstable due to sampling variation, since the sample sizes of microarray data are often small. Our approach is to choose multiple candidates for the tuning parameter first, then average the candidates with different weights depending on their performance. The additional steps of estimating the weights and averaging the candidates rarely increase the computational cost, while they can considerably improve on traditional cross-validation. We show, on real and simulated data sets, that the values selected by the suggested methods often lead to more stable parameter selection as well as improved detection of significant genetic variables compared to traditional cross-validation.
Keywords: cross validation, parameter averaging, parameter selection, regularization parameter search
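The averaging idea can be sketched in a few lines. The choices below of ridge regression as the high-dimensional model and of inverse-CV-error weights are illustrative assumptions, since the abstract specifies neither:

```python
import numpy as np

def cv_scores(X, y, lambdas, k=5, seed=0):
    """K-fold cross-validated MSE of ridge regression at each candidate lambda."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    scores = np.zeros(len(lambdas))
    for j, lam in enumerate(lambdas):
        errs = []
        for f in range(k):
            test = folds[f]
            train = np.concatenate([folds[g] for g in range(k) if g != f])
            p = X.shape[1]
            # ridge solution on the training fold: (X'X + lam*I)^-1 X'y
            beta = np.linalg.solve(X[train].T @ X[train] + lam * np.eye(p),
                                   X[train].T @ y[train])
            errs.append(np.mean((X[test] @ beta - y[test]) ** 2))
        scores[j] = np.mean(errs)
    return scores

def averaged_lambda(lambdas, scores):
    """Average the candidates, weighting each by inverse CV error,
    instead of keeping only the single minimizer."""
    w = 1.0 / np.asarray(scores)
    w /= w.sum()
    return float(np.sum(w * np.asarray(lambdas)))
```

Because the averaged value pools information from all candidates, a single unlucky fold split moves it far less than it moves the argmin of the CV curve.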
Procedia PDF Downloads 413
1363 Parameter Selection for Computationally Efficient Use of the BFVrns Fully Homomorphic Encryption Scheme
Authors: Cavidan Yakupoglu, Kurt Rohloff
Abstract:
In this study, we aim to provide a novel parameter selection model for the BFVrns scheme, one of the prominent FHE schemes. Parameter selection in lattice-based FHE schemes is a practical challenge for experts and non-experts alike. Towards a solution to this problem, we introduce a hybrid, principles-based approach that combines theoretical and experimental analyses. To begin, we use regression analysis to examine the effect of the parameters on performance and security. The fact that the FHE parameters induce different behaviors in performance, security, and the Ciphertext Expansion Factor (CEF) makes the process of parameter selection more challenging. To address this issue, we use a multi-objective optimization algorithm to select the optimum parameter set for performance, CEF, and security at the same time. As a result of this optimization, we get an improved parameter set with better performance at a given security level, ensuring correctness and security against lattice attacks by providing at least 128-bit security. Our result enables, on average, a ~5x smaller CEF and mostly better performance in comparison to the parameter sets given in [1]. This approach can be considered a semi-automated parameter selection. These studies were conducted using the PALISADE homomorphic encryption library, a well-known HE library.
Keywords: lattice cryptography, fully homomorphic encryption, parameter selection, LWE, RLWE
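The multi-objective step can be illustrated with a generic Pareto filter. The candidate triples below (runtime, CEF, negated security bits, so that lower is better everywhere) are invented for illustration; the abstract does not disclose the actual optimization algorithm or scores:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective (lower is better)
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only the non-dominated parameter sets; a final choice is then
    made among these trade-offs."""
    return [c for c in candidates if not any(dominates(o, c) for o in candidates)]

# Hypothetical candidate parameter sets scored as
# (runtime in ms, ciphertext expansion factor, negated security bits).
candidates = [(12.0, 8.0, -128), (9.0, 10.0, -128), (15.0, 9.0, -128)]
front = pareto_front(candidates)  # the third set is dominated by the first
```

Negating the security bits turns "maximize security" into the same lower-is-better convention as the other two objectives.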
Procedia PDF Downloads 152
1362 MapReduce Logistic Regression Algorithms with RHadoop
Authors: Byung Ho Jung, Dong Hoon Lim
Abstract:
Logistic regression is a statistical method for analyzing a dataset in which one or more independent variables determine an outcome. Logistic regression is used extensively in numerous disciplines, including the medical and social science fields. In this paper, we address the problem of estimating the parameters of logistic regression in the MapReduce framework with RHadoop, which integrates R with the Hadoop environment and is applicable to large-scale data. Three learning algorithms exist for logistic regression, namely the gradient descent method, the cost minimization method, and the Newton-Raphson method. The Newton-Raphson method does not require a learning rate, while the gradient descent and cost minimization methods need a manually picked learning rate. The experimental results demonstrate that our learning algorithms using RHadoop can scale well and efficiently process large data sets on commodity hardware. We also compared the performance of our Newton-Raphson method with the gradient descent and cost minimization methods. The results showed that our Newton-Raphson method appeared to be the most robust across all data tested.
Keywords: big data, logistic regression, MapReduce, RHadoop
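The contrast between the two estimators can be sketched in plain NumPy; this is a generic single-machine illustration of the algorithms named in the abstract, not the RHadoop/MapReduce implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_newton(X, y, iters=15):
    """Newton-Raphson: uses the Hessian, so no learning rate is needed."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = sigmoid(X @ beta)
        w = mu * (1.0 - mu)                  # IRLS weights on the Hessian diagonal
        grad = X.T @ (y - mu)                # gradient of the log-likelihood
        hess = X.T @ (X * w[:, None])        # observed information matrix
        beta = beta + np.linalg.solve(hess, grad)
    return beta

def logistic_gd(X, y, lr=0.5, iters=5000):
    """Gradient ascent on the log-likelihood: the rate lr must be hand-picked,
    and a bad choice makes it diverge or crawl."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        beta += lr * X.T @ (y - sigmoid(X @ beta)) / len(y)
    return beta
```

On well-conditioned data both reach the same maximum-likelihood estimate; Newton-Raphson typically does so in a handful of iterations, which is one reason it can be the most robust choice when no learning rate can be tuned per dataset.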
Procedia PDF Downloads 280
1361 Stack Overflow Detection and Prevention on Operating Systems Using Machine Learning and Control-Flow Enforcement Technology
Authors: Cao Jiayu, Lan Ximing, Huang Jingjia, Burra Venkata Durga Kumar
Abstract:
The first virus to attack personal computers appeared in early 1986. Called C-Brain, it was written by a pair of Pakistani brothers. In those days, people still used DOS systems, manipulating computers with the most basic command lines. In the 21st century, computer performance has grown geometrically, but computer viruses are also evolving and escalating, and we never stop fighting against security problems. Stack overflow is one of the most common security vulnerabilities in operating systems. It can cause serious security issues for an operating system if a vulnerable program runs with administrator privileges. Certain viruses change the value of specific memory locations through a stack overflow, allowing computers to run harmful programs. This study developed a mechanism to detect and respond in time whenever a stack overflow occurs. We demonstrate the effectiveness of standard machine learning algorithms and control-flow enforcement techniques in predicting computer OS security by generating suspicious vulnerability functions (SVFS) and associated suspect areas (SAS). The method can minimize the possibility of stack overflow attacks occurring.
Keywords: operating system, security, stack overflow, buffer overflow, machine learning, control-flow enforcement technology
Procedia PDF Downloads 113
1360 A Model of Foam Density Prediction for Expanded Perlite Composites
Authors: M. Arifuzzaman, H. S. Kim
Abstract:
Multiple sets of variables associated with expanded perlite particle consolidation in foam manufacturing were analyzed to develop a model for predicting perlite foam density. The consolidation of perlite particles based on the flotation method and compaction involves numerous variables leading to the final perlite foam density. The variables include binder content, compaction ratio, perlite particle size, various perlite particle densities and porosities, and various volumes of perlite at different stages of the process. The developed model was found to be useful not only for prediction of foam density but also for optimization between compaction ratio and binder content to achieve a desired density. Experimental verification was conducted using a range of foam densities (0.15–0.5 g/cm³) produced with a range of compaction ratios (1.5–3.5), a range of sodium silicate contents (0.05–0.35 g/ml) in dilution, a range of expanded perlite particle sizes (1–4 mm), and various perlite densities (such as skeletal, material, bulk, and envelope densities). Close agreement between the predictions and the experimental results was found.
Keywords: expanded perlite, flotation method, foam density, model, prediction, sodium silicate
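The optimization use of such a model (choosing a binder content to hit a target density at a fixed compaction ratio) can be sketched as follows. The linear density function below is an invented monotonic stand-in, not the paper's model; only the variable names and the experimental ranges come from the abstract:

```python
def foam_density(binder_content, compaction_ratio, perlite_bulk_density=0.1):
    """Hypothetical stand-in for a density model (g/cm^3): density rises
    monotonically with binder content (g/ml of dilution) and with
    compaction ratio. The functional form and coefficients are invented."""
    return perlite_bulk_density * compaction_ratio + 0.8 * binder_content

def binder_for_density(target, compaction_ratio, lo=0.05, hi=0.35, tol=1e-6):
    """Bisection on binder content to reach a target foam density at a fixed
    compaction ratio, exploiting the model's monotonicity. The bounds lo/hi
    are the experimental sodium silicate range from the abstract."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if foam_density(mid, compaction_ratio) < target:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

Any model that is monotonic in binder content admits this kind of inversion, which is what makes the compaction-ratio/binder-content trade-off tractable.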
Procedia PDF Downloads 406
1359 Suppression Subtractive Hybridization Technique for Identification of the Differentially Expressed Genes
Authors: Tuhina-khatun, Mohamed Hanafi Musa, Mohd Rafii Yosup, Wong Mui Yun, Aktar-uz-Zaman, Mahbod Sahebi
Abstract:
The suppression subtractive hybridization (SSH) method is a valuable tool for identifying differentially regulated genes, whether disease-specific or tissue-specific, that are important for cellular growth and differentiation. It is a widely used method for separating DNA molecules that distinguish two closely related DNA samples. SSH is one of the most powerful and popular methods for generating subtracted cDNA or genomic DNA libraries. It is based primarily on a suppression polymerase chain reaction (PCR) technique and combines normalization and subtraction in a single procedure. The normalization step equalizes the abundance of DNA fragments within the target population, and the subtraction step excludes sequences that are common to the populations being compared. This dramatically increases the probability of obtaining low-abundance differentially expressed cDNAs or genomic DNA fragments and simplifies analysis of the subtracted library. The SSH technique is applicable to many comparative and functional genetic studies for the identification of disease-specific, developmental, tissue-specific, or other differentially expressed genes, as well as for the recovery of genomic DNA fragments distinguishing the samples under comparison.
Keywords: suppression subtractive hybridization, differentially expressed genes, disease specific genes, tissue specific genes
Procedia PDF Downloads 431
1358 Implementation of a Web-Based Clinical Outcomes Monitoring and Reporting Platform across the Fortis Network
Authors: Narottam Puri, Bishnu Panigrahi, Narayan Pendse
Abstract:
Background: Clinical outcomes are the globally agreed-upon, evidence-based, measurable changes in health or quality of life resulting from patient care. Reporting of outcomes and their continuous monitoring provides an opportunity for both assessing and improving the quality of patient care. In 2012, the International Consortium for Health Outcomes Measurement (ICHOM) was founded, which has defined global Standard Sets for measuring the outcomes of various treatments. Method: Monitoring of clinical outcomes was identified as a pillar of Fortis' core value of Patient Centricity. The project was started as an in-house Clinical Outcomes Reporting Portal developed by the Fortis Medical IT team, using the Standard Sets of outcome measurement developed by ICHOM. A pilot was run at Fortis Escorts Heart Institute from Aug '13 to Dec '13. Starting Jan '14, it was implemented across 11 hospitals of the group. The scope was hospital-wide, and the major clinical specialties Cardiac Sciences and Orthopedics & Joint Replacement were covered. The internally developed portal had limitations in report generation, and capturing of patient-reported outcomes was restricted. A year later, the company provisioned an ICHOM-certified software product providing a platform for data capture and reporting that ensures compliance with all ICHOM requirements. A year after the launch of the software, Fortis Healthcare became the first healthcare provider in Asia to publish clinical outcomes data for the Coronary Artery Disease Standard Set (comprising Coronary Artery Bypass Graft and Percutaneous Coronary Interventions) in the public domain (Jan 2016). Results: This project has helped firmly establish a culture of monitoring and reporting clinical outcomes across Fortis hospitals.
Given the diverse nature of the healthcare delivery model at the Fortis network, which comprises hospitals of varying size and specialty mix and practically covers the entire span of the country, standardization of the data collection and reporting methodology is a huge achievement in itself. 95% case reporting was achieved, with more than 90% data completion, at the end of Phase 1 (March 2016). Post implementation, the group now has one year of data from its own hospitals. This has helped identify gaps, plan ways to bridge them, and establish internal benchmarks for continual improvement. The value created for the group also includes: 1. The entire Fortis community has been sensitized to the importance of clinical outcomes monitoring for patient-centric care; initial skepticism and cynicism have been countered by effective stakeholder engagement and automation of processes. 2. Measuring quality is the first step in improving quality; data analysis has helped compare clinical results with best-in-class hospitals and identify improvement opportunities. 3. The clinical fraternity is extremely pleased to be part of this initiative and has taken ownership of the project. Conclusion: Fortis Healthcare is a pioneer in the monitoring of clinical outcomes. Implementation of ICHOM standards has helped the Fortis Clinical Excellence Program improve patient engagement and strengthen its commitment to its core value of Patient Centricity. Validation and certification of the clinical outcomes data by an ICHOM-certified supplier adds confidence to its claim of being a leader in this space.
Keywords: clinical outcomes, healthcare delivery, patient centricity, ICHOM
Procedia PDF Downloads 236
1357 Challenges of Technical and Engineering Students in the Application of Scientific Cancer Knowledge to Preserve the Future Generation in Sub-Saharan Africa
Authors: K. Shaloom Mbambu, M. Pascal Tshimbalanga, K. Ruth Mutala, K. Roger Kabuya, N. Dieudonné Kabeya, Y. L. Kabeya Mukeba
Abstract:
In this article, the authors examine the even more worrying situation of girls in sub-Saharan Africa. Two girls in five are deprived of education, which represents a real loss to the development of communities and countries. Cultural traditions, poverty, violence, early and forced marriages, early pregnancies, and many other gender inequalities are the causes of this cancer's development. Indeed, there is no more effective development tool than educating girls. The non-schooling of girls and their lack of training for the liberal professions have serious consequences for the life of each of them. Relative to men, this inferior status exposes girls to poverty and health risks. The remedies include raising awareness among parents and communities of the importance of girls' education, improving children's access to school, promoting girl-boy equality and girls' rights, creating income-generating activities for girls, and teaching girls liberal trades to make them self-sufficient. Organizations such as the United Nations can help save these children. ASEAD and the AEDA group predict the impact of this cancer on the development of a nation; the future generation must be preserved.
Keywords: young girl, Sub-Saharan Africa, higher and vocational education, development, society, environment
Procedia PDF Downloads 252