Search results for: large firms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7626


5556 Chinese Event Detection Technique Based on Dependency Parsing and Rule Matching

Authors: Weitao Lin

Abstract:

To quickly extract adequate information from large-scale unstructured text data, this paper studies the representation of events in Chinese scenarios and performs a regularized abstraction of them. It proposes a Chinese event detection technique based on dependency parsing and rule matching. The method first performs dependency parsing on the original utterance, then performs pattern matching at the word or phrase granularity based on the dependency parse, filters out utterances with prominent non-event characteristics, and obtains the final results. The experimental results show the effectiveness of the method.
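
The matching step can be sketched with a toy, self-contained example. The paper's actual rule set and parser output format are not given in the abstract; the tuple representation, the POS/relation labels, and the single verb-subject-object rule below are illustrative assumptions:

```python
# Hypothetical sketch of rule matching over a dependency parse.
# Each token is (index, word, pos, head_index, relation); one rule
# fires when a verb governs both a nominal subject and an object.

def has_event(parse):
    """Return True if some verb has both an nsubj and an obj dependent."""
    verbs = {i for i, _, pos, _, _ in parse if pos == "VERB"}
    deps = {}  # head index -> set of relations attached to it
    for i, _, _, head, rel in parse:
        deps.setdefault(head, set()).add(rel)
    return any({"nsubj", "obj"} <= deps.get(v, set()) for v in verbs)

# Toy parse of "The committee approved the merger" (0-based indices,
# head -1 marks the root).
parse = [
    (0, "The", "DET", 1, "det"),
    (1, "committee", "NOUN", 2, "nsubj"),
    (2, "approved", "VERB", -1, "root"),
    (3, "the", "DET", 4, "det"),
    (4, "merger", "NOUN", 2, "obj"),
]
print(has_event(parse))  # True: "approved" has both a subject and an object
```

A real system would obtain the parse from a Chinese dependency parser and apply a battery of such patterns, keeping utterances with no event-like structure out of the result set.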

Keywords: natural language processing, Chinese event detection, rule matching, dependency parsing

Procedia PDF Downloads 125
5555 Predictive Modelling of Aircraft Component Replacement Using Imbalanced Learning and Ensemble Method

Authors: Dangut Maren David, Skaf Zakwan

Abstract:

Adequate monitoring of vehicle components in order to obtain high uptime is the goal of predictive maintenance. A major challenge faced by businesses across industries is the significant cost associated with delays in service delivery due to system downtime. Most businesses are interested in predicting these problems and proactively preventing them before they occur, which is the core advantage of Prognostic Health Management (PHM) applications. The recent emergence of Industry 4.0, or the Industrial Internet of Things (IIoT), has created the need to monitor system activities and to enhance system-to-system or component-to-component interactions; this has resulted in the generation of large volumes of data known as big data. Analysis of big data is increasingly important; however, due to complexity inherent in the dataset, such as imbalanced classification problems, it becomes extremely difficult to build a model with high precision. Data-driven predictive modeling for condition-based maintenance (CBM) has recently drawn research interest, with growing attention from both academia and industry. The large volumes of data generated by industrial processes inherently come with different degrees of complexity, which pose a challenge for analytics. In particular, the imbalanced classification problem exists pervasively in industrial datasets and can degrade the performance of learning algorithms, yielding poor classifier accuracy during model development. Misclassification of faults can result in unplanned breakdowns, leading to economic loss.
In this paper, an advanced approach for handling the imbalanced classification problem is proposed, and a prognostic model for predicting aircraft component replacement in advance is developed by exploring historical aircraft data. The approach is based on a hybrid ensemble method that improves the prediction of the minority class during learning; we also investigate the impact of our approach on the multiclass imbalance problem. We validate the feasibility and effectiveness of our approach using real-world aircraft operation and maintenance datasets spanning over 7 years. Our approach shows better performance compared to other similar approaches. We also validate its strength in handling multiclass imbalanced datasets, where our results again show good performance compared to other baseline classifiers.
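
As a minimal illustration of two ingredients of such an approach (the paper's hybrid method is more elaborate and is not specified in the abstract), the sketch below balances classes by random oversampling of the minority class and combines model outputs by majority vote:

```python
import random
from collections import Counter

def oversample(X, y, seed=0):
    """Duplicate minority-class samples until all classes are balanced."""
    rng = random.Random(seed)
    counts = Counter(y)
    target = max(counts.values())
    X_out, y_out = list(X), list(y)
    for label, n in counts.items():
        pool = [x for x, lab in zip(X, y) if lab == label]
        for _ in range(target - n):
            X_out.append(rng.choice(pool))
            y_out.append(label)
    return X_out, y_out

def vote(predictions):
    """Majority vote over per-model predictions for one sample."""
    return Counter(predictions).most_common(1)[0][0]

# Toy data: "replace" is the rare (minority) outcome.
X = [[0.1], [0.2], [0.9], [0.8], [0.7], [0.6]]
y = ["replace", "replace", "keep", "keep", "keep", "keep"]
Xb, yb = oversample(X, y)
print(sorted(Counter(yb).items()))  # [('keep', 4), ('replace', 4)]
print(vote(["keep", "replace", "keep"]))  # keep
```

Production systems typically pair resampling with learned base classifiers rather than fixed votes, but the structure is the same: rebalance, train several models, aggregate.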

Keywords: prognostics, data-driven, imbalance classification, deep learning

Procedia PDF Downloads 162
5554 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes a deep learning-based barcode detection using synthetic-to-real data augmentation. We first augment barcodes themselves; we then augment images containing the barcodes to generate a large variety of data that is close to the actual shooting environments. Comparisons with previous works and evaluations with our original data show that this approach achieves state-of-the-art performance in various real images. In addition, the system uses hybrid resolution for barcode “scan” and is applicable to real-time applications.
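
The two-stage idea (augment the barcode itself, then the image containing it) can be sketched as follows; the toy grayscale pixels and the specific transforms (polarity inversion, brightness shift) are illustrative assumptions, not the paper's actual pipeline:

```python
import random

def augment_barcode(barcode, rng):
    # Stage 1: randomly invert the barcode's black/white polarity.
    if rng.random() < 0.5:
        return [[255 - px for px in row] for row in barcode]
    return barcode

def augment_scene(image, rng):
    # Stage 2: global brightness shift, clipped to the valid range.
    shift = rng.randint(-40, 40)
    return [[min(255, max(0, px + shift)) for px in row] for row in image]

rng = random.Random(42)
barcode = [[0, 255, 0, 255]]  # one toy row of bars
sample = augment_scene(augment_barcode(barcode, rng), rng)
print(all(0 <= px <= 255 for row in sample for px in row))  # True
```

Real pipelines would add geometric warps, blur, and compositing onto background photographs, so that the synthetic distribution approaches actual shooting conditions.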

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 154
5553 Green Organic Chemistry, a New Paradigm in Pharmaceutical Sciences

Authors: Pesaru Vigneshwar Reddy, Parvathaneni Pavan

Abstract:

Green organic chemistry, one of the most actively researched topics today, has been in demand since the 1990s. Organic chemicals are important starting materials for a large number of major chemical industries. The production of organic chemicals as raw materials or reagents for other applications is a major manufacturing sector, covering polymers, pharmaceuticals, pesticides, paints, artificial fibers, food additives, etc. Organic synthesis on a large scale, compared to the laboratory scale, involves the use of energy, basic chemical ingredients from the petrochemical sector, and catalysts; after the end of the reaction come separation, purification, storage, packing, distribution, etc. During these processes there are many health and safety problems for workers, in addition to the environmental problems caused by the use and deposition of chemicals as waste. Green chemistry, with its 12 principles, seeks to change the conventional ways that have been used for decades to make synthetic organic chemicals, starting with the use of less toxic starting materials. Green chemistry aims to increase the efficiency of synthetic methods, to use less toxic solvents, to reduce the number of stages in synthetic routes, and to minimize waste as far as practically possible. In this way, organic synthesis becomes part of the effort toward sustainable development. Green chemistry also drives research and innovation on many practical aspects of organic synthesis in university and institutional research laboratories. By changing the methodologies of organic synthesis, health and safety will be advanced not only at the small-scale laboratory level but also in industrial large-scale production processes through new techniques.
Three key developments in green chemistry include the use of supercritical carbon dioxide as a green solvent, aqueous hydrogen peroxide as an oxidising agent, and the use of hydrogen in asymmetric synthesis. Green chemistry also focuses on replacing traditional methods of heating with modern methods such as microwave heating, so that the carbon footprint is reduced as far as possible. A further benefit is reduced environmental pollution through the use of less toxic reagents, the minimization of waste, and more biodegradable by-products. This paper considers some of the basic principles, approaches, and early achievements of green chemistry, together with a summary of its principles. The E-factor, the old and new syntheses of ibuprofen, microwave techniques, and some recent advancements are also discussed.

Keywords: energy, E-factor, carbon footprint, microwave, sonochemistry, advancement

Procedia PDF Downloads 279
5552 Evaluation of Soil Erosion Risk and Prioritization for Implementation of Management Strategies in Morocco

Authors: Lahcen Daoudi, Fatima Zahra Omdi, Abldelali Gourfi

Abstract:

In Morocco, as in most Mediterranean countries, water scarcity is a common situation because of low and unevenly distributed rainfall. The expansion of irrigated lands, as well as the growth of urban and industrial areas and tourist resorts, contributes to an increase in water demand. Therefore, in the 1960s Morocco embarked on an ambitious program to increase the number of dams to boost water retention capacity. However, the decrease in the capacity of these reservoirs caused by sedimentation is a major problem; it is estimated at 75 million m3/year. Dams and reservoirs become unusable for their intended purposes due to sedimentation in large rivers that results from soil erosion. Soil erosion is an important driving force in processes affecting the landscape, and it has become one of the most serious environmental problems, raising much interest throughout the world. Monitoring soil erosion risk is an important part of soil conservation practice, and the estimation of soil loss risk is the first step toward successful control of water erosion. The aim of this study is to estimate the soil loss risk and its spatial distribution in the different regions of Morocco and to prioritize areas for soil conservation interventions. The approach followed is the Revised Universal Soil Loss Equation (RUSLE), applied with remote sensing and GIS; RUSLE is the most popular empirically based model used globally for erosion prediction and control. The model has been tested in many agricultural watersheds around the world, particularly in large-scale basins, due to the simplicity of its formulation and the easy availability of the dataset. The spatial distribution of the annual soil loss was elaborated by combining several factors: rainfall erosivity, soil erodibility, topography, and land cover. The average annual soil loss estimated in several watersheds of Morocco varies from 0 to 50 t/ha/year.
Watersheds characterized by high erosion vulnerability are located in the North (Rif Mountains) and more particularly in the central part of Morocco (High Atlas Mountains). This variation in vulnerability is highly correlated with slope variation, which indicates that topography is the main agent of soil erosion within these catchments. These results could be helpful for planning natural resource management and for implementing the sustainable long-term management strategies necessary for soil conservation and for extending the projected economic life of the dams.
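
RUSLE combines the factors named above multiplicatively, A = R x K x LS x C x P, where A is the annual soil loss (t/ha/year), R is rainfall erosivity, K soil erodibility, LS the combined slope length and steepness factor, C the cover factor, and P the support practice factor. A per-cell sketch (all factor values below are illustrative, not taken from the study):

```python
def rusle(R, K, LS, C, P):
    """Annual soil loss A (t/ha/year) for one raster cell."""
    return R * K * LS * C * P

# One grid cell per entry: (R, K, LS, C, P)
cells = [
    (80.0, 0.30, 1.2, 0.25, 1.0),  # cultivated land, moderate slope
    (80.0, 0.30, 4.5, 0.25, 1.0),  # same soil and cover, steep slope
    (80.0, 0.30, 1.2, 0.05, 1.0),  # forested, moderate slope
]
losses = [rusle(*c) for c in cells]
print([round(a, 2) for a in losses])  # steeper slope -> much larger loss
```

In a GIS workflow, each factor is a raster layer (R from rainfall records, K from soil maps, LS from a digital elevation model, C from land-cover classification), and the multiplication is performed cell by cell to produce the soil loss map.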

Keywords: soil loss, RUSLE, GIS-remote sensing, watershed, Morocco

Procedia PDF Downloads 449
5551 The Next Generation Neutrinoless Double-Beta Decay Experiment nEXO

Authors: Ryan Maclellan

Abstract:

The nEXO Collaboration is designing a very large detector for neutrinoless double beta decay of Xe-136. The nEXO detector is rooted in the current EXO-200 program, which has reached a sensitivity for the half-life of the decay of 1.9x10^25 years with an exposure of 99.8 kg-y. The baseline nEXO design assumes 5 tonnes of liquid xenon, enriched in the mass 136 isotope, within a time projection chamber. The detector is being designed to reach a half-life sensitivity of > 5x10^27 years covering the inverted neutrino mass hierarchy, with 5 years of data. We present the nEXO detector design, the current status of R&D efforts, and the physics case for the experiment.

Keywords: double-beta, Majorana, neutrino, neutrinoless

Procedia PDF Downloads 410
5550 Analytical Solution for Stellar Distance Based on Photon Dominated Cosmic Expansion Model

Authors: Xiaoyun Li, Suoang Longzhou

Abstract:

This paper derives the analytical solution for stellar distance as a function of redshift based on the photon-dominated universe expansion model. Firstly, it calculates stellar separation speed and the farthest distance of observable stars via simulation. Then the analytical solution for stellar distance as a function of redshift is derived. It shows that when the redshift is large, the stellar distance (and its separation speed) is not proportional to the redshift due to relativistic effects. It also reveals the relationship between stellar age and redshift. The correctness of the analytical solution is verified against the latest astronomical observations of Type Ia supernovae from 2020.
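
The claimed departure from proportionality at large redshift can be illustrated with the standard special-relativistic Doppler relation, v/c = ((1+z)^2 - 1) / ((1+z)^2 + 1), which reduces to v = cz only for small z. This is textbook physics, not the paper's own model, which additionally accounts for cosmic expansion:

```python
def beta(z):
    """Recession speed as a fraction of c for redshift z
    (special-relativistic Doppler relation)."""
    s = (1.0 + z) ** 2
    return (s - 1.0) / (s + 1.0)

for z in (0.01, 0.1, 1.0, 5.0):
    print(z, round(beta(z), 4), round(beta(z) / z, 4))
# beta/z is close to 1 at small z and falls well below 1 at large z,
# so speed (and hence distance) grows more slowly than redshift.
```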

Keywords: redshift, cosmic expansion model, analytical solution, stellar distance

Procedia PDF Downloads 153
5549 ARGO: An Open Designed Unmanned Surface Vehicle Mapping Autonomous Platform

Authors: Papakonstantinou Apostolos, Argyrios Moustakas, Panagiotis Zervos, Dimitrios Stefanakis, Manolis Tsapakis, Nektarios Spyridakis, Mary Paspaliari, Christos Kontos, Antonis Legakis, Sarantis Houzouris, Konstantinos Topouzelis

Abstract:

For years, unmanned and remotely operated robots have been used as tools in industry, research, and education. The rapid development and miniaturization of sensors that can be attached to remotely operated vehicles has allowed industry leaders and researchers to utilize them as an affordable means of data acquisition in air, on land, and at sea. Despite the recent developments in ground and unmanned airborne vehicles, only a small number of Unmanned Surface Vehicle (USV) platforms target the mapping and monitoring of environmental parameters for research and industry purposes. The ARGO project has developed an open-design USV equipped with a multi-level control hardware architecture and state-of-the-art sensors and payloads for the autonomous monitoring of environmental parameters in large sea areas. The proposed USV is a catamaran-type vehicle controlled over a wireless radio link (5G) for long-range mapping capabilities, operated from a ground-based control station. The ARGO USV has propulsion control using two fully redundant electric trolling motors with active vector thrust for omnidirectional movement, navigation with an open-source autopilot system and a high-accuracy GNSS device, and communication over a 2.4 GHz digital link able to provide 20 km of Line-of-Sight (LoS) range. The 3-meter dual-hull design and composite structure offer well above 80 kg of usable payload capacity. Furthermore, solar and friction energy harvesting methods provide clean energy to the propulsion system. The design is highly modular: each component or payload can be replaced or modified according to the desired task (industrial or research). The system can be equipped with a multiparameter sonde measuring up to 20 water parameters simultaneously, such as conductivity, salinity, turbidity, dissolved oxygen, etc. Furthermore, a high-end multibeam echo sounder can be installed at a specific boat datum for shallow-water high-resolution seabed mapping.
The system is designed to operate in the Aegean Sea. The developed USV is planned to be utilized as a system for autonomous data acquisition, mapping, and monitoring of bathymetry and various environmental parameters. The ARGO USV can operate in small or large ports with the high maneuverability and endurance needed to map large geographical extents at sea. The system presents state-of-the-art solutions in the following areas: i) on-board/real-time data processing and analysis capabilities; ii) an energy-independent and environmentally friendly platform made entirely with the latest aeronautical and marine materials; iii) the integration of advanced technology sensors in one system (photogrammetric and radiometric footprint, as well as its connection with various environmental and inertial sensors); and iv) the information management application. The ARGO web-based application enables the system to depict the results of the data acquisition process in near real time. All the recorded environmental variables and indices are presented, allowing users to remotely access all the raw and processed information through the implemented web-based GIS application.

Keywords: marine environment monitoring, unmanned surface vehicle, bathymetry mapping, sea environmental monitoring

Procedia PDF Downloads 125
5548 Integrated Geophysical Surveys for Sinkhole and Subsidence Vulnerability Assessment, in the West Rand Area of Johannesburg

Authors: Ramoshweu Melvin Sethobya, Emmanuel Chirenje, Mihlali Hobo, Simon Sebothoma

Abstract:

The recent surge in residential infrastructure development around the metropolitan areas of South Africa has created the need for thorough geotechnical assessments to be conducted prior to site development to ensure human and infrastructure safety. This paper appraises the successful application of multi-method geophysical techniques for the delineation of sinkhole vulnerability in a residential landscape. ERT, MASW, VES, magnetic, and gravity surveys were conducted to assist in mapping sinkhole vulnerability, using an existing sinkhole as a constraint, at Venterspost town, west of Johannesburg. The combination of different geophysical techniques and the integration of their results proved useful in delineating the lithologic succession around the sinkhole locality and in determining the geotechnical characteristics of each layer and its contribution to the development of sinkholes, subsidence, and cavities in the vicinity of the site. The study results have also assisted in determining the possible depth extent of the currently existing sinkhole and the locations where other similar karstic features and sinkholes could form. Results of the ERT, VES, and MASW surveys have uncovered dolomitic bedrock at varying depths around the site, which exhibits high resistivity values in the range 2500-8000 ohm.m and corresponding high velocities in the range 1000-2400 m/s. The dolomite layer was found to be overlain by a weathered chert-poor dolomite layer, with resistivities in the range 250-2400 ohm.m and velocities ranging from 500-600 m/s, in which the large sinkhole has collapsed/caved in. A compiled 2.5D high-resolution shear wave velocity (Vs) map of the study area was created using 2D profiles of MASW data, offering insights into the prevailing lithological setup conducive to the formation of various types of karstic features around the site.
3D magnetic models of the site highlighted the regions of possible subsurface interconnection between the currently existing large sinkhole and the other subsidence feature at the site. A number of depth slices were used to detail the conditions near the sinkhole as depth increases. Gravity survey results mapped the possible formational pathways for the development of new karstic features around the site. The combination and correlation of different geophysical techniques proved useful in delineating the site's geotechnical characteristics and in mapping the possible depth extent of the currently existing sinkhole.

Keywords: resistivity, magnetics, sinkhole, gravity, karst, delineation, VES

Procedia PDF Downloads 64
5547 Strategic Public Procurement: A Lever for Social Entrepreneurship and Innovation

Authors: B. Orser, A. Riding, Y. Li

Abstract:

To inform government about how gender gaps in SME (small and medium-sized enterprise) contracting might be redressed, the research question was: what are the key obstacles to, and response strategies for, increasing the engagement of women business owners among SME suppliers to the Government of Canada? Thirty-five interviews were conducted with senior policymakers, supplier diversity organization executives, and expert witnesses to the Canadian House of Commons Standing Committee on Government Operations and Estimates. The qualitative data were analysed using NVivo 11 software. High-order response categories included: (a) SME risk mitigation strategies, (b) SME procurement program design, and (c) performance measures. The primary obstacles cited were government red tape and long, complicated requests for proposals (RFPs). The majority of 'common' complaints occur when SMEs have questions about the federal procurement process. Witness responses included the use of outcome-based rather than prescriptive procurement practices, more agile procurement, simplified RFPs, and making payment within 30 days a procurement priority. Risk mitigation strategies included the provision of procurement officers to assess risks and opportunities for businesses and the development of more agile procurement procedures and processes. Recommendations to enhance program design included: improved definitional consistency of qualifiers and selection criteria; better coordination across agencies; clarification of how SME suppliers benefit from federal contracting; goal setting; specification of the categories that are most suitable for women-owned businesses; and increasing primary contractors' awareness of the importance of subcontract relationships. Recommendations also included third-party certification of eligible firms and the need to enhance SMEs' financial literacy to reduce financial errors.
Finally, there remains a need for clear and consistent pre-program statistics to establish baselines (by sector and issuing department), performance measures, targets based on the percentage of contracts granted and contract value, the percentage of target employees (women, Indigenous), and community benefits, including the hiring of local employees. The study advances strategies to enhance federal procurement programs to facilitate socio-economic policy objectives.

Keywords: procurement, small business, policy, women

Procedia PDF Downloads 105
5546 Deployment of Attack Helicopters in Conventional Warfare: The Gulf War

Authors: Mehmet Karabekir

Abstract:

Attack helicopters (AHs) are usually deployed in conventional warfare to destroy the armored and mechanized forces of the enemy. In addition, AHs are able to perform various tasks in deep and close operations: intelligence, surveillance, reconnaissance, air assault operations, and search and rescue operations. Apache helicopters were properly employed in the Gulf Wars and contributed to the success of the campaigns by destroying a large number of armored and mechanized vehicles of the Iraqi Army. The purpose of this article is to discuss the deployment of AHs in conventional warfare in the light of the Gulf Wars. First, the employment of AHs in deep and close operations is addressed with regard to doctrine. Second, the doctrinal and tactical usage of the US armed forces' AH-64 in the First and Second Gulf Wars is discussed.

Keywords: attack helicopter, conventional warfare, gulf wars

Procedia PDF Downloads 464
5545 Sparse Principal Component Analysis: A Least Squares Approximation Approach

Authors: Giovanni Merola

Abstract:

Sparse Principal Component Analysis aims to find principal components with few non-zero loadings. We derive such sparse solutions by adding a genuine sparsity requirement to the original Principal Component Analysis (PCA) objective function. This approach differs from others because it preserves PCA's original optimality: uncorrelatedness of the components and least squares approximation of the data. To identify the best subset of non-zero loadings, we propose a branch-and-bound search and an iterative elimination algorithm. The latter finds sparse solutions with large loadings and can be run without specifying in advance the cardinality of the loadings or the number of components to compute. We give thorough comparisons with existing sparse PCA methods and several examples on real datasets.
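
The iterative elimination idea can be sketched for a single component. This is a strong simplification: the paper's algorithm computes several uncorrelated components under a least squares criterion, whereas the sketch below only finds the leading eigenvector of a toy covariance matrix by power iteration, drops the smallest-magnitude loading, and refits on the remaining variables:

```python
def leading_eigvec(S, active, iters=200):
    """Leading eigenvector of S restricted to the active variable set."""
    v = [1.0 if i in active else 0.0 for i in range(len(S))]
    for _ in range(iters):
        w = [sum(S[i][j] * v[j] for j in active) if i in active else 0.0
             for i in range(len(S))]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def backward_eliminate(S, k):
    """Keep only k non-zero loadings in the first component."""
    active = set(range(len(S)))
    v = leading_eigvec(S, active)
    while len(active) > k:
        drop = min(active, key=lambda i: abs(v[i]))  # weakest loading
        active.remove(drop)
        v = leading_eigvec(S, active)
    return v

# Toy covariance: variables 0 and 1 are strongly correlated, 2 is noise.
S = [[1.0, 0.9, 0.1],
     [0.9, 1.0, 0.1],
     [0.1, 0.1, 1.0]]
v = backward_eliminate(S, 2)
print([round(x, 3) for x in v])  # loading on variable 2 eliminated
```

Because elimination proceeds one variable at a time, the cardinality does not have to be fixed in advance; one can stop when the explained variance drops too sharply.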

Keywords: SPCA, uncorrelated components, branch-and-bound, backward elimination

Procedia PDF Downloads 367
5544 Industrial Wastewater from Paper Mills Used for Biofuel Production and Soil Improvement

Authors: Karin M. Granstrom

Abstract:

Paper mills produce wastewater with a high content of organic substances. Treatment usually consists of sedimentation, biological treatment in activated sludge basins, and chemical precipitation. The resulting sludges are currently a waste problem, deposited in landfills or used as low-grade fuels for incineration. There is a growing awareness of the need for energy efficiency and environmentally sound management of sludge. A resource-efficient method would be to digest the wastewater sludges anaerobically to produce biogas, refine the biogas to biomethane for use in the transportation sector, and utilize the resulting digestate for soil improvement. The biomethane yield of pulp and paper wastewater sludge is comparable to that of straw or manure. As a bonus, the digestate has improved dewaterability compared to the feedstock biosludge. The limitations of this process are predominantly its weak economic viability, which necessitates both a sufficiently large-scale paper production to supply the necessary large amounts of wastewater sludge and the resolution of remaining questions on the certifiability of the digestate and thus its sales price. A way to improve the practical and economic feasibility of using paper mill wastewater for biomethane production and soil improvement is to co-digest it with other feedstocks. In this study, pulp and paper sludge was co-digested with (1) silage and manure, (2) municipal sewage sludge, (3) food waste, or (4) microalgae. Biomethane yield analysis was performed in 500 ml batch reactors, using an Automatic Methane Potential Test System at thermophilic temperature, with a test duration of 20 days.
The results show that (1) the harvesting season of the grass silage and the timing of manure collection were important factors for methane production, with spring feedstocks producing much more than autumn feedstocks, and pulp mill sludge benefitting the most from co-digestion; (2) pulp and paper mill sludge is a suitable co-substrate to add when a high nitrogen content causes impaired biogas production due to ammonia inhibition; (3) the combination of food waste and paper sludge gave a higher methane yield than either of the substrates digested separately; and (4) pure microalgae gave the highest methane yield. In conclusion, although pulp and paper mills are an almost untapped resource for biomethane production, their wastewater is a suitable feedstock for such a process. Furthermore, through co-digestion, pulp and paper mill wastewater and mill sludges can aid biogas production from more nutrient-rich waste streams from other industries. Such co-digestion also enhances the soil improvement properties of the residual digestate.

Keywords: anaerobic, biogas, biomethane, paper, sludge, soil

Procedia PDF Downloads 250
5543 A Concept in Addressing the Singularity of the Emerging Universe

Authors: Mahmoud Reza Hosseini

Abstract:

The universe is in a continuous expansion process, resulting in the reduction of its density and temperature. By extrapolating back from its current state, the universe at early times has been studied, leading to what is known as the big bang theory. According to this theory, moments after creation the universe was an extremely hot and dense environment; its rapid expansion led to a reduction in its temperature and density. This is evidenced by the cosmic microwave background and the structure of the universe at large scales. However, extrapolating back further from this early state reaches a singularity which cannot be explained by modern physics, where the big bang theory is no longer valid. In addition, one would expect a nonuniform energy distribution across the universe from a sudden expansion; yet highly accurate measurements reveal an equal temperature mapping across the universe, which contradicts the big bang principles. To resolve this issue, it is believed that cosmic inflation occurred at the very early stages of the birth of the universe. According to the cosmic inflation theory, the elements which formed the universe underwent a phase of exponential growth due to the existence of a large cosmological constant. The inflation phase allows the uniform distribution of energy, so that an equal maximum temperature could be achieved across the early universe. Also, the evidence of quantum fluctuations from this stage provides a means for studying the types of imperfections the universe would begin with. Although well-established theories such as cosmic inflation and the big bang together provide a comprehensive picture of the early universe and how it evolved into its current state, they are unable to address the singularity paradox at the time of the universe's creation. Therefore, a practical model capable of describing how the universe was initiated is needed.
This research series aims to address the singularity issue by introducing an energy conversion mechanism. This is accomplished by establishing a state of energy called the "neutral state", with an energy level referred to as the "base energy", capable of converting into other states. Although it follows the same principles, the unique quantum state of the base energy allows it to be distinguishable from other states and to have a uniform distribution at the ground level. Although the concept of base energy can be utilized to address the singularity issue, to establish a complete picture the origin of the base energy should also be identified. This matter is the subject of the first study in the series, "A Conceptual Study for Investigating the Creation of Energy and Understanding the Properties of Nothing", where it is discussed in detail. The concept proposed in this research series therefore provides a road map for enhancing our understanding of the universe's creation from nothing and its evolution, and discusses the possibility of base energy being one of the main building blocks of this universe.

Keywords: big bang, cosmic inflation, birth of universe, energy creation

Procedia PDF Downloads 79
5542 Efficient Model Selection in Linear and Non-Linear Quantile Regression by Cross-Validation

Authors: Yoonsuh Jung, Steven N. MacEachern

Abstract:

The check loss function is used to define quantile regression. In the context of cross-validation, it is also employed as a validation function when the underlying truth is unknown. However, our empirical study indicates that validation with the check loss often leads to choosing overfitted models. In this work, we suggest a modified, L2-adjusted check loss which rounds the sharp corner in the middle of the check loss. To some extent, this guards against overfitted models. Through various simulation settings of linear and non-linear regression, the improvement of the check loss by L2 adjustment is empirically examined. The adjustment is devised to shrink to zero as the sample size grows.
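
A sketch of the two losses follows. The quadratic rounding below is one standard way to smooth the corner over [-delta, delta] while matching the check loss and its slope at the endpoints; the paper's exact adjustment, and how delta shrinks with sample size, may differ:

```python
def check_loss(u, tau):
    """Standard check (pinball) loss for quantile level tau."""
    return tau * u if u >= 0 else (tau - 1.0) * u

def l2_adjusted_check_loss(u, tau, delta):
    """Check loss with the corner at zero replaced by a quadratic on
    [-delta, delta]; value and slope match the check loss at +/-delta.
    (An assumed standard smoothing, not necessarily the paper's.)"""
    if abs(u) <= delta:
        return (tau - 0.5) * u + u * u / (4.0 * delta) + delta / 4.0
    return check_loss(u, tau)

tau, delta = 0.5, 0.2
print(check_loss(0.0, tau))                    # 0.0: sharp corner at zero
print(l2_adjusted_check_loss(0.0, tau, delta)) # 0.05: corner lifted/rounded
print(abs(l2_adjusted_check_loss(delta, tau, delta)
          - check_loss(delta, tau)) < 1e-12)   # True: pieces meet at delta
```

The positive value at u = 0 is what blunts the loss's preference for fits that thread exactly through validation points, which is how the adjustment guards against overfitting.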

Keywords: cross-validation, model selection, quantile regression, tuning parameter selection

Procedia PDF Downloads 425
5541 1/Sigma Term Weighting Scheme for Sentiment Analysis

Authors: Hanan Alshaher, Jinsheng Xu

Abstract:

Large amounts of data on the web can provide valuable information. For example, product reviews help business owners measure customer satisfaction. Sentiment analysis classifies texts into two polarities: positive and negative. This paper examines movie reviews and tweets using a new term weighting scheme, called one-over-sigma (1/sigma), on benchmark datasets for sentiment classification. The proposed method aims to improve the performance of sentiment classification. The results show that 1/sigma is more accurate than the popular term weighting schemes. In order to verify if the entropy reflects the discriminating power of terms, we report a comparison of entropy values for different term weighting schemes.
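
The abstract does not define the 1/sigma weight precisely, so the sketch below makes an explicitly labeled assumption: each term is weighted by the inverse of the standard deviation of its count across training documents, multiplied by the in-document term frequency. Only the plug-in structure of a term weighting scheme (tf times a collection-level factor) is the point here, not the paper's actual formula:

```python
import math
from collections import Counter

def inv_sigma_weights(docs, eps=1e-6):
    """ASSUMED interpretation of 1/sigma: inverse std dev of each
    term's count across the training documents."""
    vocab = {t for d in docs for t in d}
    weights = {}
    for t in vocab:
        counts = [d.count(t) for d in docs]
        mean = sum(counts) / len(counts)
        var = sum((c - mean) ** 2 for c in counts) / len(counts)
        weights[t] = 1.0 / (math.sqrt(var) + eps)
    return weights

def vectorize(doc, weights):
    """tf * collection factor for each term in one document."""
    tf = Counter(doc)
    return {t: tf[t] * weights.get(t, 0.0) for t in tf}

docs = [["good", "movie", "good"], ["bad", "movie"], ["good", "plot"]]
w = inv_sigma_weights(docs)
vec = vectorize(["good", "movie"], w)
print(sorted(vec))  # ['good', 'movie']
```

Swapping the collection factor for idf or entropy-based variants reproduces the schemes the paper compares against, with the downstream classifier unchanged.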

Keywords: 1/sigma, natural language processing, sentiment analysis, term weighting scheme, text classification

Procedia PDF Downloads 197
5540 Streamlining the Fuzzy Front-End and Improving the Usability of the Tools Involved

Authors: Michael N. O'Sullivan, Con Sheahan

Abstract:

Researchers have spent decades developing tools and techniques to aid teams in the new product development (NPD) process. Despite this, there is evidently a huge gap between these tools' academic prevalence and their industry adoption. For the fuzzy front-end in particular, there is a wide range of tools to choose from, including the Kano Model, the House of Quality, and many others. In fact, there are so many tools that it can often be difficult for teams to know which ones to use and how they interact with one another. Moreover, while the benefits of using these tools are obvious to industrialists, they are rarely used, as they carry a learning curve that is too steep and they become too complex to manage over time. In essence, it is commonly believed that they are simply not worth the effort required to learn and use them. This research explores a streamlined process for the fuzzy front-end, assembling the most effective tools and making them accessible to everyone. The process was developed iteratively over the course of 3 years, following over 80 final-year NPD teams from engineering, design, technology, and construction as they carried a product from concept through to production specification. Questionnaires, focus groups, and observations were used to understand the usability issues with the tools involved, and a human-centred design approach was adopted to produce a solution to these issues. The solution takes the form of a physical toolkit, similar to a board game, which allows a team to play through an example of a new product development in order to understand the process and the tools before using them for their own product development efforts. A complementary website enhances the physical toolkit, providing more examples of the tools being used, as well as deeper discussions of each of the topics, allowing teams to adapt the process to their skills, preferences, and product type.
Teams found the solution very useful and intuitive, and experienced significantly less confusion and fewer mistakes with the process than teams who did not use it. Those with a design background found it especially useful for engineering principles like Quality Function Deployment, while those with an engineering or technology background found it especially useful for design and customer-requirements-acquisition principles, like Voice of the Customer. Products developed using the toolkit are added to the website as further examples of how it can be used, creating a loop which helps future teams understand how the toolkit can be adapted to their project, whether it be a small consumer product or a large B2B service. The toolkit unlocks the potential of these beneficial tools for those in industry, both for large, experienced teams and for inexperienced start-ups. It allows users to assess the market potential of their product concept faster and more effectively, arriving at the product design stage with technical requirements prioritized according to their customers’ needs and wants.

Keywords: new product development, fuzzy front-end, usability, Kano model, quality function deployment, voice of customer

Procedia PDF Downloads 103
5539 Detect QoS Attacks Using Machine Learning Algorithm

Authors: Christodoulou Christos, Politis Anastasios

Abstract:

A large majority of users favour wireless LAN connections because they are so simple to use, yet a wireless network can be the target of numerous attacks. Class hijacking is a well-known attack that is fairly simple to execute and has significant repercussions for users. Statistical flow analysis based on machine learning (ML) techniques is a promising categorization methodology. In a given dataset, which in the context of this paper is a collection of components representing frames belonging to various flows, ML can offer a technique for identifying and characterizing structural patterns. These patterns make it possible to classify individual packets, identify fraudulent conduct such as class hijacking, and take the necessary action as a result. In this study, we explore a way to use machine learning approaches to thwart this attack.
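As a minimal illustration of the flow-classification idea (this is a sketch, not the authors' implementation: the features, the toy data and the nearest-centroid rule are assumptions chosen for brevity), per-flow statistics can be summarised into a feature vector and matched against class centroids:

```python
# Illustrative sketch of statistical flow classification. Feature choice,
# training flows and the nearest-centroid rule are hypothetical.
from statistics import mean, stdev

def flow_features(frame_sizes, inter_arrivals):
    """Summarise a flow (lists of frame sizes and inter-arrival times)."""
    return (mean(frame_sizes), stdev(frame_sizes), mean(inter_arrivals))

def train_centroids(labelled_flows):
    """Average the feature vectors of each class into one centroid."""
    sums, counts = {}, {}
    for features, label in labelled_flows:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: tuple(v / counts[label] for v in acc)
            for label, acc in sums.items()}

def classify(features, centroids):
    """Assign the flow to the class with the nearest centroid."""
    return min(centroids, key=lambda label: sum(
        (a - b) ** 2 for a, b in zip(features, centroids[label])))

# Hypothetical training flows: 'normal' traffic vs. frames whose EDCA
# access category was remapped (caricatured here by different statistics).
normal = flow_features([1500, 1400, 1480, 1500], [0.010, 0.011, 0.009])
hijack = flow_features([200, 180, 220, 210], [0.001, 0.002, 0.001])
centroids = train_centroids([(normal, "normal"), (hijack, "hijacked")])

probe = flow_features([190, 205, 215, 200], [0.001, 0.002, 0.002])
print(classify(probe, centroids))  # → hijacked
```

In practice, features would be extracted from captured frames and a proper ML classifier trained on labelled traffic; the centroid rule above merely stands in for that step.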

Keywords: wireless LAN, quality of service, machine learning, class hijacking, EDCA remapping

Procedia PDF Downloads 49
5538 Exploration of in-situ Product Extraction to Increase Triterpenoid Production in Saccharomyces cerevisiae

Authors: Mariam Dianat Sabet Gilani, Lars M. Blank, Birgitta E. Ebert

Abstract:

Plant-derived lupane-type pentacyclic triterpenoids are biologically active compounds that are highly interesting for applications in the medical, pharmaceutical, and cosmetic industries. Due to the low abundance of these valuable compounds in their natural sources and the environmentally harmful downstream process, alternative production methods, such as microbial cell factories, are investigated. Engineered Saccharomyces cerevisiae strains harboring the heterologous genes for betulinic acid synthesis can produce up to 2 g/L triterpenoids, showing high potential for large-scale triterpenoid production. One limitation of the microbial synthesis is intracellular product accumulation, which not only makes cell disruption a necessary step in downstream processing but also limits productivity and product yield per cell. To overcome these restrictions, the aim of this study is to develop an in-situ extraction method that extracts triterpenoids into a second, organic phase. Such continuous or sequential product removal from the biomass keeps the cells in an active state and enables extended production time or biomass recycling. After screening twelve different solvents, selected based on product solubility, biocompatibility, and environmental and health impact, isopropyl myristate (IPM) was chosen as a suitable solvent for in-situ product removal from S. cerevisiae. Impedance-based single-cell analysis and off-gas measurement of carbon dioxide emission showed that cell viability and physiology were not affected by the presence of IPM. Initial experiments demonstrated that after the addition of 20 vol% IPM to cultures in the stationary phase, 40% of the total produced triterpenoids were extracted from the cells into the organic phase.
In future experiments, the application of IPM in a repeated-batch process will be tested, where IPM is added at the end of each batch run to remove triterpenoids from the cells, allowing the same biocatalysts to be used in several sequential batch steps. Due to its high biocompatibility, the amount of IPM added to the culture can also be increased beyond 20 vol% to extract more than 40% of the triterpenoids into the organic phase, allowing the cells to produce more triterpenoids. This highlights the potential for developing a continuous large-scale process in which biocatalysts produce intracellular products continuously, without the necessity of cell disruption and without being limited by cell capacity.

Keywords: betulinic acid, biocompatible solvent, in-situ extraction, isopropyl myristate, process development, secondary metabolites, triterpenoids, yeast

Procedia PDF Downloads 135
5537 Long-Term Variabilities and Tendencies in the Zonally Averaged TIMED-SABER Ozone and Temperature in the Middle Atmosphere over 10°N-15°N

Authors: Oindrila Nath, S. Sridharan

Abstract:

Long-term (2002-2012) temperature and ozone measurements by the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument onboard the Thermosphere, Ionosphere, Mesosphere Energetics and Dynamics (TIMED) satellite, zonally averaged over 10°N-15°N, are used to study their long-term changes and their responses to the solar cycle, the quasi-biennial oscillation (QBO) and the El Niño-Southern Oscillation (ENSO). This region is selected because the continuous SABER data sets allow more accurate long-term trends and variabilities than was possible earlier with lidar measurements over Gadanki (13.5°N, 79.2°E), which are limited to cloud-free nights. Regression analysis of temperature shows a cooling trend of 0.5 K/decade in the stratosphere and of 3 K/decade in the mesosphere. Ozone shows a statistically significant decreasing trend of 1.3 ppmv per decade in the mesosphere, although there is a small positive trend in the stratosphere at 25 km; other than this, no significant ozone trend is observed in the stratosphere. A negative ozone-QBO response (0.02 ppmv/QBO), a positive ozone-solar cycle response (0.91 ppmv/100 SFU) and a negative ozone response to ENSO (0.51 ppmv/SOI) are found mainly in the mesosphere, whereas a positive ozone response to ENSO (0.23 ppmv/SOI) is pronounced in the stratosphere (20-30 km). The temperature response to the solar cycle is most positive (3.74 K/100 SFU) in the upper mesosphere; its response to ENSO is negative around 80 km and positive around 90-100 km, and its response to the QBO is insignificant at most heights. The composite monthly mean of the ozone volume mixing ratio shows maximum values, around 10 ppmv, during the pre-monsoon and post-monsoon seasons in the middle stratosphere (25-30 km) and the upper mesosphere (85-95 km).
The composite monthly mean of temperature shows a semi-annual variation, with large values (~250-260 K) in the equinox months and lower values in the solstice months in the upper stratosphere and lower mesosphere (40-55 km), whereas the SAO becomes weaker above 55 km. The semi-annual variation reappears at 80-90 km, with large values in the spring equinox and winter months. In the upper mesosphere (90-100 km), lower temperatures (~170-190 K) prevail in all months except September, when the temperature is slightly higher. The height profiles of the amplitudes of the semi-annual and annual oscillations in ozone show maximum values of 6 ppmv and 2.5 ppmv, respectively, in the upper mesosphere (80-100 km), whereas the SAO and AO in temperature show maximum values of 5.8 K and 4.6 K in the lower and middle mesosphere, around 60-85 km. The phase profiles of both the SAO and AO show downward progressions. These results are being compared with long-term lidar temperature measurements over Gadanki (13.5°N, 79.2°E), and the results obtained will be presented during the meeting.
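As an illustration of how such a decadal trend can be estimated (a minimal sketch with synthetic, noise-free data, not the authors' regression, which also includes solar-flux, QBO and ENSO regressors):

```python
# Illustrative sketch: decadal trend by ordinary least squares.
# The time series below is synthetic, constructed with an assumed
# -0.5 K/decade cooling for clarity.

def ols_slope(x, y):
    """Least-squares slope of y against x: cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

# Monthly time axis in years, 2002-2012 (132 points)
years = [2002 + m / 12 for m in range(132)]
# Synthetic stratospheric temperature: 240 K baseline, -0.05 K/yr trend
temps = [240.0 - 0.05 * (t - 2002) for t in years]

trend_per_decade = ols_slope(years, temps) * 10
print(round(trend_per_decade, 2))  # → -0.5
```

A multiple regression against time, F10.7 solar flux, a QBO index and the SOI proceeds the same way, solving the normal equations for all regressors jointly instead of the single slope above.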

Keywords: trends, QBO, solar cycle, ENSO, ozone, temperature

Procedia PDF Downloads 402
5536 Structural and Biochemical Characterization of Red and Green Emitting Luciferase Enzymes

Authors: Wael M. Rabeh, Cesar Carrasco-Lopez, Juliana C. Ferreira, Pance Naumov

Abstract:

Bioluminescence, the emission of light from a biological process, is found in various living organisms, including bacteria, fireflies, beetles, fungi and different marine organisms. Luciferase is an enzyme that catalyzes a two-step oxidation of luciferin in the presence of Mg2+ and ATP to produce oxyluciferin, releasing energy in the form of light. The luciferase assay is used in biological research and clinical applications for in vivo imaging, cell proliferation, and protein folding and secretion analysis. The luciferase enzyme consists of two domains: a large N-terminal domain (residues 1-436) connected to a small C-terminal domain (residues 440-544) by a flexible loop that functions as a hinge for opening and closing the active site. The two domains are separated by a large cleft housing the active site, which closes after binding the substrates luciferin and ATP. Even though all insect luciferases catalyze the same chemical reaction and share 50% to 90% sequence homology and high structural similarity, they emit light of different colors, from green at 560 nm to red at 640 nm. Currently, the majority of structural and biochemical studies have been conducted on green-emitting firefly luciferases. To address the color-emission mechanism, we expressed and purified two luciferase enzymes, with blue-shifted green and red emission, from the indigenous Brazilian species Amydetes fanestratus and Phrixothrix, respectively. The two enzymes naturally emit light of different colors, and they are an excellent system for studying the color-emission mechanism of luciferases, as the currently proposed mechanisms are based on mutagenesis studies. Using a vapor-diffusion method and a high-throughput approach, we crystallized both enzymes and solved their crystal structures by X-ray crystallography, at 1.7 Å and 3.1 Å resolution, respectively.
The free enzyme adopted two open conformations in the crystallographic unit cell that are different from the previously characterized firefly luciferase. The blue-shifted green luciferase crystallized as a monomer, similar to other luciferases reported in the literature, while the red luciferase crystallized as an octamer and was also purified as an octamer in solution. The octamer conformation is the first of its kind for any insect luciferase and might be related to the red color emission. Structurally designed mutations confirmed the importance of the transition between the open and closed conformations in the fine-tuning of the color, and the characterization of other interesting mutants is underway.

Keywords: bioluminescence, enzymology, structural biology, x-ray crystallography

Procedia PDF Downloads 317
5535 Audio-Visual Entrainment and Acupressure Therapy for Insomnia

Authors: Mariya Yeldhos, G. Hema, Sowmya Narayanan, L. Dhiviyalakshmi

Abstract:

Insomnia is one of the most prevalent psychological disorders worldwide. Some of the deficiencies of the current treatments of insomnia are side effects in the case of sleeping pills and high costs in the case of psychotherapeutic treatment. In this paper, we propose a device which provides a combination of audio-visual entrainment and acupressure-based compression therapy for insomnia. This device provides drug-free treatment of insomnia through a user-friendly and portable device that enables relaxation of the brain and muscles, with certain advantages such as low cost and wide accessibility to a large number of people. The tools adopted towards the treatment of insomnia are: audio (continuous exposure to binaural beats at a particular frequency in the audible range), visual (flashes of LED light), and acupressure applied at the GB-20, GV-16 and B-10 points.

Keywords: insomnia, acupressure, entrainment, audio-visual entrainment

Procedia PDF Downloads 422
5534 Graphene Materials for Efficient Hybrid Solar Cells: A Spectroscopic Investigation

Authors: Mohammed Khenfouch, Fokotsa V. Molefe, Bakang M. Mothudi

Abstract:

Nowadays, graphene and its composites are widely recognized as promising materials. They show potential in a large field of applications, including photovoltaics. This study reports on the role of nanohybrids and nanosystems, known as strong light harvesters, in the efficiency of graphene hybrid solar cells. Our system comprised Graphene/ZnO/Porphyrin/P3HT layers. Moreover, the physical properties, including surface/interface, optical and vibrational properties, were also studied. Our investigations confirmed the interaction between the different components as well as the sensitivity of their photonic properties to the synthesis conditions. Remarkable energy and charge transfer were detected and investigated in depth. Hence, optimizing these conditions will lead to the fabrication of graphene solar cells with higher conversion efficiency.

Keywords: graphene, optoelectronics, nanohybrids, solar cells

Procedia PDF Downloads 160
5533 Behaviours of Energy Spectrum at Low Reynolds Numbers in Grid Turbulence

Authors: Md Kamruzzaman, Lyazid Djenidi, R. A. Antonia

Abstract:

This paper reports an experimental investigation of the energy spectrum of turbulent velocity fields at low Reynolds numbers (Rλ) in grid turbulence. Hot-wire measurements are carried out in grid turbulence subjected to a 1.36:1 contraction of the wind tunnel. Three different grids are used: (i) a large square perforated grid (mesh size 43.75 mm), (ii) a small square perforated grid (mesh size 14 mm) and (iii) a woven mesh grid (mesh size 5 mm). The results indicate that the energy spectrum at small Rλ does not follow Kolmogorov’s universal scaling. It is further found that the critical Reynolds number, Rλ,c, below which the scaling breaks down, is around 25.
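For context, Kolmogorov's universal scaling referred to above predicts, in the inertial subrange, the standard textbook form (stated here for the reader, not taken from the paper):

```latex
% Inertial-range energy spectrum (Kolmogorov 1941):
E(k) = C\,\varepsilon^{2/3}\,k^{-5/3},
% where \varepsilon is the mean energy dissipation rate, k the wavenumber,
% and C a (nominally) universal constant.
% Taylor-microscale Reynolds number used in the abstract:
R_\lambda = \frac{u'\,\lambda}{\nu},
% with u' the rms velocity fluctuation, \lambda the Taylor microscale,
% and \nu the kinematic viscosity.
```

The paper's finding is that this k^(-5/3) form is not attained for Rλ below roughly 25.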

Keywords: energy spectrum, Taylor microscale, Reynolds number, turbulent kinetic energy, decay exponent

Procedia PDF Downloads 281
5532 Nonhomogeneous Linear Second Order Differential Equations and Resonance through Geogebra Program

Authors: F. Maass, P. Martin, J. Olivares

Abstract:

The aim of this work is the application of the program GeoGebra to teaching the study of nonhomogeneous linear second-order differential equations with constant coefficients. Different kinds of functions or forces will be considered on the right-hand side of the differential equations; in particular, emphasis will be placed on the case of trigonometric functions producing the resonance phenomenon. To achieve this, the frequencies of the trigonometric functions will be varied. Once the resonances appear, they have to be correlated with the roots of the second-order algebraic equation determined by the coefficients of the differential equation. In this way, physics and engineering students will understand resonance effects and their consequences in the simplest way. A large variety of examples will be shown, using different kinds of functions for the nonhomogeneous part of the differential equations.
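As a standard textbook illustration of the resonance case described above (a representative example, not taken from the abstract), consider the undamped forced oscillator:

```latex
% Undamped forced oscillator:
y'' + \omega_0^2\, y = F_0 \cos(\omega t).
% For \omega \neq \omega_0 the particular solution is bounded:
y_p(t) = \frac{F_0}{\omega_0^2 - \omega^2}\,\cos(\omega t).
% At \omega = \omega_0, when the forcing frequency matches the roots
% \pm i\omega_0 of the characteristic equation r^2 + \omega_0^2 = 0,
% the particular solution instead grows linearly in time:
y_p(t) = \frac{F_0}{2\omega_0}\, t \sin(\omega_0 t).
```

Plotting y_p in GeoGebra while sliding ω toward ω₀ makes the unbounded growth of the amplitude, and hence the resonance, immediately visible to students.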

Keywords: education, geogebra, ordinary differential equations, resonance

Procedia PDF Downloads 235
5531 Characteristics of the Rock Glacier Deposits in the Southern Carpathians, Romania

Authors: Petru Urdea

Abstract:

As a distinct part of the mountain system, the rock glacier system is a particular periglacial debris system. Being an open system, it is interconnected with other subsystems, such as the glacial, cliff, rocky slope and talus slope subsystems, which are its sources of sediment. For long periods of time it acts as a storage unit for debris and ice, and temporarily for snow and water. In the Southern Carpathians, 306 rock glaciers were identified. The vast majority of these (74%) are talus rock glaciers, and 26% are debris rock glaciers. The area occupied by granites and granodiorites hosts 49% of all the rock glaciers, representing 61% of the area occupied by Southern Carpathian rock glaciers. This lithological dependence also leaves its mark on the specifics of the deposits, everything bearing the imprint of the particular way the rocks respond to physical weathering processes under a periglacial regime. While in the domain of granites and granodiorites the blocks are large (of metric order, even 10 m³), in the domain of the metamorphic rocks only gneisses can yield similar sizes. Amphibolites, amphibolitic schists, micaschists, sericite-chlorite schists and phyllites break into much smaller blocks, of decimetric order, mostly in the form of slabs. In rock glaciers made up of large blocks, with a structure of the open-work type, the density and volume of voids between the blocks are greater; smaller debris generates more compact structures with fewer voids. All of this influences the thermal regime, associated with a certain type of air circulation during the seasons and with the conditions for permafrost formation. The rock glaciers are fed by rock falls, rock avalanches, debris flows and avalanches, so their structure is heterogeneous, which is also reflected in the detailed topography of the rock glaciers.
This heterogeneity is also influenced by the spatial arrangement of the rock bodies in the supply area and, an element that cannot be omitted, by the behavior of the rocks during periglacial weathering. The production of small gelifracts fills the voids and produces more compact structures, with effects on the creep process. In general, surface deposits are coarser and those at depth are finer, their characteristics being detectable by geophysical methods. Electrical resistivity tomography (ERT) and ground-penetrating radar (GPR) investigations carried out in the Făgăraş, Retezat and Parâng Mountains, each with a different lithological specificity, allowed the identification of some differentiations, including the presence of permafrost bodies.

Keywords: rock glacier deposits, structure, lithology, permafrost, Southern Carpathians, Romania

Procedia PDF Downloads 11
5530 Designing a Robust Controller for a 6 Linkage Robot

Authors: G. Khamooshian

Abstract:

One of the main issues with serial and parallel mechanisms is their control, and the control of these and similar mechanisms has long been a focus of researchers. On the other hand, modeling the behavior of such a system is difficult due to the large number of its parameters, and it leads to complex equations that are difficult to solve and, eventually, difficult to control. In this paper, a six-linkage robot is presented that could be used in different areas, such as medical robotics. Using these robots requires robust control. The system equations are first derived, and then the system transfer function is written. A new controller has been designed for this robot, which could be used in other parallel robots and could be very useful. Parallel robots are important in robotics because of their stability, so methods for controlling them are important, and a robust controller makes particular sense for parallel robots.

Keywords: 3-RRS, 6 linkage, parallel robot, control

Procedia PDF Downloads 146
5529 Limit State of Heterogeneous Smart Structures under Unknown Cyclic Loading

Authors: M. Chen, S-Q. Zhang, X. Wang, D. Tate

Abstract:

This paper presents a numerical solution, namely limit and shakedown analysis, to predict the safety state of smart structures made of heterogeneous materials under unknown cyclic loadings, for instance, the flexure hinge in a micro-positioning stage driven by a piezoelectric actuator. Combining homogenization theory and the finite-element method (FEM), the safety evaluation problem is converted into a large-scale nonlinear optimization problem for an acceptable bounded loading as the design reference. Furthermore, a general numerical scheme integrating the FEM with an interior-point-algorithm-based optimization tool is developed, which makes practical application possible.

Keywords: limit state, shakedown analysis, homogenization, heterogeneous structure

Procedia PDF Downloads 328
5528 Effects of Exhibition Firms' Resource Investment Behavior on Their Booth Staffs' Role Perceptions, Goal Acceptance and Work Effort during the Exhibition Period

Authors: Po-Chien Li

Abstract:

Although the extant literature hosts a wide range of knowledge about trade shows, this knowledge base deserves to be further expanded and extended because many issues remain unclear and many topics overlooked. One area that needs research attention is the behavior and performance of booth workers at the exhibition site. Booth staff play many key roles in interacting with booth visitors, and their exhibiting-related attitudes and motivations might have significant consequences for a firm’s exhibition results. However, to date, little research, if any, has studied how booth workers are affected and behave in the context of a trade fair. The primary purpose of the current study is to develop and test a research model, derived from role theory and the resource-based view, that depicts the effects of a firm’s pre-exhibition resource investment behavior on booth staff’s role perceptions and work behavior during the exhibition period. The author collected data with two survey questionnaires at two trade shows in 2016. One questionnaire was given to the booth head of an exhibiting company, asking about the firm’s resource commitment behavior prior to the exhibition period; the other was provided to a booth worker of the same firm, asking the individual to report his/her own role perceptions, degree of exhibition goal acceptance, and level of work effort during the exhibition period. The study used descriptive statistics, exploratory factor analysis, reliability analysis, and regression analysis. The results of a set of regression analyses show that a firm’s pre-exhibition resource investment behavior has significant effects on a booth staff member’s exhibiting perceptions and attitudes. Specifically, an exhibitor’s resource investment behavior affects booth staff’s role clarity and role conflict.
In addition, a booth worker’s role clarity is related to the degree of exhibition goal acceptance, but his/her role conflict is not. Finally, a booth worker’s exhibiting effort is significantly related to the individual’s role clarity, role conflict and goal acceptance. The major contribution of the current research is that it offers insight into, and early evidence on, the links between an exhibiting firm’s resource commitment behavior and the work perceptions and attitudes of booth staff during the exhibition period. These results can benefit the extant literature on exhibition marketing.

Keywords: exhibition resource investment, role perceptions, goal acceptance, work effort

Procedia PDF Downloads 201
5527 The Use of Social Media in a UK School of Pharmacy to Increase Student Engagement and Sense of Belonging

Authors: Samantha J. Hall, Luke Taylor, Kenneth I. Cumming, Jakki Bardsley, Scott S. P. Wildman

Abstract:

Medway School of Pharmacy – a joint collaboration between the University of Kent and the University of Greenwich – is a large school of pharmacy in the United Kingdom. The school primarily delivers the accredited Master of Pharmacy (MPharm) degree programme. Reportedly, some students may feel isolated from the larger student body, which extends across four separate campuses where a diverse range of academic subjects is delivered. In addition, student engagement has been noted as limited in some areas, as evidenced by poor attendance at some lectures. In January 2015, the University of Kent launched a new initiative dedicated to Equality, Diversity and Inclusivity (EDI). As part of this project, Medway School of Pharmacy employed ‘Student Success Project Officers’ to analyse past and present school data. As a result, initiatives have been implemented to i) negate disparities in attainment and ii) increase engagement, particularly for Black, Asian and Minority Ethnic (BAME) students, who make up more than 80% of the pharmacy student cohort. Social media platforms are prevalent, with global statistics suggesting that they are most commonly used by females between the ages of 16 and 34. Student focus groups held throughout the academic year brought to light the school’s need to use social media much more actively. Prior to the EDI initiative, Medway School of Pharmacy’s social media usage was scarce. Platforms including Facebook, Twitter, Instagram, YouTube, The Student Room and university blogs were either introduced or rejuvenated. This action was taken with the primary aim of increasing student engagement. By using a number of varied social media platforms, the university is able to reach a large range of students by appealing to different interests.
Social media is being used to disseminate important information, promote equality and diversity, recognise and celebrate student success, and allow students to explore student life beyond Medway School of Pharmacy. Early data suggest an increase in lecture attendance, as well as greater evidence of student engagement, highlighted by recent focus group discussions. In addition, students communicated that active social media accounts were imperative when choosing universities for 2015/16, as they allow prospective students to understand more about the university and its community before beginning their studies. By maintaining a lively presence on social media, the university can take a multi-faceted approach to early engagement, as well as fostering the long-term engagement of continuing students.

Keywords: engagement, social media, pharmacy, community

Procedia PDF Downloads 317