Search results for: feature combination
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4570

3310 Gaussian Operations with a Single Trapped Ion

Authors: Bruna G. M. Araújo, Pedro M. M. Q. Cruz

Abstract:

In this letter, we review the major concepts that govern Gaussian quantum information. When working with quantum information and computation with continuous variables, Gaussian states are needed to better describe these systems. Analyzing a single ion confined in a Paul trap, we use the interaction picture to obtain a toolbox of Gaussian operations from the ion-laser interaction Hamiltonian. This is achieved by exciting the ion through the combination of two lasers of distinct frequencies corresponding to different sidebands of the external degrees of freedom. We first study the case of a trap with one mode and then the case with two modes. In this way, we achieve different continuous-variable gates just by changing the external degrees of freedom of the trap and combining the Hamiltonians of the blue and red sidebands.

Keywords: Paul trap, ion-laser interaction, Gaussian operations

Procedia PDF Downloads 686
3309 Mechanical Properties of Kenaf Reinforced Composite with Different Fiber Orientation

Authors: Y. C. Ching, K. H. Chong

Abstract:

Increasing environmental awareness has led to growing interest in the development of materials with eco-friendly attributes. In this study, a three-ply sandwich of kenaf fiber reinforced unsaturated polyester with various fiber orientations was developed. The effect of fiber orientation on the mechanical properties and thermal stability of the polyester was studied. The composite, with unsaturated polyester as the face sheets and kenaf fibers as the core, was fabricated with a combination of hand lay-up and cold compression methods. Measured parameters, such as tensile, flexural, and impact strength, melting point, and crystallization point, were compared and recorded for the different fiber orientations. The failure mechanisms and property changes associated with the directional change of the fibers in the polyester composite are discussed.

Keywords: kenaf fiber, polyester, tensile, thermal stability

Procedia PDF Downloads 359
3308 Detection and Classification of Rubber Tree Leaf Diseases Using Machine Learning

Authors: Kavyadevi N., Kaviya G., Gowsalya P., Janani M., Mohanraj S.

Abstract:

Hevea brasiliensis, also known as the rubber tree, is one of the world's most valuable crops. One significant advantage of the rubber plant in terms of air oxygenation is its capacity to reduce the likelihood of an individual developing respiratory allergies like asthma. To construct a system that can properly identify crop diseases and pests, and then draw on a database of treatments for each pest and disease, the illness must first be detected. In this article, we primarily examine three economically significant leaf diseases: bird's eye spot, algal spot, and powdery mildew. The proposed work focuses on disease identification on rubber tree leaves, accomplished by employing a convolutional neural network. The processing pipeline consists of input, preprocessing, image segmentation, feature extraction, and classification. This replaces the time-consuming manual procedures currently used to detect the disease. The main ailments, underlying causes, and signs and symptoms of the diseases that harm the rubber tree are also covered in this study.

Keywords: image processing, python, convolution neural network (CNN), machine learning

Procedia PDF Downloads 76
3307 Sentiment Classification Using Enhanced Contextual Valence Shifters

Authors: Vo Ngoc Phu, Phan Thi Tuoi

Abstract:

We have explored different methods of improving the accuracy of sentiment classification. The sentiment orientation of a document can be positive (+), negative (-), or neutral (0). We combine five dictionaries from [2, 3, 4, 5, 6] into a new one with 21,137 entries. The new dictionary contains many verbs, adverbs, phrases, and idioms that are not in the five source dictionaries. The paper shows that our proposed method, based on the combination of the Term-Counting method and the Enhanced Contextual Valence Shifters method, has improved the accuracy of sentiment classification. The combined method achieves an accuracy of 68.984% on the testing dataset and 69.224% on the training dataset. All of these methods are implemented to classify reviews based on our new dictionary and the Internet Movie data set.
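The combination described above can be illustrated with a minimal sketch; the toy lexicon and shifter rules below are invented for illustration and are far smaller than the paper's 21,137-entry dictionary.

```python
# Term counting with simple contextual valence shifters (illustrative
# lexicon only -- not the dictionary merged in the paper).
LEXICON = {"good": 1, "great": 2, "bad": -1, "awful": -2}
NEGATORS = {"not", "never", "no"}              # flip polarity
INTENSIFIERS = {"very": 2.0, "slightly": 0.5}  # scale polarity

def classify(text):
    """Return +1 (positive), -1 (negative) or 0 (neutral)."""
    tokens = text.lower().split()
    score = 0.0
    for i, tok in enumerate(tokens):
        if tok not in LEXICON:
            continue
        valence = float(LEXICON[tok])
        # look back at the preceding token for a valence shifter
        if i > 0:
            prev = tokens[i - 1]
            if prev in NEGATORS:
                valence = -valence
            elif prev in INTENSIFIERS:
                valence *= INTENSIFIERS[prev]
        score += valence
    return (score > 0) - (score < 0)

print(classify("the plot was not bad and the acting was very good"))  # 1
```

Pure term counting would misread "not bad" as negative; the shifter pass is what corrects it.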

Keywords: sentiment classification, sentiment orientation, contextual valence shifters, term counting

Procedia PDF Downloads 504
3306 Glycoside Hydrolase Clan GH-A-like Structure Complete Evaluation

Authors: Narin Salehiyan

Abstract:

The three iodothyronine selenodeiodinases catalyze the initiation and termination of thyroid hormone effects in vertebrates. Structural studies of these proteins have been hindered by their integral-membrane nature and the inefficient eukaryote-specific pathway for selenoprotein synthesis. Hydrophobic cluster analysis, used in combination with PSI-BLAST, reveals that their extramembrane portion belongs to the thioredoxin-fold superfamily, for which experimental structural information exists. Moreover, a large deiodinase region embedded within the thioredoxin fold shows strong similarities to the active site of iduronidase, a member of the clan GH-A fold of glycoside hydrolases. This model can explain a number of results from previous mutagenesis studies and provides new testable insights into the structural and functional properties of these enzymes.

Keywords: glycoside, hydrolase, GH-A-like structure, catalyze

Procedia PDF Downloads 70
3305 A New Computational Method for the Solution of Nonlinear Burgers' Equation Arising in Longitudinal Dispersion Phenomena in Fluid Flow through Porous Media

Authors: Olayiwola Moruf Oyedunsi

Abstract:

This paper discusses the Modified Variational Iteration Method (MVIM) for the solution of the nonlinear Burgers' equation arising in longitudinal dispersion phenomena in fluid flow through porous media. The method is an elegant combination of Taylor's series and the variational iteration method (VIM). Using Maple 18 for the implementation, it is observed that the procedure provides rapidly convergent approximations with less computational effort. The results show that the concentration C(x,t) of the contaminated water decreases as the distance x increases for a given time t.

Keywords: modified variational iteration method, Burgers' equation, porous media, partial differential equation

Procedia PDF Downloads 321
3304 Simulation of Fouling Removal in vivo Using MHD

Authors: Farhad Aalizadeh, Ali Moosavi

Abstract:

When a blood vessel is injured, blood cells bond together to form a blood clot, which stops the bleeding. Blood clots are made of a combination of blood cells, platelets (small sticky cells that speed up the clot-making process), and fibrin (a protein that forms a thread-like mesh to trap cells). Doctors call this kind of blood clot a “thrombus.” We study the effects of different parameters on the deposition of nanoparticles on the surface of a bump in blood vessels under a magnetic field. The Maxwell and flow equations are solved for this purpose. It is assumed that the blood is non-Newtonian, and the number of particles is considered large enough for the results to be statistically reliable. Using MHD, it is possible to control the flow velocity, remove the fouling on the walls, and return the system to its original form.

Keywords: MHD, fouling, in-vivo, blood clots, simulation

Procedia PDF Downloads 469
3303 A Learning Process for Aesthetics of Language in Thai Poetry for High School Teachers

Authors: Jiraporn Adchariyaprasit

Abstract:

The aesthetics of language in Thai poetry emerge from the combination of sounds and meanings. The appreciation of such beauty can be achieved by means of education, acquisition of knowledge, and training. This research aims to study the learning process for the aesthetics of language in Thai poetry among high school teachers in Bangkok and nearby provinces. Ten samples were selected by purposive sampling for in-depth interviews. According to the research, there are four patterns in the learning process: 1) the study of the characteristics and patterns of poetry, 2) training in poetic reading, 3) the study of the social and cultural contexts of a poem's creation, and 4) the study of other disciplines related to poetry, such as linguistics, traditional dance, and so on.

Keywords: aesthetics, poetry, Thai poetry, poetry learning

Procedia PDF Downloads 436
3302 A Clustering-Based Approach for Weblog Data Cleaning

Authors: Amine Ganibardi, Cherif Arab Ali

Abstract:

This paper addresses the data cleaning issue as a part of web usage data preprocessing within the scope of Web Usage Mining. Weblog data recorded by web servers within log files reflect usage activity, i.e., end-users' clicks and underlying user-agents' hits. As Web Usage Mining is interested in end-users' behavior, user-agents' hits are regarded as noise to be cleaned off before mining. Filtering hits from clicks is not trivial for two reasons: a server records requests interlaced in sequential order regardless of their source or type, and website resources may be set up to be requestable interchangeably by end-users and user-agents. Current methods are content-centric, based on heuristics that filter relevant/irrelevant items in terms of cleaning attributes such as resource filetype extensions, resources pointed to by hyperlinks/URIs, HTTP methods, and user-agents. These methods need exhaustive extra-weblog data and prior knowledge of which items are to be assumed as clicks or hits within the filtering heuristics. Such methods are not appropriate for the dynamic/responsive Web for three reasons: resources may be set up as clickable by end-users regardless of their type, website resources may be indexed by frame names without filetype extensions, and web contents are generated and cancelled differently from one end-user to another. In order to overcome these constraints, a clustering-based cleaning method centered on the logging structure is proposed. This method focuses on the statistical properties of the logging structure at the level of the requested and referring resources attributes. It is insensitive to logging content and does not need extra-weblog data. The statistical property used captures the structure of the logging generated by webpage requests in terms of clicks and hits. Since a webpage consists of a single URI and several components, this results in a single-click-to-multiple-hits ratio in terms of the requested and referring resources. Thus, the clustering-based method is meant to identify two clusters by applying an appropriate distance to the frequency matrix of the requested and referring resources. As the ratio of clicks to hits is single to multiple, the clicks cluster is the smaller one in number of requests. Hierarchical agglomerative clustering based on a pairwise distance (Gower) and average linkage has been applied to four logfiles of dynamic/responsive websites whose click-to-hit ratios range from 1/2 to 1/15. The optimal clustering, based on average linkage and maximum inter-cluster inertia, always results in two clusters. The evaluation of the smaller cluster, referred to as the clicks cluster, in terms of confusion matrix indicators yields a 97% true positive rate. The content-centric cleaning methods, i.e., conventional and advanced cleaning, resulted in a lower rate of 91%. Thus, the proposed clustering-based cleaning outperforms the content-centric methods for dynamic and responsive web designs without the need for any extra-weblog data. Such an improvement in cleaning quality is likely to refine dependent analyses.
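A minimal sketch of the clustering step, under simplifying assumptions: Euclidean distance stands in for the Gower distance used in the paper, and the frequency rows are made-up numbers.

```python
import math

def average_linkage_two_clusters(rows):
    """Agglomerative clustering down to two clusters (average linkage).
    Each row is a feature vector, e.g. request frequencies per resource."""
    clusters = [[i] for i in range(len(rows))]

    def dist(a, b):  # average pairwise Euclidean distance between clusters
        return sum(math.dist(rows[i], rows[j])
                   for i in a for j in b) / (len(a) * len(b))

    while len(clusters) > 2:
        # merge the closest pair of clusters
        _, x, y = min((dist(a, b), x, y)
                      for x, a in enumerate(clusters)
                      for y, b in enumerate(clusters) if x < y)
        clusters[x] = clusters[x] + clusters[y]
        del clusters[y]
    return clusters

# Rows: (frequency as requested URI, frequency as referrer).  Clicks
# (webpages) are requested rarely but referred often; hits (components)
# are requested often but never referred -- illustrative numbers only.
rows = [(1, 9), (2, 8), (9, 0), (8, 0), (10, 0), (7, 0)]
a, b = average_linkage_two_clusters(rows)
clicks = min((a, b), key=len)  # the smaller cluster ~ end-user clicks
print(sorted(clicks))  # [0, 1]
```

Picking the smaller of the two clusters encodes the single-click-to-multiple-hits ratio observation directly.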

Keywords: clustering approach, data cleaning, data preprocessing, weblog data, web usage data

Procedia PDF Downloads 170
3301 Revolutionizing Traditional Farming Using Big Data/Cloud Computing: A Review on Vertical Farming

Authors: Milind Chaudhari, Suhail Balasinor

Abstract:

Due to massive deforestation and an ever-increasing population, the organic content of the soil is depleting at a much faster rate. As a result, world food production could drop by 40% in the next two decades. Vertical farming can aid food production by leveraging big data and cloud computing to provide plants with optimum nutrients and sunlight, determined by analyzing millions of data points. This paper outlines the most important parameters in vertical farming and how a combination of big data and AI helps in calculating and analyzing these millions of data points. Finally, the paper outlines how different organizations are controlling the indoor environment by leveraging big data to enhance food quantity and quality.

Keywords: big data, IoT, vertical farming, indoor farming

Procedia PDF Downloads 175
3300 Light-Controlled Gene Expression in Yeast

Authors: Peter M. Kusen, Georg Wandrey, Christopher Probst, Dietrich Kohlheyer, Jochen Buchs, Jorg Pietruszkau

Abstract:

Light as a stimulus provides the capability to develop regulation techniques for customizable gene expression. A great advantage is the extremely flexible and accurate dosing that can be performed in a non-invasive and sterile manner, even for high-throughput technologies. Therefore, light regulation in a multiwell microbioreactor system was realized, providing the opportunity to control gene expression with outstanding complexity. A light-regulated gene expression system in Saccharomyces cerevisiae was designed applying the strategy of caged compounds. These compounds are photo-labile protected, and therefore biologically inactive, regulator molecules which can be reactivated by irradiation with certain light conditions. The “caging” of a repressor molecule which is consumed after deprotection was essential to create a flexible expression system. Thereby, gene expression could be temporally repressed by irradiation and subsequent release of the active repressor molecule. Afterwards, the repressor molecule is consumed by the yeast cells, leading to reactivation of gene expression. A yeast strain harboring a construct with the corresponding repressible promoter, in combination with a fluorescent marker protein, was applied in a Photo-BioLector platform which allows individual irradiation as well as online fluorescence and growth detection. This device was used to precisely control the repression duration by adjusting the amount of released repressor via different irradiation times. With the presented screening platform, the regulation of complex expression procedures was achieved by combining several repression/derepression intervals. In particular, a stepwise increase of temporally constant expression levels was demonstrated, which could be used to study concentration-dependent effects on cell functions. Linear expression rates with variable slopes could also be shown, representing a possible solution for challenging protein productions in which excessive production rates lead to misfolding or intoxication. Finally, the very flexible regulation enabled accurate control over expression induction, although a repressible promoter was used. Summing up, the continuous online regulation of gene expression has the potential to synchronize gene expression levels to optimize metabolic flux, artificial enzyme cascades, growth rates for co-cultivations, and many other applications that depend on complex expression regulation. The developed light-regulated expression platform represents an innovative screening approach to find optimization potential for production processes.

Keywords: caged-compounds, gene expression regulation, optogenetics, photo-labile protecting group

Procedia PDF Downloads 326
3299 Customized Cow’s Urine Battery Using MnO2 Depolarizer

Authors: Raj Kumar Rajak, Bharat Mishra

Abstract:

A bio-battery represents an entirely new long-term, affordable, accessible, and eco-friendly approach to the production of sustainable energy. Various types of batteries have been developed using MnO2. MnO2 has suitable physical, chemical, electrochemical, and catalytic properties, serving as an effective cathodic depolarizer, and may be considered the life blood of such battery systems. In the present experimental work, we have studied the effect of different concentrations of MnO2 on the power generated by the bio-battery. The tests show that it is possible to generate electricity using cow's urine as an electrolyte. After ascertaining the optimum concentration of MnO2, the measured battery parameters and performance indicate that cow urine alone produces 695 mW, while a combination with MnO2 (40%) enhances the power of the bio-battery to 1377 mW. On adding more MnO2 to the electrolyte, the power decreased because of the rise in internal resistance. The analysis of the experimental data shows that MnO2 is quite suitable for energizing the bio-battery.

Keywords: bio-batteries, cow’s urine, manganese dioxide, non-conventional

Procedia PDF Downloads 261
3298 Parkinson's Disease Gene Identification Using Physicochemical Properties of Amino Acids

Authors: Priya Arora, Ashutosh Mishra

Abstract:

Gene identification, in pursuit of the mutated genes that lead to Parkinson's disease, is a challenging step toward a proactive cure of the disorder. Computational analysis is an effective technique for exploring genes in the form of protein sequences, as theoretical and manual analysis is infeasible. The limitations and effectiveness of a particular computational method depend entirely on the data previously available for disease identification. This article presents a sequence-based classification method for the identification of genes responsible for Parkinson's disease. In the first phase, the physicochemical properties of amino acids transform protein sequences into feature vectors. The second phase employs Jaccard distances to select negative genes from the candidate population. The third phase uses artificial neural networks to make the final predictions. The proposed approach is compared with state-of-the-art methods on the basis of F-measure. The results confirm and quantify the efficiency of the method.
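The first two phases can be sketched roughly as follows; hydropathy (a subset of the published Kyte-Doolittle scale) stands in for the paper's full set of physicochemical properties, and the sequences are invented.

```python
# Encode sequences via one physicochemical property (hydropathy), then
# use Jaccard distance between sets of hydrophobic positions -- a toy
# stand-in for the paper's feature-vector + Jaccard selection phases.
HYDROPATHY = {"A": 1.8, "I": 4.5, "L": 3.8, "V": 4.2, "F": 2.8,
              "D": -3.5, "E": -3.5, "K": -3.9, "R": -4.5, "G": -0.4}

def hydrophobic_positions(seq, cutoff=0.0):
    """Positions whose residue hydropathy exceeds the cutoff."""
    return {i for i, aa in enumerate(seq) if HYDROPATHY.get(aa, 0.0) > cutoff}

def jaccard_distance(a, b):
    union = a | b
    return 0.0 if not union else 1.0 - len(a & b) / len(union)

candidate = hydrophobic_positions("AILVFG")   # invented sequence
known_pos = hydrophobic_positions("AILVDE")   # invented sequence
print(round(jaccard_distance(candidate, known_pos), 2))  # 0.2
```

Candidates far (high Jaccard distance) from all known positive genes would then be taken as negatives for training the neural network of the third phase.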

Keywords: disease gene identification, Parkinson’s disease, physicochemical properties of amino acid, protein sequences

Procedia PDF Downloads 140
3297 A Geospatial Consumer Marketing Campaign Optimization Strategy: Case of Fuzzy Approach in Nigeria Mobile Market

Authors: Adeolu O. Dairo

Abstract:

Getting the consumer marketing strategy right is a crucial and complex task for firms with a large customer base, such as mobile operators in a competitive mobile market. While empirical studies have made efforts to identify key constructs, no geospatial model has been developed to comprehensively assess the viability and interdependency of ground realities regarding the customer, competition, channel, and network quality of mobile operators. In this research, a geo-analytic framework is proposed for strategy formulation and allocation for mobile operators. First, a fuzzy analytic network that depicts the interrelationships among these ground realities is developed, using a self-organizing feature map clustering technique based on inputs from managers and the literature. The model is tested with a mobile operator in the Nigerian mobile market. As a result, a customer-centric geospatial and visualization solution is developed, providing a consolidated and integrated insight that serves as a transparent, logical, and practical guide for strategic, tactical, and operational decision making.

Keywords: geospatial, geo-analytics, self-organizing map, customer-centric

Procedia PDF Downloads 183
3296 Selection of Rayleigh Damping Coefficients for Seismic Response Analysis of Soil Layers

Authors: Huai-Feng Wang, Meng-Lin Lou, Ru-Lin Zhang

Abstract:

Direct time integration, which commonly adopts Rayleigh damping, is a widely used method in seismic response analysis. An approach is presented for selecting the Rayleigh damping coefficients used in seismic analyses so as to produce a response that is consistent with the modal damping response. In the presented approach, an expression for the error of the peak response, obtained through the complete quadratic combination (CQC) method, is set up in terms of the Rayleigh damping coefficients, and the coefficients are then produced by minimizing this error. Two finite element models of soil layers, excited by 28 seismic waves, were used to demonstrate the feasibility and validity of the approach.
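For context, the classical two-frequency fit for the Rayleigh coefficients in C = aM + bK can be sketched as below; the paper instead minimizes a CQC-based error over many modes, so this is only the textbook baseline it improves upon.

```python
# Two-frequency Rayleigh damping fit: choose (a, b) in C = a*M + b*K so
# that the modal damping ratio zeta_i = a/(2*w_i) + b*w_i/2 matches the
# target value exactly at two control frequencies w1 and w2 (rad/s).
def rayleigh_coefficients(zeta, w1, w2):
    a = 2.0 * zeta * w1 * w2 / (w1 + w2)   # mass-proportional term
    b = 2.0 * zeta / (w1 + w2)             # stiffness-proportional term
    return a, b

def modal_damping(a, b, w):
    """Effective damping ratio of a mode with circular frequency w."""
    return a / (2.0 * w) + b * w / 2.0

a, b = rayleigh_coefficients(0.05, 2.0, 20.0)  # 5% damping, w1=2, w2=20
print(round(modal_damping(a, b, 2.0), 3))   # 0.05 at a control frequency
print(round(modal_damping(a, b, 6.0), 3))   # dips below 5% in between
```

The dip between the control frequencies (and growth outside them) is exactly the mismatch with uniform modal damping that motivates an error-minimizing choice of coefficients.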

Keywords: Rayleigh damping, modal damping, damping coefficients, seismic response analysis

Procedia PDF Downloads 438
3295 Predicting the Lifetime of Weathered Polyolefins by Relating Mechanics to Microstructure

Authors: Marta Chiapasco, Alexandra Porter, Finn Giuliani

Abstract:

Designing polymers with a specific microstructure can affect how the polymer degrades once released into the environment. Not only the amount but also the distribution of different phases determines a polymer's degradability. The following research investigates the use of a combination of spectroscopy and thermal analysis to study changes in the polymer's amorphous and crystalline phases during degradation, comparing different microstructures of polypropylene and polyethylene. Nanoindentation helps study how degradation proceeds across a material by looking at changes in phases, while bulk tensile tests describe when the material fails. The first results demonstrate that different microstructures degrade at different rates, with the homopolymer showing a linear and faster degradation compared to copolymers. The goal is to create materials that degrade at faster rates without releasing microplastics into the environment.

Keywords: degradation, microstructure, nanoindentation, Raman spectroscopy

Procedia PDF Downloads 156
3294 A Framework for Automated Nuclear Waste Classification

Authors: Seonaid Hume, Gordon Dobie, Graeme West

Abstract:

Detecting and localizing radioactive sources is a necessity for safe and secure decommissioning of nuclear facilities. An important aspect of managing the sort-and-segregation process is establishing the spatial distributions and quantities of the waste radionuclides, their type, corresponding activity, and ultimately their classification for disposal. The data received from surveys directly informs decommissioning plans, on-site incident management strategies, and the approach needed for a new cell, as well as protecting the workforce and the public. Manual classification of nuclear waste from a nuclear cell is time-consuming, expensive, and requires significant expertise to make the classification judgment call. In addition, in-cell decommissioning is still in its relative infancy, and few techniques are well-developed. As with any repetitive and routine task, there is an opportunity to improve the classification of nuclear waste using autonomous systems. Hence, this paper proposes a new framework for the automatic classification of nuclear waste. The framework consists of five main stages: 3D spatial mapping and object detection, object classification, radiological mapping, source localisation based on gathered evidence, and finally waste classification. The first stage, 3D visual mapping, involves object detection from point cloud data. A review of related applications in other industries is provided, and recommendations for approaches to waste classification are made. Object detection focuses initially on cylindrical objects, since pipework is significant in nuclear cells and indeed on any industrial site; the approach can be extended to other commonly occurring primitives such as spheres and cubes. This is in preparation for stage two: characterizing the point cloud data and estimating the dimensions, material, degradation, and mass of the detected objects in order to feature-match them to an inventory of possible items found in that nuclear cell. Many items in nuclear cells are one-offs, have limited or poor drawings available, or have been modified since installation, and have complex interiors, which often and inadvertently pose difficulties when accessing certain zones and identifying waste remotely; hence, feature-matching objects may require expert input. The third stage, radiological mapping, characterizes the nuclear cell in terms of radiation fields, including the type of radiation, activity, and location within the cell. The fourth stage takes the visual map from stage 1, the object characterization from stage 2, and the radiation map from stage 3 and fuses them together, providing a more detailed scene of the nuclear cell by identifying the location of radioactive materials in three dimensions. The last stage combines the evidence from the fused data sets to produce the classification of the waste in Bq/kg, thus enabling better decision making and monitoring for in-cell decommissioning. The presentation of the framework is supported by representative case study data drawn from a decommissioning application at a UK nuclear facility. The framework utilises recent advancements in the detection and mapping of complex radiation fields in three dimensions to make the process of classifying nuclear waste faster, more reliable, more cost-effective, and safer.

Keywords: nuclear decommissioning, radiation detection, object detection, waste classification

Procedia PDF Downloads 200
3293 A Hybrid Fuzzy Clustering Approach for Fertile and Unfertile Analysis

Authors: Shima Soltanzadeh, Mohammad Hosain Fazel Zarandi, Mojtaba Barzegar Astanjin

Abstract:

Diagnosis of male infertility by laboratory tests is expensive and sometimes intolerable for patients. Filling out a questionnaire and then applying a classification method can be the first step in the decision-making process, so that laboratory tests are used only in cases with a high probability of infertility. In this paper, we evaluated the performance of four classification methods, including naive Bayesian, neural network, logistic regression, and fuzzy c-means clustering used as a classifier, in the diagnosis of male infertility due to environmental factors. Since the data are unbalanced, ROC curves are the most suitable method for the comparison. We also selected the most important features using a filtering method and examined the impact of this feature reduction on the performance of each method; generally, most of the methods performed better after applying the filter. We showed that using fuzzy c-means clustering as a classifier performs well according to the ROC curves, and that its performance is comparable to other classification methods such as logistic regression.
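Using fuzzy c-means as a classifier can be sketched with a plain implementation; the data points below are synthetic stand-ins, not the questionnaire features used in the paper.

```python
import math, random

def fuzzy_c_means(points, c=2, m=2.0, iters=50, seed=1):
    """Plain fuzzy c-means; returns the membership matrix and centers.
    A toy stand-in for the paper's hybrid pipeline (illustrative only)."""
    rng = random.Random(seed)
    n, d = len(points), len(points[0])
    # random fuzzy memberships, each row normalized to sum to 1
    u = [[rng.random() for _ in range(c)] for _ in range(n)]
    u = [[v / sum(row) for v in row] for row in u]
    centers = [[0.0] * d for _ in range(c)]
    for _ in range(iters):
        for j in range(c):                       # update cluster centers
            w = [u[i][j] ** m for i in range(n)]
            tot = sum(w)
            centers[j] = [sum(w[i] * points[i][k] for i in range(n)) / tot
                          for k in range(d)]
        for i in range(n):                       # update memberships
            dists = [max(math.dist(points[i], cj), 1e-12) for cj in centers]
            for j in range(c):
                u[i][j] = 1.0 / sum((dists[j] / dists[k]) ** (2.0 / (m - 1.0))
                                    for k in range(c))
    return u, centers

# Two well-separated groups standing in for the two diagnostic classes.
pts = [(0.1, 0.2), (0.2, 0.1), (0.0, 0.0), (5.0, 5.1), (5.2, 4.9), (4.8, 5.0)]
u, _ = fuzzy_c_means(pts)
labels = [max(range(2), key=lambda j: row[j]) for row in u]
print(labels[0] != labels[3] and labels[:3] == [labels[0]] * 3)  # True
```

Classifying by the maximum membership turns the clustering into a classifier, while the membership values themselves give the graded confidence that makes ROC analysis natural.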

Keywords: classification, fuzzy c-means, logistic regression, Naive Bayesian, neural network, ROC curve

Procedia PDF Downloads 337
3292 Geometric Simplification Method of Building Energy Model Based on Building Performance Simulation

Authors: Yan Lyu, Yiqun Pan, Zhizhong Huang

Abstract:

In the design stage of a new building, an energy model of the building is often required to analyze its energy efficiency. In practice, a certain degree of geometric simplification is needed when establishing building energy models, since the detailed geometric features of a real building are hard to describe perfectly in most energy simulation engines, such as ESP-r, eQUEST, or EnergyPlus. Such detailed description is not necessary when extremely high accuracy is not demanded. Therefore, this paper analyzes the relationship between the error of the simulation results from building energy models and the geometric simplification of the models. The following two parameters are selected as indices to characterize the geometric features in building energy simulation: the southward projected area and the total side surface area of the building. Based on this parameterization, an arbitrary column-shaped building can be simplified to a typical shape (a cuboid) for energy modeling. The results indicate that this simplification leads to an error of less than 7% for buildings whose ratio of southward projection length to total bottom perimeter is in the range 0.25-0.35, which covers most situations.
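The two indices and the applicability check can be sketched for the simplified (cuboid) case; the dimensions below are arbitrary examples.

```python
# Geometric indices for a cuboid stand-in of a building: the southward
# projected area, the total side surface area, and the check on the
# southward-projection-length / bottom-perimeter ratio (0.25-0.35).
def cuboid_indices(south_len, depth, height):
    southward_projected_area = south_len * height
    total_side_area = 2.0 * (south_len + depth) * height
    perimeter = 2.0 * (south_len + depth)
    ratio = south_len / perimeter
    return southward_projected_area, total_side_area, ratio

area_s, area_side, ratio = cuboid_indices(30.0, 20.0, 12.0)  # metres
print(area_s, area_side, 0.25 <= ratio <= 0.35)  # 360.0 1200.0 True
```

A simplified cuboid is then chosen to preserve these two indices, which is what keeps the simulated energy result close to that of the original geometry.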

Keywords: building energy model, simulation, geometric simplification, design, regression

Procedia PDF Downloads 180
3291 Transfer Learning for Protein Structure Classification at Low Resolution

Authors: Alexander Hudson, Shaogang Gong

Abstract:

Structure determination is key to understanding protein function at a molecular level. Whilst significant advances have been made in predicting structure and function from amino acid sequence, researchers must still rely on expensive, time-consuming analytical methods to visualise detailed protein conformation. In this study, we demonstrate that it is possible to make accurate (≥80%) predictions of protein class and architecture from structures determined at low (>3A) resolution, using a deep convolutional neural network trained on high-resolution (≤3A) structures represented as 2D matrices. Thus, we provide proof of concept for high-speed, low-cost protein structure classification at low resolution, and a basis for extension to prediction of function. We investigate the impact of the input representation on classification performance, showing that side-chain information may not be necessary for fine-grained structure predictions. Finally, we confirm that high-resolution, low-resolution, and NMR-determined structures inhabit a common feature space, and thus provide a theoretical foundation for boosting with single-image super-resolution.
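The "2D matrices" referred to above are residue-residue distance maps; a minimal sketch of computing one from C-alpha coordinates (the coordinates below are made up for illustration):

```python
import math

def distance_map(coords):
    """Pairwise Euclidean distance matrix over residue coordinates,
    i.e. the 2D-matrix representation fed to the network."""
    n = len(coords)
    return [[math.dist(coords[i], coords[j]) for j in range(n)]
            for i in range(n)]

# Three invented C-alpha positions (x, y, z); ~3.8 is the typical
# consecutive C-alpha spacing in angstroms.
ca = [(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (3.8, 3.8, 0.0)]
dm = distance_map(ca)
print(round(dm[0][1], 1), dm[1][1])  # 3.8 0.0
```

Because the map depends only on internal distances, it is invariant to rotation and translation of the structure, which is what makes it a convenient CNN input.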

Keywords: transfer learning, protein distance maps, protein structure classification, neural networks

Procedia PDF Downloads 136
3290 Design of Composite Joints from Carbon Fibre for Automotive Parts

Authors: G. Hemath Kumar, H. Mohit, K. Karthick

Abstract:

One of the most important issues in composite technology is the repair of aircraft structural parts manufactured from composite materials. In such applications, and also for joining various composite parts together, the parts are fastened using either adhesives or mechanical fasteners. The tensile strength of these joints was measured using a Universal Testing Machine (UTM). A parametric study was also conducted to compare the performance of the hybrid joint with varying adherend thickness, adhesive thickness, and overlap length. The material is a combination of epoxy resin reinforced with carbon fibre. To utilize the full potential of composite materials as structural elements, the strength and stress distribution of these joints must be understood. The tensile strength of the members was studied under various design conditions and joint configurations.

Keywords: carbon fiber, FRP composite, MMC, automotive

Procedia PDF Downloads 409
3289 Requirements Definitions of Real-Time System Using the Behavioral Patterns Analysis (BPA) Approach: The Healthcare Multi-Agent System

Authors: Assem El-Ansary

Abstract:

This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach using the Healthcare Multi-Agent System. The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.
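As one concrete AHP ingredient, the priority vector of a pairwise-comparison matrix can be approximated by row geometric means; the DECISION tool's actual internals are not described in the abstract, so this is a generic sketch.

```python
import math

def ahp_priorities(pairwise):
    """Geometric-mean approximation of the AHP priority vector from a
    reciprocal pairwise-comparison matrix, normalized to sum to 1."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Illustrative judgments: criterion A is 3x as important as B and 5x as
# important as C; reciprocals fill the lower triangle.
M = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
w = ahp_priorities(M)
print([round(x, 2) for x in w])  # [0.65, 0.23, 0.12]
```

The resulting weights could then feed an outranking step such as ELECTRE, which is how the two methods are typically combined.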

Keywords: analysis, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases, Healthcare Multi-Agent System

Procedia PDF Downloads 551
3288 Silicon Nanostructure Based on Metal-Nanoparticle-Assisted Chemical Etching for Photovoltaic Application

Authors: B. Bouktif, M. Gaidi, M. Benrabha

Abstract:

Metal-nanoparticle-assisted chemical etching is a highly developed wet etching method for producing uniform semiconductor nanostructures (nanowires) from a patterned metallic film on a crystalline silicon surface. The metal film facilitates the etching in an HF and H2O2 solution and produces silicon nanowires (SiNWs). The creation of different SiNW morphologies by varying the etching time, and the effect of morphology on optical and optoelectronic properties, were investigated. The combined effect of the formed SiNWs and a stain etching treatment in an acid (HF/HNO3/H2O) solution on the surface morphology of Si wafers, as well as on the optical and optoelectronic properties, is presented in this paper.

Keywords: semiconductor nanostructure, chemical etching, optoelectronic property, silicon surface

Procedia PDF Downloads 388
3287 Brief Solution-Focused Negotiation: Theory and Application

Authors: Sapir Handelman

Abstract:

Brief Solution-Focused Negotiation is a powerful conflict resolution tool. It can be applied in almost all dimensions of our social life, from politics to family. The initiative invites disputing parties to negotiate practical solutions to their conflict. The negotiation is conducted in a framework of rules, structure, and timeline. The paper presents a model of Brief Solution-Focused Negotiation that rests on three pillars: transformation (turning opposing parties into a negotiating cooperative); practicality (focusing on practical solutions to a negotiable problem); and discovery (discovering key game changers). This paper introduces these three building blocks and demonstrates the potential contribution of each of them to negotiation success. It shows that an effective combination of these three elements has the greatest potential to build, maintain, and successfully conclude a Brief Solution-Focused Negotiation.

Keywords: conflict, negotiation, negotiating cooperative, game changer

Procedia PDF Downloads 84
3286 Protein Remote Homology Detection by Using Profile-Based Matrix Transformation Approaches

Authors: Bin Liu

Abstract:

As one of the most important tasks in protein sequence analysis, protein remote homology detection has been studied for decades. Currently, profile-based methods show state-of-the-art performance, and the Position-Specific Frequency Matrix (PSFM) is a widely used profile. However, the profiles contain noise introduced by amino acids with low frequencies. In this study, we propose the Top Frequency Profile (TFP), which removes this noise from the PSFM by discarding the amino acids with low frequencies. Three matrix transformation methods, Autocross Covariance (ACC), Tri-gram, and K-Separated Bigram (KSB), are then applied to these profiles to convert them into fixed-length feature vectors. Combined with Support Vector Machines (SVMs), the predictors are constructed. Evaluated on two benchmark datasets, these proposed methods outperform other state-of-the-art predictors.
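A minimal sketch of two of these ideas on a toy profile with 4 amino acid types instead of 20 (all frequencies invented): the TFP step zeroes low-frequency entries per position and renormalises, and a K-separated bigram transform turns the variable-length profile into a fixed-length feature vector suitable for an SVM.

```python
# Toy PSFM: rows = sequence positions, columns = 4 amino acid types
# (real profiles have 20 columns); entries are per-position frequencies.
psfm = [[0.50, 0.30, 0.15, 0.05],
        [0.05, 0.10, 0.45, 0.40],
        [0.25, 0.25, 0.25, 0.25]]

def top_frequency_profile(profile, k=2):
    """TFP denoising: keep only the k highest-frequency amino acids
    at each position, zero the rest, and renormalise."""
    out = []
    for row in profile:
        keep = set(sorted(range(len(row)), key=row.__getitem__,
                          reverse=True)[:k])
        kept = [v if i in keep else 0.0 for i, v in enumerate(row)]
        out.append([v / sum(kept) for v in kept])
    return out

def k_separated_bigram(profile, k=1):
    """KSB transform: correlations between amino acids at positions
    i and i+k, summed over i -- a fixed-length (m*m) feature vector."""
    m = len(profile[0])
    feats = [0.0] * (m * m)
    for i in range(len(profile) - k):
        for a in range(m):
            for b in range(m):
                feats[a * m + b] += profile[i][a] * profile[i + k][b]
    return feats

features = k_separated_bigram(top_frequency_profile(psfm))
```

An SVM (e.g. with a linear kernel) would then be trained on such vectors; that final step is omitted here.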

Keywords: protein remote homology detection, protein fold recognition, top frequency profile, support vector machines

Procedia PDF Downloads 125
3285 Through Additive Manufacturing: A New Perspective for the Mass Production of Made in Italy Products

Authors: Elisabetta Cianfanelli, Paolo Pupparo, Maria Claudia Coppola

Abstract:

The recent evolution of innovation processes and the intrinsic tendencies of the product development process lead to new considerations on the design flow. The instability and complexity that describe contemporary life define new problems in the production of products, stimulating at the same time the adoption of new solutions across the entire design process. The advent of additive manufacturing, but also of IoT and AI technologies, continuously confronts us with new paradigms regarding design as a social activity. From the point of view of application, these technologies raise a whole series of problems and considerations immanent to design thinking. Addressing these problems may require some initial intuition and the use of a provisional set of rules or plausible strategies, i.e., heuristic reasoning. At the same time, however, the evolution of digital technology and the computational speed of new design tools describe a new and contrary design framework in which to operate. It is therefore interesting to understand the opportunities and boundaries of the new man-algorithm relationship. The contribution investigates the man-algorithm relationship starting from the state of the art of the Made in Italy model: the best-known fields of application are described, and the focus then shifts to specific cases in which the mutual relationship between man and AI becomes a new driving force of innovation for entire production chains. On the other hand, the use of algorithms could absorb many design phases, such as the definition of shape, dimensions, proportions, materials, static verifications, and simulations. Operating in this context therefore becomes a strategic action, capable of defining fundamental choices for the design of product systems in the near future. If there is a human-algorithm combination within a new integrated system, quantitative values can be controlled in relation to qualitative and material values.
The trajectory described thus becomes a new design horizon in which to operate, where it is interesting to highlight the good practices that already exist. In this context, the designer developing new forms can experiment with ways still unexpressed in the project and can define a new synthesis and simplification of algorithms, so that each artifact carries a signature defining all its parts, emotional and structural. This signature of the designer, a combination of values and design culture, will be internal to the algorithms and able to relate to digital technologies, creating a generative dialogue for design purposes. The envisaged result indicates a new vision of digital technologies, no longer understood only as custodians of vast quantities of information, but also as valid integrated tools in close relationship with the design culture.

Keywords: decision making, design heuristics, product design, product design process, design paradigms

Procedia PDF Downloads 119
3284 The Application of Raman Spectroscopy in Olive Oil Analysis

Authors: Silvia Portarena, Chiara Anselmi, Chiara Baldacchini, Enrico Brugnoli

Abstract:

Extra virgin olive oil (EVOO) is a complex matrix mainly composed of fatty acids and other minor compounds, among which carotenoids are well known for their antioxidative function, a key mechanism of protection against cancer, cardiovascular diseases, and macular degeneration in humans. EVOO composition in terms of such constituents is generally the result of a complex combination of genetic, agronomical, and environmental factors. To selectively improve the quality of EVOOs, the role of each factor in its biochemical composition needs to be investigated. By selecting fruits from four different cultivars similarly grown and harvested, it was demonstrated that Raman spectroscopy, combined with chemometric analysis, is able to discriminate the different cultivars, also as a function of the harvest date, based on the relative content and composition of fatty acids and carotenoids. In particular, a correct classification of up to 94.4% of samples, according to cultivar and maturation stage, was obtained. Moreover, by using gas chromatography and high-performance liquid chromatography as reference techniques, the Raman spectral features further allowed models to be built, based on partial least squares regression, that were able to predict the relative amounts of the main fatty acids and the main carotenoids in EVOO, with high coefficients of determination. Besides genetic factors, climatic parameters, such as light exposure, distance from the sea, temperature, and amount of precipitation, could have a strong influence on EVOO composition of both major and minor compounds. This suggests that the Raman spectrum could act as a specific fingerprint for the geographical discrimination and authentication of EVOO. To understand the influence of environment on EVOO Raman spectra, samples from seven regions along the Italian coasts were selected and analyzed.
In particular, a dual approach was used, combining Raman spectroscopy and isotope ratio mass spectrometry (IRMS) with principal component and linear discriminant analysis. A correct classification of 82% of EVOOs based on their regional geographical origin was obtained. Raman spectra were obtained with a Super Labram spectrometer equipped with an Argon laser (514.5 nm wavelength). Analyses of stable isotope ratios were performed using an isotope ratio mass spectrometer connected to an elemental analyzer and to a pyrolysis system. These studies demonstrate that Raman spectroscopy is a valuable and useful technique for the analysis of EVOO. In combination with statistical analysis, it makes possible the assessment of specific samples’ content and allows for classifying oils according to their geographical and varietal origin.
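To make the chemometric step concrete, here is a minimal stand-in for the PCA + discriminant-analysis pipeline: project toy "spectra" onto their first principal component (computed by power iteration, pure Python) and classify by the nearest class centroid in that score space. The band intensities and region labels are invented for illustration; real LDA maximises between-class separation over several components.

```python
def first_pc_scores(X, iters=200):
    """Mean, first principal axis (power iteration on the unscaled
    covariance), and per-sample scores along that axis."""
    n, d = len(X), len(X[0])
    mean = [sum(col) / n for col in zip(*X)]
    Xc = [[x - m for x, m in zip(row, mean)] for row in X]
    v = [1.0] * d
    for _ in range(iters):
        proj = [sum(r[j] * v[j] for j in range(d)) for r in Xc]
        w = [sum(proj[i] * Xc[i][j] for i in range(n)) for j in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return mean, v, [sum(r[j] * v[j] for j in range(d)) for r in Xc]

# Invented band intensities for EVOO samples from two regions.
X = [[1.00, 0.10, 0.30], [0.90, 0.20, 0.35],
     [0.20, 1.00, 0.30], [0.10, 0.90, 0.25]]
labels = ["regionA", "regionA", "regionB", "regionB"]

mean, v, scores = first_pc_scores(X)
centroids = {lab: sum(s for s, l in zip(scores, labels) if l == lab) / 2
             for lab in set(labels)}

def classify(sample):
    """Nearest class centroid in the 1-D principal-component space."""
    s = sum((sample[j] - mean[j]) * v[j] for j in range(len(sample)))
    return min(centroids, key=lambda lab: abs(centroids[lab] - s))
```

For example, `classify([0.95, 0.15, 0.30])` lands near the regionA centroid; the mechanics are the same when full spectra and several discriminant components are used.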

Keywords: authentication, chemometrics, olive oil, Raman spectroscopy

Procedia PDF Downloads 332
3283 Modeling of Sand Boil near the Danube River

Authors: Edina Koch, Károly Gombás, Márton Maller

Abstract:

The Little Plain is located along the Danube River, and this area is a “hotbed” of sand boil formation. This is due to the combination of a 100-250 m thick gravel layer beneath the Little Plain with a relatively thin blanket of poor soil of variable thickness covering the gravel. Sand boils have a tradition and history in this area: it is known at which water levels particular sand boils started and stopped working, and some of them even have names. The authors present a 2D finite element model of groundwater flow through a selected cross-section of the Danube River where activation of piping phenomena was observed during the 2013 flood event. Soil parametrization is based on a complex site investigation program conducted along the Danube River in the Little Plain.
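The driving mechanism can be illustrated with the classical heave/piping check, comparing the exit hydraulic gradient through the blanket against the critical gradient. This is a hand calculation with assumed values, shown only to convey the mechanism; the paper itself relies on a 2D finite element seepage model:

```python
GAMMA_W = 9.81           # unit weight of water, kN/m3

# Assumed values, for illustration only.
gamma_sat = 19.0         # saturated unit weight of the blanket soil, kN/m3
head_difference = 4.0    # flood head acting across the blanket, m
blanket_thickness = 3.0  # thickness of the soil blanket, m

# Critical gradient: buoyant unit weight over water unit weight.
i_crit = (gamma_sat - GAMMA_W) / GAMMA_W
# Average exit gradient through the blanket.
i_exit = head_difference / blanket_thickness

fos = i_crit / i_exit    # factor of safety against heave/piping
print(round(fos, 2))     # < 1.0 here, so sand boils can form
```

With these assumed numbers the factor of safety drops below 1.0, which is the condition under which sand boils activate as the flood level rises.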

Keywords: site characterization, groundwater flow, numerical modeling, sand boil

Procedia PDF Downloads 95
3282 Fast Algorithm to Determine Initial Tsunami Wave Shape at Source

Authors: Alexander P. Vazhenin, Mikhail M. Lavrentiev, Alexey A. Romanenko, Pavel V. Tatarintsev

Abstract:

One of the problems obstructing effective tsunami modelling is the lack of information about the initial wave shape at the source. The existing methods (geological surveys, sea radars, satellite images) carry a considerable degree of uncertainty. Therefore, direct measurements of tsunami waves obtained by deep-water bottom pressure recorders are also used. In this paper we propose a new method to reconstruct the initial sea surface displacement at the tsunami source by approximating the measured signal (marigram) with a linear combination of synthetic marigrams from a selected set of unit sources, calculated in advance. This method has demonstrated good precision and very high performance. The mathematical model and results of numerical tests are described here.
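For the simplest case of two unit sources, the core of such a method is an ordinary least-squares fit of the measured marigram by a linear combination of precomputed synthetic marigrams. A minimal noise-free sketch (the time series and coefficients below are invented):

```python
# Synthetic unit-source marigrams (wave height vs. time at a recorder).
g1 = [0.0, 0.5, 1.0, 0.5, 0.0]
g2 = [0.0, 0.0, 0.5, 1.0, 0.5]
# "Measured" marigram, here built exactly as 2*g1 + 3*g2.
meas = [2 * a + 3 * b for a, b in zip(g1, g2)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Normal equations for c1, c2 minimising ||c1*g1 + c2*g2 - meas||^2.
a11, a12, a22 = dot(g1, g1), dot(g1, g2), dot(g2, g2)
b1, b2 = dot(g1, meas), dot(g2, meas)
det = a11 * a22 - a12 * a12
c1 = (b1 * a22 - b2 * a12) / det
c2 = (a11 * b2 - a12 * b1) / det
print(c1, c2)  # recovers 2.0 and 3.0
```

The recovered coefficients weight the corresponding unit-source displacements, whose sum reconstructs the initial sea surface shape; real data add noise and many more unit sources, but the linear-algebra core is the same.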

Keywords: numerical tests, orthogonal decomposition, tsunami initial sea surface displacement

Procedia PDF Downloads 469
3281 Registration of Multi-Temporal Unmanned Aerial Vehicle Images for Facility Monitoring

Authors: Dongyeob Han, Jungwon Huh, Quang Huy Tran, Choonghyun Kang

Abstract:

Unmanned Aerial Vehicles (UAVs) have been used for surveillance, monitoring, inspection, and mapping. In this paper, we present a systematic approach for the automatic registration of UAV images for monitoring facilities such as buildings, greenhouses, and civil structures. A two-step process is applied: 1) an image matching technique based on SURF (Speeded-Up Robust Features) and RANSAC (Random Sample Consensus); 2) bundle adjustment of multi-temporal images. Image matching to find corresponding points is one of the most important steps for the precise registration of multi-temporal images. We used the SURF algorithm to find matching points quickly and effectively. The RANSAC algorithm was used both in finding matching points between images and in the bundle adjustment process. Experimental results from UAV images showed that our approach is accurate enough to be applied to the change detection of facilities.
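The outlier-rejection role of RANSAC in step 1 can be illustrated with the simplest possible motion model, a pure 2-D translation between keypoint matches (a real UAV registration pipeline would fit a homography and then run bundle adjustment; the matches below are invented, with one gross outlier):

```python
import random

def ransac_translation(matches, n_iter=200, tol=2.0, seed=0):
    """Estimate a 2-D translation (tx, ty) between matched keypoints,
    keeping the hypothesis supported by the most inliers."""
    rng = random.Random(seed)
    best_t, best_inliers = (0.0, 0.0), []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.choice(matches)  # 1-point minimal sample
        tx, ty = x2 - x1, y2 - y1
        inliers = [((a, b), (c, d)) for (a, b), (c, d) in matches
                   if abs(c - a - tx) < tol and abs(d - b - ty) < tol]
        if len(inliers) > len(best_inliers):
            best_t, best_inliers = (tx, ty), inliers
    return best_t, best_inliers

# Three consistent matches (shift of +5, +3) and one gross outlier.
matches = [((0, 0), (5, 3)), ((1, 2), (6, 5)),
           ((4, 1), (9, 4)), ((2, 2), (30, 40))]
t, inliers = ransac_translation(matches)
print(t, len(inliers))  # (5, 3) 3
```

The outlier match is excluded from the consensus set, so it cannot corrupt the estimated motion; the same consensus principle applies when the model is a homography fitted to SURF matches.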

Keywords: building, image matching, temperature, unmanned aerial vehicle

Procedia PDF Downloads 292