Search results for: sequential extraction process
16350 A Goal-Oriented Social Business Process Management Framework
Authors: Mohammad Ehson Rangiha, Bill Karakostas
Abstract:
Social Business Process Management (SBPM) promises to overcome the limitations of traditional BPM by allowing flexible process design and enactment through the involvement of users from a social community. This paper proposes a meta-model and architecture for socially driven business process management systems. It discusses the main facets of the architecture, such as goal-based role assignment that combines social recommendations with user profiles, and process recommendation, through a real example of a charity organization.
Keywords: business process management, goal-based modelling, process recommendation, social collaboration, social BPM
Procedia PDF Downloads 492
16349 Producing Lutein Powder from Algae by Extraction and Drying
Authors: Zexin Lei, Timothy Langrish
Abstract:
Lutein is a type of carotenoid believed to be beneficial to the eyes. This study aims to explore the possibility of using a closed cycle spray drying system to produce lutein. The system contains a spray dryer, a condenser, a heater, and a pressure seal. Hexane, ethanol, and isopropanol will be used as organic solvents to compare the extraction effects, and several physical and chemical methods of cell disruption will be compared. By continuously sweeping the system with nitrogen, the oxygen content will be kept below 2%, reducing the concentration of organic solvent below the explosion limit and preventing lutein from being oxidized. Lutein powder will be recovered in the collection device, while the volatile organic solvent will be cooled in the condenser and collected at the bottom until it is discharged.
Keywords: closed cycle spray drying system, Chlorella vulgaris, organic solvent, solvent recovery
Procedia PDF Downloads 134
16348 Transfer of Constraints or Constraints on Transfer? Syntactic Islands in Danish L2 English
Authors: Anne Mette Nyvad, Ken Ramshøj Christensen
Abstract:
In the syntax literature, it has standardly been assumed that relative clauses and complement wh-clauses are islands for extraction in English, and that constraints on extraction from syntactic islands are universal. However, the Mainland Scandinavian languages have been known to provide counterexamples. Previous research on Danish has shown that neither relative clauses nor embedded questions are strong islands in Danish. Instead, extraction from this type of syntactic environment is degraded due to structural complexity, and it interacts with non-structural factors such as the frequency of occurrence of the matrix verb, the possibility of temporary misanalysis leading to semantic incongruity, and exposure over time. We argue that these facts can be accounted for by parametric variation in the availability of CP-recursion, resulting in the patterns observed, as Danish would then “suspend” the ban on movement out of relative clauses and embedded questions. Given that Danish does not seem to adhere to allegedly universal syntactic constraints, such as the Complex NP Constraint and the Wh-Island Constraint, what happens in L2 English? We present results from a study investigating how native Danish speakers judge extractions from island structures in L2 English. Our findings suggest that Danes transfer their native-language parameter setting when asked to judge island constructions in English. This is compatible with the Full Transfer Full Access Hypothesis, as the latter predicts that Danish speakers would have difficulty resetting their [+/- CP-recursion] parameter in English because they are not exposed to negative evidence.
Keywords: syntax, islands, second language acquisition, Danish
Procedia PDF Downloads 122
16347 Competition in Petroleum Extraction and the Challenges of Climate Change
Authors: Saeid Rabiei Majd, Motahareh Alvandi, Bahareh Asefi
Abstract:
Extraction of maximum natural resources is one of the common policies of governments, especially for petroleum resources, which have high economic and strategic value. The incentive to access and maintain profitable oil markets causes governments and international oil companies to neglect environmental principles and sustainable development, which in turn drives environmental degradation and climate change. Significant damage to the environment can cause severe harm to citizens and indigenous people, such as the compulsory evacuation of their zone due to contamination of water and air resources and the destruction of animals and plants. The Hawizeh Marshes are a shared aquatic and environmental ecosystem along the Iran-Iraq border that also holds oil resources. This marsh has been very rich in animal, vegetative, and oil resources. Since 1990, political motives, the strategic importance of oil extraction, and the disregard of the Iraqi and Iranian governments for environmental rights in the region have caused the loss of 90% of the marshes and the forced migration of indigenous people. In this paper, we examine the environmental degradation factors resulting from the policies and practices adopted by governments in this region, based on the principles of environmental rights and sustainable development. Revision of the implementation of government policies and natural resource utilization systems can prevent the spread of climate change, which is a serious international challenge today.
Keywords: climate change, indigenous rights, petroleum operation, sustainable development principles, sovereignty on resources
Procedia PDF Downloads 111
16346 Motives and Barriers of Using Airbnb: Findings from Mixed Method Approach
Authors: Ghada Mohammed, Mohamed Abdel Salam, Passent Tantawi
Abstract:
The study aimed to investigate the impact of motives and barriers for Egyptian users to use Airbnb as a peer-to-peer accommodation platform instead of hotels on the overall attitude towards Airbnb. A sequential mixed-methods approach was adopted in this study; a comprehensive research model adapted from both the literature and the results of the qualitative phase was proposed and then tested via an online questionnaire. The findings revealed that the motives of price, home benefits, privacy, and online reviews significantly explained the overall attitude towards Airbnb, while the main barriers were perceived risk and distrust, both of which can predict the overall attitude. Among the subjective norms, only social influence predicted behavioral intention to use Airbnb. The study may serve as a practical reference for practitioners as well as researchers when developing programs and strategies to manage Airbnb consumers' needs and decision process. Some of the main conclusions drawn from this study are that variety was one of the major things that users like about Airbnb, and that the most important motives are functional ones, like price, rather than experiential ones, like authenticity.
Keywords: Airbnb, barriers, disruptive innovation, motives, sharing economy
Procedia PDF Downloads 145
16345 Monitoring Soil Organic Amendments Under Arid Climate: Evolution of Soil Quality and of Two Consecutive Barley Crops
Authors: Houda Oueriemmi, Petra Susan Kidd, Carmen Trasar-Cepeda, Beatriz Rodríguez-Garrido, Mohamed Moussa, Ángeles Prieto-Fernández, Mohamed Ouessar
Abstract:
Organic amendments are generally used for improving the fertility of arid and semi-arid soils. However, the price of farmyard manure, the organic amendment typically applied to many arid and semi-arid soils, has increased considerably in recent years. It is therefore of great interest to investigate at field scale whether cheap, highly available organic amendments, such as sewage sludge compost and municipal solid waste compost, may be acceptable substitutes for farmyard manure. A field plot experiment was carried out to assess the effects of a single application of three organic amendments on soil fertility, the distribution of trace elements, and barley yield. Municipal solid waste compost (MSWC), farmyard manure (FYM) and sewage sludge compost (SSC) were applied at rates of 0, 20, 40 and 60 t ha⁻¹, and barley was cultivated in two consecutive years. Plant samples and soils were collected for laboratory analyses after the two consecutive harvests. Compared with the unamended soil, the application of the three organic residues improved the fertility of the topsoil, showing a significant dose-dependent increase in TOC, N and P contents up to the highest dose of 60 t ha⁻¹ (0.74%, 0.06% and 40 mg kg⁻¹, respectively). The enhancement of soil nutrient status had a positive impact on grain yield (up to 51%). The distribution of trace elements in the soil, analysed by a sequential extraction procedure, revealed that MSWC increased acid-extractable Co and Cu and reducible Ni, while SSC increased reducible Co and Ni and oxidisable Cu, relative to the control soil.
Keywords: municipal solid waste compost, sewage sludge compost, fertility, trace metals
Procedia PDF Downloads 86
16344 The Studies of the Sorption Capabilities of the Porous Microspheres with Lignin
Authors: M. Goliszek, M. Sobiesiak, O. Sevastyanova, B. Podkoscielna
Abstract:
Lignin is one of the three main constituents of biomass, together with cellulose and hemicellulose. It is a complex biopolymer that contains a large number of functional groups in its structure, including aliphatic and aromatic hydroxyl groups, carboxylic groups and methoxy groups, which is why it shows potential for sorption processes. Lignin is a highly cross-linked polymer with a three-dimensional structure, which can provide a large surface area and pore volume. It can also possess better dispersion, diffusion and mass transfer behavior in the removal of, e.g., heavy-metal ions or aromatic pollutants. In this work, an emulsion-suspension copolymerization method was used to synthesize porous microspheres of divinylbenzene (DVB), styrene (St) and lignin; microspheres without the addition of lignin were also prepared for comparison. Before the copolymerization, lignin was modified with methacryloyl chloride to improve its reactivity with the other monomers. The physico-chemical properties of the obtained microspheres were studied, e.g., pore structures (adsorption-desorption measurements), thermal properties (DSC), tendency to swell and actual shapes. Due to their well-developed porous structure and the presence of functional groups, our materials may have great potential in sorption processes. To estimate the sorption capabilities of the microspheres towards phenol and its chlorinated derivatives, the off-line SPE (solid-phase extraction) method is going to be applied. This method has various advantages: it is low-cost, easy to use and enables rapid measurements for a large number of chemicals. The efficiency of the materials in removing phenols from aqueous solution and in desorption processes will be evaluated.
Keywords: microspheres, lignin, sorption, solid-phase extraction
Procedia PDF Downloads 181
16343 Terrain Classification for Ground Robots Based on Acoustic Features
Authors: Bernd Kiefer, Abraham Gebru Tesfay, Dietrich Klakow
Abstract:
The motivation of our work is to detect the different terrain types traversed by a robot based on acoustic data from the robot-terrain interaction. Different acoustic features and classifiers were investigated, such as Mel-frequency cepstral coefficients and Gammatone frequency cepstral coefficients for feature extraction, and Gaussian mixture models and feed-forward neural networks for classification. We analyze the system’s performance by comparing our proposed techniques with other features surveyed from related works. We achieve precision and recall values between 87% and 100% per class, and an average accuracy of 95.2%. We also study the effect of varying the audio chunk size in the application phase of the models and find only a mild impact on performance.
Keywords: acoustic features, autonomous robots, feature extraction, terrain classification
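As an illustration of this kind of pipeline, a minimal sketch pairing MFCC feature extraction with one Gaussian mixture model per terrain class is given below. It is a stand-in for the general approach, not the authors' implementation; the sampling rate, number of coefficients, mixture size and file layout are all assumptions.

```python
# Illustrative MFCC + GMM terrain classifier (assumed parameters, not the authors' code).
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def mfcc_features(wav_path, sr=16000, n_mfcc=13):
    """Load one audio chunk and return its frame-wise MFCC matrix (frames x coefficients)."""
    y, _ = librosa.load(wav_path, sr=sr)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T

def train_gmms(train_files, n_components=8):
    """train_files: dict mapping terrain label -> list of training wav paths (assumed layout)."""
    models = {}
    for label, paths in train_files.items():
        feats = np.vstack([mfcc_features(p) for p in paths])
        models[label] = GaussianMixture(n_components=n_components,
                                        covariance_type="diag").fit(feats)
    return models

def classify_chunk(models, wav_path):
    """Score each class model by its mean log-likelihood over the chunk's frames."""
    feats = mfcc_features(wav_path)
    scores = {label: gmm.score(feats) for label, gmm in models.items()}
    return max(scores, key=scores.get)
```

Varying the chunk length passed to classify_chunk would reproduce, in spirit, the audio-chunk-size experiment mentioned in the abstract.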
Procedia PDF Downloads 366
16342 Well-Defined Polypeptides: Synthesis and Selective Attachment of Poly(ethylene glycol) Functionalities
Authors: Cristina Lavilla, Andreas Heise
Abstract:
The synthesis of sequence-controlled polymers has received increasing attention in recent years. Well-defined polyacrylates, polyacrylamides and styrene-maleimide copolymers have been synthesized by sequential or kinetic addition of comonomers. However, this approach has not yet been applied to the synthesis of polypeptides, which are in fact polymers developed by nature in a sequence-controlled way. Polypeptides are natural materials that possess the ability to self-assemble into complex and highly ordered structures. Their folding and properties arise from precisely controlled sequences and compositions of their constituent amino acid monomers. So far, solid-phase peptide synthesis is the only technique that allows the preparation of short peptide sequences with excellent sequence control, but it requires extensive protection/deprotection steps and is difficult to scale up. A new strategy towards sequence control in the synthesis of polypeptides is introduced, based on the sequential addition of α-amino acid N-carboxyanhydrides (NCAs). The living ring-opening process is conducted to full conversion, and no purification or deprotection is needed before the addition of a new amino acid. The length of every block is predefined by the NCA:initiator ratio in each step. This method yields polypeptides with a specific sequence and controlled molecular weights. A series of polypeptides with varying block sequences have been synthesized with the aim of identifying structure-property relationships. All of them are able to adopt secondary structures similar to natural polypeptides and display properties in the solid state and in solution that are characteristic of the primary structure. By design, the prepared polypeptides allow selective modification of individual block sequences, which has been exploited to introduce functionalities at defined positions along the polypeptide chain. Poly(ethylene glycol) (PEG) was the functionality chosen, as it is known to favor hydrophilicity and also yield thermoresponsive materials. After PEGylation, the hydrophilicity of the polypeptides is enhanced, and their thermal response in H2O has been studied. Noteworthy differences in the behavior of polypeptides with different sequences have been found. Circular dichroism measurements confirmed that the α-helical conformation is stable over the examined temperature range (5-90 °C). It is concluded that the PEG units are mainly responsible for the changes in H-bonding interactions with H2O upon variation of temperature, and that the position of these functional units along the backbone is a factor of utmost importance in the resulting properties of the α-helical polypeptides.
Keywords: α-amino acid N-carboxyanhydrides, multiblock copolymers, poly(ethylene glycol), polypeptides, ring-opening polymerization, sequence control
Procedia PDF Downloads 199
16341 The Effect of Elapsed Time on the Cardiac Troponin-T Degradation and Its Utility as a Time Since Death Marker in Cases of Death Due to Burn
Authors: Sachil Kumar, Anoop K.Verma, Uma Shankar Singh
Abstract:
Studying the postmortem interval (PMI) in different causes of death is extremely important, since it often greatly assists in forming an opinion on the exact cause of death following such incidents. With reliable knowledge of the interval, an expert can assess whether the stated circumstances of death are genuine, which is why such deaths should ideally be evaluated at the crime scene before the autopsy is performed. The approach described here is based on analyzing the degradation (proteolysis) of a cardiac protein in cases of death due to burns as a marker of time since death. Cardiac tissue samples were collected from (n=6) medico-legal autopsies (Department of Forensic Medicine and Toxicology, King George's Medical University, Lucknow, India) after informed consent from the relatives, and postmortem degradation was studied by incubating the cardiac tissue at room temperature (20±2 °C) for different time periods (~7.30, 18.20, 30.30, 41.20, 41.40, 54.30, 65.20, and 88.40 hours). The cases included were subjects of burns without any prior history of disease who died in the hospital and whose exact time of death was known. The analysis involved extraction of the protein, separation by denaturing gel electrophoresis (SDS-PAGE) and visualization by Western blot using cTnT-specific monoclonal antibodies. The area of the bands within a lane was quantified by scanning and digitizing the image using Gel Doc. As time postmortem progresses, the intact cTnT band degrades into fragments that are easily detected by the monoclonal antibodies. A decreasing trend in the level of cTnT (% of intact) was found as the postmortem hours increased. A significant difference was observed between <15 h and other postmortem hours (p<0.01). A significant difference in cTnT level (% of intact) was also observed between 16-25 h and 56-65 h and >75 h (p<0.01). Western blot data clearly showed the intact protein at 42 kDa, three major fragments (28 kDa, 30 kDa, 10 kDa), three additional minor fragments (12 kDa, 14 kDa, and 15 kDa) and the formation of low-molecular-weight fragments. Overall, both PMI and the cardiac tissue of burned corpses had a statistically significant effect, with the greatest amount of protein breakdown observed within the first 41.40 hours, after which the intact protein slowly disappears. If the percent intact cTnT is calculated from the total area integrated within a Western blot lane, then the percent intact cTnT shows a pseudo-first-order relationship when plotted against the time postmortem. A strong significant positive correlation was found between cTnT and postmortem hours (r=0.87, p=0.0001), and the regression analysis showed good explained variability (R²=0.768). The postmortem troponin-T fragmentation observed in this study reveals a sequential, time-dependent process with the potential for use as a predictor of PMI in cases of burning.
Keywords: burn, degradation, postmortem interval, troponin-T
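The pseudo-first-order relationship mentioned above corresponds to an exponential decay of the intact fraction. A minimal formulation (the rate constant k is a fitted quantity, not a value reported in this abstract) is:

```latex
\[
  P(t) = P_0\, e^{-kt}
  \qquad\Longrightarrow\qquad
  \ln P(t) = \ln P_0 - kt ,
\]
```

so a linear regression of ln(percent intact cTnT) on postmortem time t estimates k, and the PMI of an unknown case could then be read off as t = (ln P₀ − ln P)/k.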
Procedia PDF Downloads 448
16340 Signature Verification System for a Banking Business Process Management
Authors: A. Rahaf, S. Liyakathunsia
Abstract:
In today’s world, banks face unprecedented operational pressure that tests the efficiency, effectiveness, and agility of their business processes. In a typical banking process, a person’s authorization is usually based on his or her signature on almost all transactions. Signature verification is therefore considered one of the most significant pieces of information needed for any bank document processing, and banks usually use it to authenticate the identity of individuals. In this paper, a business process model has been proposed in order to increase the quality of the verification process and to reduce the time and resources needed. In order to understand the current process, a survey was conducted and distributed among bank employees. After analyzing the survey, a process model was created using the Bizagi modeler, which helps in simulating the process after assigning time and cost to it. The outcomes show that the automation of the signature verification process is highly recommended for a banking business process.
Keywords: business process management, process modeling, quality, signature verification
Procedia PDF Downloads 421
16339 Demetallization of Crude Oil: Comparative Analysis of Deasphalting and Electrochemical Removal Methods of Ni and V
Authors: Nurlan Akhmetov, Abilmansur Yeshmuratov, Aliya Kurbanova, Gulnar Sugurbekova, Murat Baisariyev
Abstract:
Extraction of vanadium and nickel compounds is complex due to the high stability of porphyrins; nickel is a catalytic poison that deactivates catalysts during the catalytic cracking of oil, while vanadyl is abrasive yet valuable. Thus, the high concentration of Ni and V in crude oil makes their removal relevant. Two methods for the demetallization of crude oil were tested; the present research therefore provides a comparative analysis of deasphalting with organic solvents (cyclohexane, carbon tetrachloride, chloroform) and an electrochemical method. The percentage of Ni extraction reached a maximum of approximately 55% using the electrochemical method in an electrolysis cell, which was developed for this research and consists of three sections: an oil and protonating agent (EtOH) solution between two conducting membranes, which separate it from two capsules of 10% sulfuric acid, and two graphite electrodes which connect all three sections in an electrical circuit. Metal ions pass through the membranes and remain in the acid solutions. The best result was obtained in 60 minutes with an ethanol-to-oil ratio of 25% to 75%, respectively; the current remained in the range of 0.3 A to 0.4 A, and the voltage changed from 12.8 V to 17.3 V. The maximum efficiency of deasphalting, with cyclohexane as the solvent, in a Soxhlet extractor was 66.4% for Ni and 51.2% for V. Thus, applying voltammetry, ICP-MS (inductively coupled plasma mass spectrometry) and AAS (atomic absorption spectroscopy), these metal extraction methods were compared in this paper.
Keywords: electrochemistry, deasphalting of crude oil, demetallization of crude oil, petroleum engineering
Procedia PDF Downloads 232
16338 A Deep Learning Approach to Subsection Identification in Electronic Health Records
Authors: Nitin Shravan, Sudarsun Santhiappan, B. Sivaselvan
Abstract:
Subsection identification, in the context of Electronic Health Records (EHRs), is the task of identifying the important sections for downstream tasks like auto-coding. In this work, we classify the text present in EHRs according to its information, using machine learning and deep learning techniques. We first describe the problem briefly and formulate it as a text classification problem. Then, we discuss the methods from the literature. We try two approaches: traditional feature-extraction-based machine learning methods and deep learning methods. Through experiments on a private dataset, we establish that the deep learning methods perform better than the feature-extraction-based machine learning models.
Keywords: deep learning, machine learning, semantic clinical classification, subsection identification, text classification
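A minimal sketch of what a feature-extraction-based baseline of this kind could look like: TF-IDF features with a linear classifier for assigning EHR text to subsections. The snippets, labels and model choice are illustrative assumptions, not the authors' pipeline.

```python
# Toy TF-IDF + logistic regression baseline for subsection classification (assumed data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["history of present illness: chest pain for two days ...",
         "medications on admission: aspirin 81 mg daily ...",
         "past medical history includes hypertension and diabetes ...",
         "discharge medications: metoprolol 25 mg twice daily ..."]   # toy EHR snippets
labels = ["history", "medications", "history", "medications"]         # subsection labels

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # word and bigram features
    LogisticRegression(max_iter=1000),     # linear baseline classifier
)
model.fit(texts, labels)
print(model.predict(["home medications: lisinopril 10 mg daily ..."]))
```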
Procedia PDF Downloads 215
16337 Words Spotting in the Images Handwritten Historical Documents
Authors: Issam Ben Jami
Abstract:
Information retrieval in digital libraries is very important because the most famous historical documents hold significant value. Word spotting in historical documents is a very difficult task because the handwriting in such documents is naturally cursive and exhibits wide variability in the scale and translation of words within the same document. We first present a system for automatic recognition based on the extraction of interest points of words from the image model. The key-point extraction phase starts from a representation of the image as a synthetic description of the shape to be recognized in a multidimensional space. As a result, we use advanced methods that can find and describe interest points invariant to scale, rotation and lighting, which are linked to local configurations of pixels. We test this approach on documents of the 15th century. Our experiments give important results.
Keywords: feature matching, historical documents, pattern recognition, word spotting
Procedia PDF Downloads 273
16336 New Approaches for the Handwritten Digit Image Features Extraction for Recognition
Authors: U. Ravi Babu, Mohd Mastan
Abstract:
The present paper proposes a novel approach for a handwritten digit recognition system. It extracts digit image features based on a distance measure and derives an algorithm to classify the digit images. The distance measure is computed on the thinned image; thinning is one of the preprocessing techniques in image processing. The paper mainly concentrates on the extraction of features from the digit image for the effective recognition of numerals. To assess the effectiveness of the proposed method, it was tested on the MNIST, CENPARMI and CEDAR databases, as well as on newly collected data. The proposed method was applied to more than one lakh (100,000) digit images and gives good comparative recognition results, achieving a recognition rate of about 97.32%.
Keywords: handwritten digit recognition, distance measure, MNIST database, image features
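Since the abstract does not specify the exact distance measure, the sketch below only illustrates the general idea: the digit image is thinned (skeletonized) and a distance-based feature vector is computed from the skeleton. The centroid-to-skeleton distance histogram used here is an assumed example feature, not the authors' measure.

```python
# Illustrative thinning + distance-profile feature for a binary digit image.
import numpy as np
from skimage.morphology import skeletonize

def distance_profile(binary_digit, n_bins=16):
    """binary_digit: 2-D boolean array (True = ink).
    Returns a normalized histogram of distances from skeleton pixels to their centroid."""
    skel = skeletonize(binary_digit)          # thinning (preprocessing step)
    ys, xs = np.nonzero(skel)
    cy, cx = ys.mean(), xs.mean()             # centroid of the skeleton
    dists = np.hypot(ys - cy, xs - cx)
    hist, _ = np.histogram(dists, bins=n_bins, range=(0, dists.max() + 1e-9))
    return hist / hist.sum()                  # size-normalized feature vector

# The resulting fixed-length vectors can then be fed to any standard classifier.
```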
Procedia PDF Downloads 460
16335 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data
Authors: Adarsh Shroff
Abstract:
Big data is a collection of datasets so large and complex that it becomes difficult to process them using database management tools. Operations like search, analysis and visualization are performed on big data by using data mining, which is the process of extracting patterns or knowledge from large datasets. In recent years, the results of data mining applications have tended to become stale and obsolete over time. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also more sophisticated iterative computation, which is widely used in data mining applications, and incorporates a set of novel techniques to reduce I/O overhead for accessing preserved fine-grain computation states. To optimize the mining results, we evaluate i2MapReduce using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining.
Keywords: big data, map reduce, incremental processing, iterative computation
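To make the key idea concrete, here is a toy, pure-Python illustration of key-value-pair-level incremental processing: a word-count refresh that reuses the previously saved per-key state instead of recomputing from scratch. It mirrors the concept only and does not use the actual i2MapReduce API.

```python
# Conceptual sketch of incremental key-value refresh (not the i2MapReduce API).
from collections import Counter

def map_word_counts(docs):
    """Map step: emit (word, count) pairs aggregated over the given documents."""
    counts = Counter()
    for doc in docs:
        counts.update(doc.split())
    return counts

def incremental_update(saved_state, added_docs, removed_docs):
    """Apply only the delta input to the preserved state."""
    state = Counter(saved_state)
    state.update(map_word_counts(added_docs))       # add contributions of new documents
    state.subtract(map_word_counts(removed_docs))   # retract contributions of removed ones
    return +state                                   # unary plus drops keys that reach zero

# Full run once, then cheap refreshes as the dataset evolves.
state = map_word_counts(["big data mining", "incremental mining"])
state = incremental_update(state, added_docs=["iterative graph mining"],
                           removed_docs=["big data mining"])
print(dict(state))   # e.g. {'mining': 2, 'incremental': 1, 'iterative': 1, 'graph': 1}
```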
Procedia PDF Downloads 349
16334 The Use of a Rabbit Model to Evaluate the Influence of Age on Excision Wound Healing
Authors: S. Bilal, S. A. Bhat, I. Hussain, J. D. Parrah, S. P. Ahmad, M. R. Mir
Abstract:
Background: Wound healing involves a highly coordinated cascade of cellular and immunological responses over a period of time, including coagulation, inflammation, granulation tissue formation, epithelialization, collagen synthesis and tissue remodeling. Wounds in aged animals heal more slowly than those in younger ones, mainly because of comorbidities that occur with age. The present study examines the influence of age on wound healing. Wounds of 1 × 1 cm² (100 mm²) were created on the back of each animal. The animals were divided into two groups: one group contained animals aged 3-9 months, while the other contained animals aged 15-21 months. Materials and Methods: 24 clinically healthy rabbits in the age group of 3-21 months were used as experimental animals and divided into two groups, viz. A and B. All experimental procedures, i.e., the excision wound model, measurement of wound area, protein extraction and estimation, and DNA extraction and estimation, were performed by standard methods. Results: The parameters studied were wound contraction, hydroxyproline, glucosamine, protein, and DNA. A significant increase (p<0.005) in hydroxyproline, glucosamine, protein and DNA and a significant decrease in wound area (p<0.005) were observed in the 3-9 month age group compared to the 15-21 month age group. Wound contraction, together with the hydroxyproline, glucosamine, protein and DNA estimations, suggests that advanced age results in retarded wound healing. Conclusion: The decreased wound contraction and accumulation of hydroxyproline, glucosamine, protein and DNA in group B animals may be associated with a reduction or delay in growth factors because of advancing age.
Keywords: age, wound healing, excision wound, hydroxyproline, glucosamine
Procedia PDF Downloads 658
16333 A Computer-Aided System for Tooth Shade Matching
Authors: Zuhal Kurt, Meral Kurt, Bilge T. Bal, Kemal Ozkan
Abstract:
Shade matching and reproduction is the most important element of success in prosthetic dentistry. Until recently, the shade matching procedure was carried out by the dentist's visual perception with the help of shade guides. Since many factors influence visual perception, tooth shade matching using visual devices (shade guides) is highly subjective and inconsistent. The subjective nature of this process has led to the development of instrumental devices. Nowadays, colorimeters, spectrophotometers, spectroradiometers and digital image analysing systems are used for instrumental shade selection. Instrumental devices have the advantage that readings are quantifiable and can be obtained more rapidly, simply, objectively and precisely. However, these devices have noticeable drawbacks. For example, the translucent structure and irregular surfaces of teeth lead to measurement defects, and results acquired by devices with different measurement principles may be inconsistent. It is therefore necessary to search for new methods for the dental shade matching process. The digital camera, as a computer-aided system device, has developed rapidly; advances in image processing and computing have resulted in the extensive use of digital cameras for color imaging, and this procedure is much cheaper than the use of traditional contact-type color measurement devices. Digital cameras can take the place of contact-type instruments for shade selection and overcome their disadvantages. Images taken of teeth show the morphology and color texture of the teeth. In recent decades, a new method was recommended to compare the color of shade tabs captured by a digital camera using color features; this method showed that visual and computer-aided shade matching systems should be used in combination. Recent feature extraction techniques are based on shape description and do not use color information. However, color is mostly experienced as an essential property in depicting and extracting features from the objects in the world around us. When local feature descriptors are extended by concatenating a color descriptor with the shape descriptor, the resulting descriptor is effective for visual object recognition and classification tasks. The color descriptor used in combination with a shape descriptor does not need to contain any spatial information, which leads us to use local histograms. This local color histogram method remains reliable under photometric changes, geometrical changes and variations in image quality. Therefore, color-aware local feature extraction methods are used to extract features, and the Scale Invariant Feature Transform (SIFT) descriptor is used for shape description in the proposed method. After the combination of these descriptors, the state-of-the-art descriptor known as Color-SIFT is used in this study. Finally, the image feature vectors obtained from a quantization algorithm are fed to classifiers such as k-Nearest Neighbor (KNN), Naive Bayes or Support Vector Machines (SVM) to determine the label(s) of the visual object category or matching. In this study, SVMs are used as classifiers for color determination and shade matching, and the experimental results of this method will be compared with other recent studies. It is concluded from the study that the proposed method is a remarkable development in computer-aided tooth shade determination systems.
Keywords: classifiers, color determination, computer-aided system, tooth shade matching, feature extraction
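The sketch below gives one plausible reading of this pipeline: SIFT keypoint descriptors augmented with local color histograms ("Color-SIFT"-style), quantized into a bag-of-visual-words vector and classified with an SVM. The codebook size, histogram bins and kernel are assumptions, and the images and shade labels are placeholders; this is not the authors' implementation.

```python
# Simplified Color-SIFT-like bag-of-visual-words + SVM sketch (assumed parameters).
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

sift = cv2.SIFT_create()

def color_sift_descriptors(bgr_image, hist_bins=8):
    """Concatenate each SIFT descriptor with a normalized local BGR histogram."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    keypoints, desc = sift.detectAndCompute(gray, None)
    if desc is None:
        return np.empty((0, 128 + 3 * hist_bins))
    rows = []
    for kp, d in zip(keypoints, desc):
        x, y, r = int(kp.pt[0]), int(kp.pt[1]), max(4, int(kp.size))
        patch = bgr_image[max(0, y - r):y + r, max(0, x - r):x + r]
        hist = np.concatenate([cv2.calcHist([patch], [c], None, [hist_bins], [0, 256]).ravel()
                               for c in range(3)])
        rows.append(np.concatenate([d, hist / (hist.sum() + 1e-9)]))  # shape + color descriptor
    return np.vstack(rows)

def bovw_vector(desc, kmeans):
    """Quantize descriptors against the codebook into a normalized visual-word histogram."""
    words = kmeans.predict(desc) if len(desc) else np.empty(0, dtype=int)
    hist = np.bincount(words, minlength=kmeans.n_clusters).astype(float)
    return hist / (hist.sum() + 1e-9)

def train(train_images, train_labels, n_words=64):
    """train_images: list of BGR tooth photographs; train_labels: shade-tab labels (assumed)."""
    all_desc = np.vstack([color_sift_descriptors(img) for img in train_images])
    kmeans = KMeans(n_clusters=n_words, n_init=10).fit(all_desc)
    X = np.array([bovw_vector(color_sift_descriptors(img), kmeans) for img in train_images])
    return kmeans, SVC(kernel="rbf").fit(X, train_labels)
```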
Procedia PDF Downloads 443
16332 Youth and Radicalization: Main Causes Who Lead Young People to Radicalize in a Context with Background of Radicalization
Authors: Zineb Emrane
Abstract:
This abstract addresses the issue of the radicalization of young people in a context with a background of radicalization in northern Morocco; five of the terrorists involved in the Madrid attacks of 11 March came from this context. A pilot study was carried out describing young people's perceptions of the main causes that lead to and motivate radicalization. Whenever we talk about this topic, we obtain information from studies and investigations by specialists in the field, but we do not give a voice to the protagonists, who in many cases are victims: specifically, young people at social risk because of social factors. Extremist radicalization is an expanding phenomenon that affects young people in northern Morocco. They live in a context with a radical background and at risk of social exclusion; their social, economic and family needs make them vulnerable. Extremist groups take advantage of this vulnerability to involve them in a process of radicalization, offering them an alternative environment where they can find everything they are looking for. This pilot study examines the main causes that lead and motivate young people to become radicalized, analyzing their context with emphasis on influencing factors, and taking into account young people's own analysis of how the radical background affects them and their opinion of this phenomenon. The pilot study was carried out through the following actions: group dynamics with young people to analyze the process of violent radicalization; a participatory workshop with members of organizations that work directly with young people at risk of radicalization; interviews with institutional managers; and participant observation. The implementation of these actions has led to the conclusion that young people define violent radicalization as a sequential process that, depending on the stage, can be deconstructed. Young people recognize that they stop feeling a sense of belonging to their family, school and neighborhood when they see behavior contrary to their own sense of good and evil. The emotional rupture and the search for references outside their circle push them to sympathize with groups that have an extremist ideology and that offer them what they need. Radicalization is a process with different stages; the main causes and factors which lead young people to use extremist violence are related to their weak sense of belonging to their context and a lack of critical thinking about important issues. Young people are at a vulnerable stage, searching for their identity and for a space in which they can be accepted, and when they do not find it, they are easily manipulated and susceptible to being attracted by extremist groups.
Keywords: exclusion, radicalization, vulnerability, youth
Procedia PDF Downloads 161
16331 Synthesis of Liposomal Vesicles by a Novel Supercritical Fluid Process
Authors: Wen-Chyan Tsai, Syed S. H. Rizvi
Abstract:
Organic solvent residues are always associated with liposomes produced by traditional techniques such as thin-film hydration and reverse-phase evaporation, which limits the applications of these vesicles in the pharmaceutical, food and cosmetic industries. Our objective was to develop a novel and benign process of liposomal microencapsulation using supercritical carbon dioxide (SC-CO2) as the sole phospholipid-dissolving medium and a green substitute for organic solvents. This process consists of supercritical fluid extraction followed by rapid expansion via a nozzle and automatic cargo suction. Lecithin and cholesterol mixed in a 10:1 mass ratio were dissolved in SC-CO2 at 20 ± 0.5 MPa and 60 °C. After at least two hours of equilibration, the lecithin/cholesterol-laden SC-CO2 was passed through a 1000-micron nozzle and immediately mixed with the cargo solution to form liposomes. Liposomal microencapsulation was conducted at three pressures (8.27, 12.41, 16.55 MPa), three temperatures (75, 83 and 90 °C) and two flow rates (0.25 ml/sec and 0.5 ml/sec). Liposome size, zeta potential and encapsulation efficiency were characterized as functions of the operating parameters. The average liposomal size varied from 400-500 nm to 1000-1200 nm when the pressure was increased from 8.27 to 16.55 MPa. At 12.41 MPa, 90 °C and a 0.2 M glucose cargo loading rate of 0.25 ml per second, the highest encapsulation efficiency of 31.65% was achieved. Under a confocal laser scanning microscope, large unilamellar vesicles and multivesicular vesicles were observed to make up the majority of the liposomal emulsion. This new approach is a rapid and continuous process for the bulk production of liposomes using a green solvent. Based on the results to date, it is feasible to apply this technique to encapsulate hydrophilic compounds inside the aqueous core as well as lipophilic compounds in the phospholipid bilayers of the liposomes for controlled release, solubility improvement and targeted therapy of bioactive compounds.
Keywords: liposome, microencapsulation, supercritical carbon dioxide, non-toxic process
Procedia PDF Downloads 428
16330 Comparison of Machine Learning and Deep Learning Algorithms for Automatic Classification of 80 Different Pollen Species
Authors: Endrick Barnacin, Jean-Luc Henry, Jimmy Nagau, Jack Molinie
Abstract:
Palynology is a field of interest in many disciplines due to its multiple applications: chronological dating, climatology, allergy treatment, and honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions. That is why the need for automation of this task is urgent. Many studies have investigated the subject using different standard image processing descriptors and sometimes hand-crafted ones. In this work, we make a comparative study between classical feature extraction methods (shape, GLCM, LBP, and others) and deep learning (CNN, autoencoders, transfer learning) to perform a recognition task over 80 regional pollen species. It has been found that the use of transfer learning appears to be more precise than the other approaches.
Keywords: pollen identification, feature extraction, pollen classification, automated palynology
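As an example of the transfer-learning branch of this comparison, here is a minimal sketch with an assumed ImageNet-pretrained backbone and input size; the abstract does not state which architecture or framework was actually used.

```python
# Minimal transfer-learning sketch: frozen pretrained CNN + small classification head.
import tensorflow as tf

NUM_CLASSES = 80                 # 80 regional pollen species
IMG_SIZE = (224, 224)            # assumed input resolution

base = tf.keras.applications.MobileNetV2(input_shape=IMG_SIZE + (3,),
                                         include_top=False, weights="imagenet")
base.trainable = False           # keep the pretrained features frozen

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_dataset, validation_data=val_dataset, epochs=10)   # datasets assumed
```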
Procedia PDF Downloads 135
16329 Keyframe Extraction Using Face Quality Assessment and Convolution Neural Network
Authors: Rahma Abed, Sahbi Bahroun, Ezzeddine Zagrouba
Abstract:
Due to the huge amount of data in videos, extracting the relevant frames has become a necessity and an essential step prior to performing face recognition. In this context, we propose a method for extracting keyframes from videos based on face quality and deep learning for a face recognition task. This method has two steps. We start by generating face quality scores for each face image based on the use of three face feature extractors: Gabor, LBP, and HOG. The second step consists of training a deep convolutional neural network in a supervised manner in order to select the frames that have the best face quality. The obtained results show the effectiveness of the proposed method compared to state-of-the-art methods.
Keywords: keyframe extraction, face quality assessment, face in video recognition, convolution neural network
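A rough sketch of the first step, ranking face crops by a hand-crafted quality score built from HOG and LBP features (Gabor is omitted for brevity). The scoring heuristic below is an assumption for illustration only; the abstract does not give the exact formulation.

```python
# Illustrative face-quality scoring and top-k frame selection (assumed heuristic).
import numpy as np
from skimage.feature import hog, local_binary_pattern

def face_quality_score(gray_face):
    """gray_face: 2-D grayscale face crop (e.g. 128x128). Higher score = sharper texture."""
    hog_vec = hog(gray_face, pixels_per_cell=(16, 16), cells_per_block=(2, 2))
    lbp = local_binary_pattern(gray_face, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, density=True)
    # Assumed heuristic: sharper, more textured faces yield higher-energy descriptors.
    return float(np.linalg.norm(hog_vec) + np.std(lbp_hist))

def select_keyframes(face_crops, k=5):
    """Rank the frames' face crops by quality and return the indices of the k best."""
    scores = np.array([face_quality_score(f) for f in face_crops])
    return sorted(np.argsort(scores)[-k:].tolist())
```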
Procedia PDF Downloads 229
16328 Antibacterial and Antioxidant Properties of Total Phenolics from Waste Orange Peels
Authors: Kanika Kalra, Harmeet Kaur, Dinesh Goyal
Abstract:
Total phenolics were extracted from waste orange peels by solvent extraction and an alkali hydrolysis method. The most efficient solvents for extracting phenolic compounds from the waste biomass were methanol (60%) > dimethyl sulfoxide > ethanol (60%) > distilled water. The extraction yields were significantly affected by the solvents (ethanol, methanol, and dimethyl sulfoxide) due to their varying polarity and concentrations. Extraction of phenolics using 60% methanol yielded the highest phenolics, in terms of gallic acid equivalent (GAE) per gram of biomass, in orange peels. The alkali-hydrolyzed extract from orange peels contained 7.58±0.33 mg GAE g⁻¹. Using the solvent extraction technique, it was observed that 60% methanol is comparatively the best-suited solvent for extracting polyphenolic compounds, giving a maximum yield of 4.68 ± 0.47 mg GAE g⁻¹ in orange peel extracts. The DPPH radical scavenging activity and reducing power of the orange peel extracts were checked: the 60% methanolic extract showed the highest antioxidant activity (85.50±0.009% for DPPH), and the dimethyl sulfoxide (DMSO) extract gave the highest value (1.75±0.01%) for the reducing power of the orange peel extract. Characterization of the polyphenolic compounds was carried out using Fourier transform infrared (FTIR) spectroscopy. Solvent and alkali-hydrolysed extracts were evaluated for antibacterial activity using the agar well diffusion method against Gram-positive Bacillus subtilis MTCC441 and Gram-negative Escherichia coli MTCC729. The methanolic extract at a concentration of 300 µl showed an inhibition zone of around 16.33±0.47 mm against Bacillus subtilis, whereas for Escherichia coli it was comparatively smaller. A broth-based turbidimetric assay revealed the antibacterial effect of different volumes of orange peel extracts against both organisms.
Keywords: orange peels, total phenolic content, antioxidant, antibacterial
Procedia PDF Downloads 70
16327 Identification of Coauthors in Scientific Database
Authors: Thiago M. R Dias, Gray F. Moita
Abstract:
The analysis of scientific collaboration networks has contributed significantly to improving the understanding of how collaboration between researchers takes place, and of how the scientific production of researchers or research groups evolves. However, the identification of collaborations in large scientific databases is not a trivial task, given the high computational cost of the methods commonly used. This paper proposes a method for identifying collaborations in a large database of researcher curricula. The proposed method has low computational cost and gives satisfactory results, proving to be an interesting alternative for the modeling and characterization of large scientific collaboration networks.
Keywords: extraction, data integration, information retrieval, scientific collaboration
Procedia PDF Downloads 395
16326 Assessment of Forest Resource Exploitation in the Rural Communities of District Jhelum
Authors: Rubab Zafar Kahlon, Ibtisam Butt
Abstract:
Forest resources are deteriorating and experiencing decline around the globe due to unsustainable use and over-exploitation. The present study was an attempt to determine the relationship between human activities, forest resource utilization, extraction methods and practices of forest resource exploitation in the Jhelum district of Pakistan. For this purpose, primary data were used, collected from 8 villages through a structured questionnaire, tabulated in Microsoft Excel 365, and analysed by multiple linear regression in SPSS 22. The results revealed that farming, wood cutting, animal husbandry and agro-forestry were the major occupations in the study area. The most commonly used resources included timber (26%), fuelwood (25%) and fodder (19%). The methods used for resource extraction included gathering (49%), plucking (34%), trapping (11%) and cutting (6%). Population growth, increased demand for fuelwood and land conversion were the main reasons behind forest degradation. The results of the multiple linear regression revealed that forest-based activities, sources of energy production, methods used for wood harvesting and resource extraction, and the use of fuelwood for energy production contributed significantly to extensive forest resource exploitation (p value < 0.5) within the study area. The study suggests that effective measures should be taken by the forest department to control the unsustainable use of forest resources through stringent management interventions and awareness campaigns in Jhelum district.
Keywords: forest resources, biodiversity, exploitation, human activities
Procedia PDF Downloads 91
16325 FPGA Implementation of the BB84 Protocol
Authors: Jaouadi Ikram, Machhout Mohsen
Abstract:
The development of a quantum key distribution (QKD) system on a field-programmable gate array (FPGA) platform is the subject of this paper. A quantum cryptographic protocol is designed based on the properties of quantum information and the characteristics of FPGAs. The proposed protocol performs key extraction, reconciliation, error correction, and privacy amplification tasks to generate a perfectly secret final key. We modeled the presence of a spy in our system with a strategy that reveals some of the exchanged information without being noticed. Using an FPGA card with a 100 MHz clock frequency, we have demonstrated the evolution of the error rate, as well as of the amounts of mutual information (between the two interlocutors, and with the spy), as the key generation process moves from one step to the next.
Keywords: QKD, BB84, protocol, cryptography, FPGA, key, security, communication
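For context, a toy software simulation of the BB84 sifting step and of the error rate introduced by an intercept-resend spy is sketched below. It only illustrates the protocol's logic and is unrelated to the paper's FPGA implementation; reconciliation, error correction and privacy amplification are omitted.

```python
# Toy BB84 sifting simulation (software illustration only, not the FPGA design).
import random

def bb84_sift(n_qubits=1024, eavesdrop=False):
    alice_bits  = [random.randint(0, 1) for _ in range(n_qubits)]
    alice_bases = [random.randint(0, 1) for _ in range(n_qubits)]   # 0 = rectilinear, 1 = diagonal
    bob_bases   = [random.randint(0, 1) for _ in range(n_qubits)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:                          # intercept-resend spy measuring in a random basis
            e_basis = random.randint(0, 1)
            bit = bit if e_basis == a_basis else random.randint(0, 1)
            a_basis = e_basis                  # the photon is re-sent in the spy's basis
        bob_bits.append(bit if b_basis == a_basis else random.randint(0, 1))

    # Sifting: Alice and Bob publicly compare bases and keep only the matching positions.
    keep = [i for i in range(n_qubits) if alice_bases[i] == bob_bases[i]]
    sifted_a = [alice_bits[i] for i in keep]
    sifted_b = [bob_bits[i] for i in keep]
    qber = sum(a != b for a, b in zip(sifted_a, sifted_b)) / max(1, len(keep))
    return len(keep), qber

print(bb84_sift())                  # ~0 QBER without noise or a spy
print(bb84_sift(eavesdrop=True))    # QBER close to 25% with an intercept-resend spy
```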
Procedia PDF Downloads 181
16324 Effect of Hemicellulase on Extraction of Essential Oil from Algerian Artemisia campestris
Authors: Khalida Boutemak, Nasssima Benali, Nadji Moulai-Mostefa
Abstract:
The effect of enzyme pre-treatment on the yield and chemical composition of Artemisia campestris essential oil is reported in the present study. It was demonstrated that the enzyme facilitated the extraction of the essential oil, increasing the oil yield without causing any noticeable change in the flavour profile of the volatile oil. The essential oil was tested for antibacterial activity using Escherichia coli, which was extremely sensitive to the control oil, with the largest inhibition zone (29 mm), whereas Staphylococcus aureus was the most sensitive to the essential oil obtained with enzymatic pre-treatment, with the largest inhibition zone (25 mm). The antioxidant activity of the essential oil with hemicellulase pre-treatment (EO2) and of the control sample (EO1) was determined through reducing power. It was significantly lower than that of the standard drug (vitamin C), in this order: vitamin C > EO2 > EO1.
Keywords: Artemisia campestris, enzyme pre-treatment, hemicellulase, antibacterial activity, antioxidant activity
Procedia PDF Downloads 328
16323 Intrusion Detection System Using Linear Discriminant Analysis
Authors: Zyad Elkhadir, Khalid Chougdali, Mohammed Benattou
Abstract:
Most of the existing intrusion detection systems work on quantitative network traffic data with many irrelevant and redundant features, which makes the detection process more time-consuming and less accurate. Several feature extraction methods, such as linear discriminant analysis (LDA), have been proposed. However, LDA suffers from the small sample size (SSS) problem, which occurs when the number of training samples is small compared with the sample dimension; hence, classical LDA cannot be applied directly to high-dimensional data such as network traffic data. In this paper, we propose two solutions to the SSS problem for LDA and apply them to a network IDS. The first method reduces the dimension of the original data using principal component analysis (PCA) and then applies LDA. In the second solution, we propose using the pseudo-inverse to avoid the singularity of the within-class scatter matrix caused by the SSS problem. After that, the KNN algorithm is used for the classification process. We have chosen two well-known datasets, KDDcup99 and NSL-KDD, for testing the proposed approaches. The results showed that the classification accuracy of the (PCA+LDA) method clearly outperforms the pseudo-inverse LDA method when large training data are available.
Keywords: LDA, pseudo-inverse, PCA, IDS, NSL-KDD, KDDcup99
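A compact sketch of the two workarounds described above (PCA followed by LDA, and an LDA projection computed with a pseudo-inverse of the within-class scatter matrix), with KNN as the final classifier. The component counts, neighbor count and data handling are assumptions; X and y stand for preprocessed KDDcup99/NSL-KDD feature vectors and attack labels.

```python
# Sketch of PCA+LDA+KNN and pseudo-inverse LDA+KNN (assumed parameters, not the authors' code).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

def pca_lda_knn(n_pca=20, n_neighbors=5):
    """Solution 1: PCA removes the null space so the within-class scatter becomes
    non-singular, then LDA projects and KNN classifies."""
    return make_pipeline(PCA(n_components=n_pca),
                         LinearDiscriminantAnalysis(),
                         KNeighborsClassifier(n_neighbors=n_neighbors))

def pinv_lda_projection(X, y, n_components):
    """Solution 2: classical LDA projection built with a pseudo-inverse of the
    within-class scatter matrix, so it stays defined even when that matrix is singular."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)                    # within-class scatter
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)                  # between-class scatter
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-eigvals.real)
    return eigvecs.real[:, order[:n_components]]         # projection matrix W

# Usage sketch (assumed data):
#   model = pca_lda_knn().fit(X_train, y_train); acc = model.score(X_test, y_test)
#   W = pinv_lda_projection(X_train, y_train, n_components=4)
#   knn = KNeighborsClassifier().fit(X_train @ W, y_train); acc = knn.score(X_test @ W, y_test)
```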
Procedia PDF Downloads 225
16322 A Multi-Family Offline SPE LC-MS/MS Analytical Method for Anionic, Cationic and Non-ionic Surfactants in Surface Water
Authors: Laure Wiest, Barbara Giroud, Azziz Assoumani, Francois Lestremau, Emmanuelle Vulliet
Abstract:
Due to their production in high tonnages and their extensive use, surfactants are among the contaminants determined at the highest concentrations in wastewater. However, analytical methods and data regarding their occurrence in river water are scarce and concern only a few families, mainly anionic surfactants. The objective of this study was to develop an analytical method to extract and analyze a wide variety of surfactants in a minimum of steps, with a sensitivity compatible with the detection of ultra-traces in surface waters. 27 substances from 12 families of surfactants (anionic, cationic and non-ionic) were selected for method optimization. Different retention mechanisms for solid-phase extraction (SPE) were tested and compared in order to improve their detection by liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). The best results were finally obtained with a C18-grafted silica LC column and a polymer cartridge with hydrophilic-lipophilic balance (HLB), and the method developed allows the extraction of the three types of surfactants with satisfactory recoveries. The final analytical method comprised only one extraction and two LC injections. It was validated and applied to the quantification of surfactants in 36 river samples. The method's limits of quantification (LQ), intra- and inter-day precision, and accuracy were evaluated, and good performances were obtained for the 27 substances. As these compounds have many areas of application, contamination of instrument and method blanks was observed and taken into account in the determination of the LQs. Nevertheless, with LQs between 15 and 485 ng/L and accuracy above 80%, this method is suitable for monitoring surfactants in surface waters. Application to French river samples revealed the presence of anionic, cationic and non-ionic surfactants, with median concentrations ranging from 24 ng/L for octylphenol ethoxylates (OPEO) to 4.6 µg/L for linear alkylbenzenesulfonates (LAS). The analytical method developed in this work will therefore be useful for future monitoring of surfactants in waters; moreover, since it shows good performance for anionic, non-ionic and cationic surfactants, it may easily be adapted to other surfactants.
Keywords: anionic surfactant, cationic surfactant, LC-MS/MS, non-ionic surfactant, SPE, surface water
Procedia PDF Downloads 144
16321 Gas Chromatography Coupled to Tandem Mass Spectrometry and Liquid Chromatography Coupled to Tandem Mass Spectrometry Qualitative Determination of Pesticides Found in Tea Infusions
Authors: Mihai-Alexandru Florea, Veronica Drumea, Roxana Nita, Cerasela Gird, Laura Olariu
Abstract:
The aim of this study was to investigate pesticide residues found in tea infusions. A multi-residue method to determine 147 pesticides was developed using the QuEChERS (Quick, Easy, Cheap, Effective, Rugged, Safe) procedure and dispersive solid-phase extraction (d-SPE) for the cleanup of the pesticides from complex matrices such as plants and tea. Sample preparation was carefully optimized for the efficient removal of co-extracted matrix components by testing several solvent systems. Determination of the pesticides was performed using GC-MS/MS (100 pesticides) and LC-MS/MS (47 pesticides). The selected reaction monitoring (SRM) mode was chosen to achieve low detection limits and high compound selectivity and sensitivity. The overall performance was evaluated and validated according to the DG-SANTE guidelines. To assess the pesticide residue transfer rate (qualitatively) from dried tea to infusions, the tea samples were spiked with a mixture of pesticides at the maximum residue levels accepted for teas and herbal infusions. In order to investigate the release of the pesticides in tea preparations, the medicinal plants were prepared in four ways by varying the water temperature and the infusion time. The pesticides in the infusions were extracted using two methods, QuEChERS versus solid-phase extraction (SPE), and more than 90% of the pesticides studied were identified in the infusions.
Keywords: tea, solid-phase extraction (SPE), selected reaction monitoring (SRM), QuEChERS
Procedia PDF Downloads 212