Search results for: welding process selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17058

12618 Flocculation on the Treatment of Olive Oil Mill Wastewater: Pre-Treatment

Authors: G. Hodaifa, J. A. Páez, C. Agabo, E. Ramos, J. C. Gutiérrez, A. Rosal

Abstract:

Currently, the continuous two-phase decanter process used for olive oil production is the most internationally widespread. The wastewaters generated by this industry (OMW) are a real environmental problem because of their high organic load. Among the treatments proposed for these wastewaters, the advanced oxidation technologies (Fenton process, ozone, photo-Fenton, etc.) are the most favourable. The direct application of these processes is somewhat expensive. Therefore, the application of a previous stage based on a flocculation-sedimentation operation is of high importance. In this research, five commercial flocculants (three cationic and two anionic) have been used to achieve the separation of phases (clarified liquid and sludge). For each flocculant, different concentrations (0-1000 mg/L) have been studied. In these experiments, the sludge volume formed over time and the final water quality were determined. The final removal percentages of total phenols (11.3-25.1%), COD (5.6-20.4%), total carbon (2.3-26.5%), total organic carbon (1.50-23.8%), total nitrogen (1.45-24.8%), and turbidity (27.9-61.4%) were obtained. The variation in the electric conductivity reduction percentage (1-8%) was also determined. Finally, the flocculants with the highest removal percentages were identified (QG2001 and Flocudex CS49).
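
The removal percentages reported above follow the usual definition of removal efficiency; a minimal sketch (with hypothetical influent and effluent concentrations, not the study's raw data) is:

```python
def removal_percent(c_in: float, c_out: float) -> float:
    """Removal efficiency of a pollutant, in percent."""
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical COD concentrations (mg/L) before and after
# flocculation-sedimentation; 20% falls inside the reported COD range
print(removal_percent(5000.0, 4000.0))  # 20.0
```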

Keywords: flocculants, flocculation, olive oil mill wastewater, water quality

Procedia PDF Downloads 540
12617 Historical Development of Negative Emotive Intensifiers in Hungarian

Authors: Martina Katalin Szabó, Bernadett Lipóczi, Csenge Guba, István Uveges

Abstract:

In this study, an exhaustive analysis was carried out on the historical development of negative emotive intensifiers in the Hungarian language via NLP methods. Intensifiers are linguistic elements which modify or reinforce a variable character in the lexical unit they apply to. Intensifiers therefore appear with other lexical items, such as adverbs, adjectives, and verbs, and infrequently with nouns. Due to the complexity of this phenomenon (a set of sociolinguistic, semantic, and historical aspects), many lexical items can operate as intensifiers. Intensifiers are admittedly one of the most rapidly changing groups of elements in the language. From a linguistic point of view, a special group of intensifiers is particularly interesting: the so-called negative emotive intensifiers, which, on their own and without context, have semantic content that can be associated with negative emotion, but in particular cases may function as intensifiers (e.g. borzasztóan jó 'awfully good', which means 'excellent'). Despite their special semantic features, negative emotive intensifiers are scarcely examined in the literature using large historical corpora and NLP methods. In order to become better acquainted with trends over time concerning these intensifiers, the authors exhaustively analysed a specific historical corpus, namely the Magyar Történeti Szövegtár (Hungarian Historical Corpus). This corpus (containing 3 million text words) is a collection of texts of various genres and styles, produced between 1772 and 2010. Since the corpus consists of raw texts and does not contain any additional information about the language features of the data (such as stemming or morphological analysis), a large amount of manual work was required to process the data. Thus, based on a lexicon of negative emotive intensifiers compiled in a previous phase of the research, every occurrence of each intensifier was queried, and the results were stored in a separate data frame. Then, basic linguistic processing (POS-tagging, lemmatization, etc.) was carried out automatically with the 'magyarlanc' NLP toolkit. Finally, the frequency and collocation features of all the negative emotive words were automatically analyzed in the corpus. The outcomes of the research revealed in detail how these words have proceeded through grammaticalization over time, i.e., they change from lexical elements to grammatical ones and slowly go through a delexicalization process (their negative content diminishes over time). What is more, it was also pointed out which negative emotive intensifiers are at the same stage of this process in the same time period. Taking a closer look at the different domains of the analysed corpus, it also became clear that during this process the importance of the pragmatic role increases: the newer use expresses the speaker's subjective, evaluative opinion at a certain level.
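
The corpus-querying step described above, counting each intensifier's occurrences and its collocates, can be sketched in a few lines; the lexicon, whitespace tokenizer, and sample sentence here are illustrative assumptions, not the study's actual pipeline:

```python
from collections import Counter, defaultdict

# Toy lexicon of negative emotive intensifiers; the study used a lexicon
# compiled in an earlier phase of the research (these entries are assumptions)
INTENSIFIERS = {"borzasztóan", "rettenetesen", "szörnyen"}

def frequency_and_collocations(tokens):
    """Count each intensifier's occurrences and its right-hand collocates
    (the token it most plausibly modifies, in this naive sketch)."""
    freq = Counter()
    colloc = defaultdict(Counter)
    for i, tok in enumerate(tokens):
        if tok in INTENSIFIERS:
            freq[tok] += 1
            if i + 1 < len(tokens):
                colloc[tok][tokens[i + 1]] += 1
    return freq, colloc

tokens = "ez borzasztóan jó volt és szörnyen fárasztó".split()
freq, colloc = frequency_and_collocations(tokens)
print(freq["borzasztóan"], colloc["borzasztóan"]["jó"])  # 1 1
```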

Keywords: historical corpus analysis, historical linguistics, negative emotive intensifiers, semantic changes over time

Procedia PDF Downloads 233
12616 Analysing Modern City Heritage through Modernization Transformation: A Case of Wuhan, China

Authors: Ziwei Guo, Liangping Hong, Zhiguo Ye

Abstract:

The exogenous modernization process in China and other late-coming countries did not result from a gradual growth of their own modernity features, but from a conscious response to external challenges. In this context, it was equally important for Chinese cities to make themselves 'Chinese' as well as 'modern'. Wuhan was the first inland treaty port opened in the late Qing Dynasty. Over the following one hundred years, Wuhan transformed from a feudal town into a modern industrial city. It is a good example with which to illustrate urban construction and cultural heritage through the process and impact of social transformation. An overall perspective on this transformation will contribute to developing the city's uniqueness and enhancing its inclusive development. The study takes the history of Wuhan from 1861 to 1957 as its study period. The whole transformation process is divided into four typical periods based on key historical events, and the paper analyzes the changes in urban structure and construction activities in each period. Then, numerous examples are used to compare the features of Wuhan's modern city heritage across the four periods. In this way, three characteristics of Wuhan's modern city heritage are summarized. The paper finds that globalization and localization worked together to shape the urban physical space environment. For Wuhan, social transformation had a profound and comprehensive impact on urban construction, which can be analyzed in terms of main construction, architectural style, location, and actors. Moreover, the three towns of Wuhan have a disparate cityscape that is reflected in the varied heritages and architectural features of the different transformation periods. Lastly, the protection regulations and conservation planning of heritage in Wuhan are discussed, and suggestions for the conservation of Wuhan's modern heritage are offered. The implications of the study are to provide a new perspective on modern city heritage for cities like Wuhan; the future local planning system and heritage conservation policies can take into consideration the 'Modern Cultural Transformation Route' proposed in this paper.

Keywords: modern city heritage, transformation, identity, Wuhan

Procedia PDF Downloads 131
12615 Concepts in the Design of Lateral-Load Systems in High Rise Buildings to Reduce Operational Energy Consumption

Authors: Mohamed Ali Milad Krem Salem, Sergio F. Breña, Sanjay R. Arwade, Simi T. Hoque

Abstract:

The location of the main lateral-load resisting system in high-rise buildings may have positive impacts on sustainability through a reduction in operational energy consumption, and this paper describes an assessment of the accompanying effects on structural performance. It is found that design for environmental performance strongly influences the structural performance of the building, and that systems selected primarily with an eye towards energy use reduction may require substantial additional structural stiffening to meet safety and serviceability limits under lateral load cases. We present a framework for incorporating the environmental costs of meeting structural design requirements through the embodied energy of the core structural materials, and also address the issue of economic cost brought on by incorporating environmental concerns into the selection of the structural system. We address these issues through four case-study high-rise buildings with differing structural morphologies (floor plan and core arrangement) and assess each of these building models for cost and embodied energy when the base structural system, which was suggested by architect Kenneth Yeang based on environmental concerns, is augmented to meet lateral drift requirements under the wind loads prescribed by ASCE 7-10.
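
The embodied-energy accounting in such a framework reduces to summing material masses weighted by embodied-energy coefficients; a minimal sketch with hypothetical coefficients and masses (real values depend on the inventory database and the case-study buildings, and are assumptions here) is:

```python
# Hypothetical embodied-energy coefficients (MJ/kg); real values depend on
# the inventory database used and are assumptions, not study data.
EMBODIED_ENERGY = {"concrete": 0.95, "steel": 20.1}

def core_embodied_energy(material_masses):
    """Total embodied energy (MJ) of the core structural materials."""
    return sum(EMBODIED_ENERGY[m] * kg for m, kg in material_masses.items())

# Sketch: a base system versus the same system stiffened for lateral drift
base = core_embodied_energy({"concrete": 1.0e6, "steel": 2.0e5})
stiffened = core_embodied_energy({"concrete": 1.1e6, "steel": 2.6e5})
print(stiffened - base)  # embodied-energy premium paid for the stiffening
```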

Keywords: sustainable, embodied, outrigger, skyscraper, morphology, efficiency

Procedia PDF Downloads 475
12614 Comparative Study of Seismic Isolation as Retrofit Method for Historical Constructions

Authors: Carlos H. Cuadra

Abstract:

Seismic isolation can be used as a retrofit method for historical buildings, with the advantage that minimal intervention on the superstructure is required. However, the selection of isolation devices depends on the weight and stiffness of the upper structure. In this study, two buildings are analyzed to evaluate the applicability of this retrofitting methodology. Both buildings are located in Akita prefecture, in the northern part of Japan. One building is a wooden structure that corresponds to the old council meeting hall of Noshiro city. The second building is a brick masonry structure that was used as the house of a foreign mining engineer and is located in Ani town. Ambient vibration measurements were performed on both buildings to estimate their dynamic characteristics. Then, a target period of vibration of 3 seconds is selected for the isolated systems to estimate the required stiffness of the isolation devices. For the wooden structure, which is a light construction, it was found that natural rubber isolators in combination with friction bearings are suitable for seismic isolation. In the case of the masonry building, elastomeric isolators can be used for its seismic isolation. Lumped mass systems are used for the seismic response analysis, and it is verified in both cases that seismic isolation can be used as a retrofitting method for historical constructions. However, in the case of the light building, most of the weight corresponds to the reinforced concrete slab that is required to install the isolation devices.
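
Treating the isolated building as a single-degree-of-freedom lumped mass, the stiffness needed to reach the 3-second target period follows from T = 2*pi*sqrt(m/k); a sketch with a hypothetical superstructure mass (not a value from the study) is:

```python
import math

def required_stiffness(mass_kg, target_period_s=3.0):
    """Isolation-layer stiffness (N/m) for a rigid superstructure of the
    given mass to vibrate at the target period: k = 4*pi**2 * m / T**2."""
    return 4.0 * math.pi ** 2 * mass_kg / target_period_s ** 2

# Hypothetical superstructure mass of 200 t (not a measured value)
print(round(required_stiffness(200_000.0)))  # about 8.8e5 N/m
```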

Keywords: historical building, finite element method, masonry structure, seismic isolation, wooden structure

Procedia PDF Downloads 156
12613 Treatment of Low-Grade Iron Ore Using Two Stage Wet High-Intensity Magnetic Separation Technique

Authors: Moses C. Siame, Kazutoshi Haga, Atsushi Shibayama

Abstract:

This study investigates the removal of silica, alumina and phosphorus as impurities from Sanje iron ore using wet high-intensity magnetic separation (WHIMS). Sanje iron ore is a low-grade hematite ore found in the Nampundwe area of Zambia, from which iron is to be used as the feed in the steelmaking process. Chemical composition analysis using an X-ray fluorescence spectrometer showed that Sanje low-grade ore contains 48.90 mass% of hematite (Fe2O3), corresponding to an iron grade of 34.18 mass%. The ore also contains silica (SiO2) and alumina (Al2O3) of 31.10 mass% and 7.65 mass%, respectively. Mineralogical analysis using an X-ray diffraction spectrometer showed hematite and silica as the major mineral components of the ore, while magnetite and alumina exist as minor mineral components. Mineral particle distribution analysis was done using a scanning electron microscope with X-ray energy dispersion spectrometry (SEM-EDS), and images showed that the average size of the alumina-silicate gangue particles is on the order of 100 μm and that they exist as iron-bearing interlocked particles. Magnetic separation was done using a series L model 4 magnetic separator. The effects of various magnetic separation parameters, such as magnetic flux density, particle size, and pulp density of the feed, were studied during the magnetic separation experiments. The ore with an average particle size of 25 µm and a pulp density of 2.5% was concentrated using a pulp flow of 7 L/min. The results showed that 10 T was the optimal magnetic flux density, which enhanced the recovery of 93.08% of the iron at a grade of 53.22 mass%. Gangue mineral particles containing 12 mass% silica and 3.94 mass% alumina remained in the concentrate; therefore, the concentrate was further treated in a second WHIMS stage using the same parameters as the first stage. The second stage recovered 83.41% of the iron at a grade of 67.07 mass%. Silica was reduced to 2.14 mass% and alumina to 1.30 mass%. Phosphorus was also reduced, to 0.02 mass%. A two-stage magnetic separation process was therefore established based on these results.
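
The recovery figures quoted above follow the standard two-product formula R = 100*(c*C)/(f*F); a sketch using the first-stage grades from the abstract and a hypothetical mass split is:

```python
def iron_recovery(feed_mass, feed_grade, conc_mass, conc_grade):
    """Percentage of the iron in the feed that reports to the concentrate."""
    return 100.0 * (conc_mass * conc_grade) / (feed_mass * feed_grade)

# Grades follow the first-stage figures in the abstract (feed 34.18 mass% Fe,
# concentrate 53.22 mass% Fe); the 59.78 kg of concentrate per 100 kg of feed
# is a hypothetical split chosen for illustration.
print(round(iron_recovery(100.0, 0.3418, 59.78, 0.5322), 2))  # ~93.08
```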

Keywords: Sanje iron ore, magnetic separation, silica, alumina, recovery

Procedia PDF Downloads 258
12612 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Current real-estate value estimation is difficult for laymen and is usually performed by specialists. This paper presents an automated estimation process, based on big data and machine-learning technology, that calculates the influence of building conditions on real-estate price measurement. The present study analyzed actual building sales sample data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Building on that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for the derivation of machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyses and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
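
The weighted-factor price prediction described above can be illustrated, in a much-reduced form, by an ordinary least-squares fit of price against a single building condition; the data points below are invented, and the study itself used RapidMiner Studio with many factors:

```python
def fit_simple_regression(xs, ys):
    """Ordinary least squares for one predictor: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical training samples: (floor area in m^2, sale price index)
areas = [60.0, 85.0, 100.0, 120.0]
prices = [3.0, 4.1, 4.9, 5.8]
slope, intercept = fit_simple_regression(areas, prices)
print(round(slope * 110.0 + intercept, 2))  # predicted index for 110 m^2
```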

Keywords: apartment complex, big data, life-cycle building value analysis, machine learning

Procedia PDF Downloads 374
12611 Rational Allocation of Resources in Water Infrastructure Development Projects

Authors: M. Macchiaroli, V. Pellecchia, L. Dolores

Abstract:

Within any European or world model of management of the integrated water service (which in Italy has been regulated only since 2012 by a national authority, ARERA), a significant part is covered by the development of assets in terms of hydraulic networks and wastewater collection networks, including all their related building works. The process of selecting the investments to be made starts from a preventive analysis of the critical issues (water losses, unserved areas, low service standards, etc.) that occur in the territory managed by the Operator. Through the Program of Interventions (Provision by ARERA n. 580/2019/R/idr), the Operator programs the projects that can meet the emerging needs and determine the improvement of water service levels. This phase (analyzed and solved by the author in a work published in 2019) involves the use of evaluation techniques (cost-benefit analysis, multi-criteria and multi-objective techniques, neural networks, etc.) useful in selecting the most appropriate design answers to the different criticalities. However, at this point, the problem of establishing the time priorities among the various works deemed necessary remains open. That is, it is necessary to hierarchize the investments. In this decision-making moment, the interests of the private Operator, which favors investments capable of generating high profitability, are often opposed to those of the public controller (ARERA), which favors investments with greater social impact. To support the concertation between these two actors, the protocol set out in this research has been developed, based on the AHP and capable of borrowing from the programmatic documents an orientation path for settling the conflict. The protocol is applied to a case study in the Campania Region in Italy and has been professionally applied in the shared decision process between the manager and the local Authority.
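
The AHP step at the heart of such a protocol derives a priority vector from a pairwise-comparison matrix; a minimal sketch using the row geometric-mean approximation of the principal eigenvector (the comparison values below are hypothetical, not taken from the Campania case study) is:

```python
import math

def ahp_priorities(pairwise):
    """Priority vector of an AHP pairwise-comparison matrix via the
    row geometric-mean method (a common eigenvector approximation)."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical comparison of three candidate works on one criterion
# (e.g., social impact): work A is 3x work B and 5x work C; B is 2x C.
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
w = ahp_priorities(A)
print([round(x, 3) for x in w])  # weights sum to 1, A ranked first
```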

Keywords: analytic hierarchy process, decision making, economic evaluation of projects, integrated water service

Procedia PDF Downloads 124
12610 On the Use of Reliability Factors to Reduce Conflict between Information Sources in Dempster-Shafer Theory

Authors: A. Alem, Y. Dahmani, A. Hadjali, A. Boualem

Abstract:

Managing the problem of conflict, whether through the Dempster-Shafer theory or through the application of a fusion process, has pushed researchers in recent years to find ways to make better decisions, especially for information systems, vision, robotics, and wireless sensor networks. In this paper, we are interested in taking the conflict into account at the combination step and managing it in such a way that it does not influence the decision step, whatever the conflict between the sources. According to [1], conflict leads to erroneous decisions in cases where there are strong degrees of conflict between information sources; if the conflict is greater than the maximum of the belief mass functions, K > max1...n (mi (A)), then the decision becomes impossible. We demonstrate in this paper that multiplying the mass functions by reliability coefficients is a decreasing operation; it leads to the reduction of conflict and to a good decision. We define the reliability coefficients accurately and multiply them by the mass functions of each information source to resolve the conflict and allow a decision whatever the degree of conflict. The evaluation of this technique is done through a use case: a comparison of the combination of sources with maximum conflict, with and without reliability coefficients.
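
The mechanism the paper relies on, multiplying each source's mass function by a reliability coefficient (Shafer discounting) before Dempster combination, can be sketched as follows; the frame, masses, and coefficient below are illustrative, not taken from the paper's use case:

```python
def discount(m, alpha, frame):
    """Shafer discounting: scale each focal mass by reliability alpha and
    move the remaining mass to total ignorance (the whole frame)."""
    out = {A: alpha * v for A, v in m.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

def combine(m1, m2):
    """Dempster's rule over focal sets encoded as frozensets;
    returns the combined mass function and the conflict mass K."""
    raw, K = {}, 0.0
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            C = A & B
            if C:
                raw[C] = raw.get(C, 0.0) + v1 * v2
            else:
                K += v1 * v2  # mass assigned to the empty set (conflict)
    return {C: v / (1.0 - K) for C, v in raw.items()}, K

Theta = frozenset({"a", "b"})
a, b = frozenset({"a"}), frozenset({"b"})
m1 = {a: 0.9, b: 0.1}
m2 = {a: 0.1, b: 0.9}  # strongly conflicting sources
_, K_plain = combine(m1, m2)
_, K_disc = combine(discount(m1, 0.6, Theta), discount(m2, 0.6, Theta))
print(round(K_plain, 3), round(K_disc, 4))  # discounting lowers the conflict
```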

Keywords: Dempster-Shafer theory, fusion process, conflict managing, reliability factors, decision

Procedia PDF Downloads 426
12609 Design and Development of an Algorithm for Prioritizing the Test Cases Using Neural Network as Classifier

Authors: Amit Verma, Simranjeet Kaur, Sandeep Kaur

Abstract:

Test Case Prioritization (TCP) has gained widespread acceptance, as it often results in good-quality software free from defects. Due to the increasing rate of faults in software, traditional prioritization techniques result in increased cost and time. The main challenge in TCP is the difficulty of manually validating the priorities of different test cases due to the large size of test suites, and little emphasis has been placed on automating the TCP process. The objective of this paper is to detect the priorities of different test cases using an artificial neural network, which helps to predict the correct priorities with the help of the back-propagation algorithm. In our proposed work, one such method is implemented in which priorities are assigned to different test cases based on their frequency. After the priorities are assigned, the ANN predicts whether the correct priority has been assigned to every test case and generates an interrupt when a wrong priority is assigned. Classifiers are used to classify the test cases of different priorities. The proposed algorithm is very effective, as it reduces complexity with robust efficiency and automates the process of prioritizing test cases.
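
The frequency-based priority assignment described above (the step before the ANN verification) can be sketched as follows; the execution log is a hypothetical example, and the ANN check is not shown:

```python
from collections import Counter

def assign_priorities(executions):
    """Rank test cases by frequency: the most frequently occurring
    test case gets priority 1, the next gets 2, and so on."""
    freq = Counter(executions)
    ranked = sorted(freq, key=freq.get, reverse=True)
    return {tc: rank for rank, tc in enumerate(ranked, start=1)}

# Hypothetical execution/fault log: TC2 surfaced faults most often
log = ["TC2", "TC1", "TC2", "TC3", "TC2", "TC1"]
prio = assign_priorities(log)
print(prio["TC2"], prio["TC1"], prio["TC3"])  # 1 2 3
```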

Keywords: test case prioritization, classification, artificial neural networks, TF-IDF

Procedia PDF Downloads 397
12608 A Bibliometric Analysis on Filter Bubble

Authors: Misbah Fatma, Anam Saiyeda

Abstract:

This analysis charts the introduction and expansion of research into the filter bubble phenomenon over the last 10 years, using a large dataset of academic publications. This bibliometric study demonstrates how interdisciplinary filter bubble research is. The identification of the key authors and organizations leading filter bubble research sheds light on collaborative networks and knowledge transfer. Through a systematic examination of the literature, relevant papers are organized by themes including algorithmic bias, polarisation, social media, and ethical implications, and the study traces how these themes have changed over time. The study also examines how the research is distributed globally, revealing geographic patterns and discrepancies in scholarly output. The results of this bibliometric analysis allow us to comprehend fully the development and reach of filter bubble research. By exposing dominant themes, interdisciplinary collaborations, and geographic patterns, this study offers insights into the ongoing discussion surrounding information personalization and its implications for societal discourse, democratic participation, and the potential risks to an informed citizenry. This analysis is essential for scholars and researchers seeking to solve the problems caused by filter bubbles and to advance a more diverse and inclusive information environment.

Keywords: bibliometric analysis, social media, social networking, algorithmic personalization, self-selection, content moderation policies, limited access to information, recommender systems, polarization

Procedia PDF Downloads 118
12607 Understanding Team Member Autonomy and Team Collaboration: A Qualitative Study

Authors: Ayşen Bakioğlu, Gökçen Seyra Çakır

Abstract:

This study aims to explore how research assistants who work in project teams experience team member autonomy and how they reconcile team member autonomy with team collaboration. The study utilizes snowball sampling; 20 research assistants who work in the faculties of education at Marmara University and Yıldız Technical University were interviewed. The data were analyzed through content analysis, with MAXQDA Plus 11, a qualitative data analysis software package, used as the analysis tool. According to the findings of this study, the emerging themes include team norm formation, team coordination management, the role of individual tasks in team collaboration, and leadership distribution. The interviewees experience the team norm formation process in terms of processes that pertain to task fulfillment and processes that pertain to the regulation of team dynamics. The team norm formation process instills a sense of responsibility among individual team members. Apart from that, the interviewees' responses indicate that the realization of the obligation to work in a team contributes to the team norm formation process. The participants indicate that individual expectations are taken into consideration during the coordination of the team. The supervisor of the project team also has a crucial role in maintaining team collaboration. Coordination problems arise when an individual team member does not relate his or her academic field to the research topic of the project team. The findings indicate that leadership distribution in the project teams involves two leadership processes: leadership distribution based on processes that focus on individual team members, and leadership distribution based on processes that focus on team interaction. Apart from that, individual tasks serve as a facilitator of collaboration among team members. Interviewees also indicate that individual tasks facilitate the expression of individuality.

Keywords: project teams in higher education, research assistant teams, team collaboration, team member autonomy

Procedia PDF Downloads 362
12606 The Asymptotic Hole Shape in Long Pulse Laser Drilling: The Influence of Multiple Reflections

Authors: Torsten Hermanns, You Wang, Stefan Janssen, Markus Niessen, Christoph Schoeler, Ulrich Thombansen, Wolfgang Schulz

Abstract:

In long pulse laser drilling of metals, it can be demonstrated that the ablation shape approaches a so-called asymptotic shape, such that it changes only slightly or not at all with further irradiation. These findings are already known from ultrashort pulse (USP) ablation of dielectric and semiconducting materials. The explanation for the occurrence of an asymptotic shape in long pulse drilling of metals is identified, and a model for the description of the asymptotic hole shape is numerically implemented, tested, and clearly confirmed by comparison with experimental data. The model assumes a robust process, in the sense that the characteristics of the melt flow inside the arising melt film do not change qualitatively when the laser or processing parameters are changed. Only robust processes are technically controllable and thus of industrial interest. The condition for a robust process is identified by a threshold for the mass flow density of the assist gas at the hole entrance, which has to be exceeded. Within a robust process regime, the melt flow characteristics can be captured by only one model parameter, namely the intensity threshold. In analogy to USP ablation (where it has long been known that the resulting hole shape results from a threshold for the absorbed laser fluence), it is demonstrated that in the case of robust long pulse ablation the asymptotic shape forms in such a way that along the whole contour the absorbed heat flux density is equal to the intensity threshold. The intensity threshold depends on the particular material and radiation properties and has to be calibrated by one reference experiment. The model is implemented in a numerical simulation called AsymptoticDrill, which requires so few resources that it can run on common desktop PCs, laptops, or even smart devices. Resulting hole shapes can be calculated within seconds, which is a clear advantage over other simulations presented in the literature in the context of everyday industrial usage. Against this background, the software is additionally equipped with a user-friendly GUI which allows intuitive usage. Individual parameters can be adjusted using sliders, while the simulation result appears immediately in an adjacent window. Platform-independent development allows flexible usage: an operator can use the tool to adjust the process in a very convenient manner on a tablet, while a developer can execute the tool in the office to design new processes. Furthermore, to the best knowledge of the authors, AsymptoticDrill is the first simulation which allows the import of measured real beam distributions and thus calculates the asymptotic hole shape on the basis of the real state of the specific manufacturing system. In this paper, the emphasis is placed on the investigation of the effect of multiple reflections on the asymptotic hole shape, which gains in importance when drilling holes with large aspect ratios.
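
The asymptotic-shape condition, absorbed heat flux density equal to the intensity threshold along the whole contour, can be sketched numerically for a single-reflection Gaussian-beam case; multiple reflections, the actual focus of the paper, are deliberately ignored here, and the units are illustrative:

```python
import math

def asymptotic_profile(i0, i_th, w, n=2000):
    """Sketch of an asymptotic hole profile for a Gaussian beam
    I(r) = i0 * exp(-2 r^2 / w^2): on the asymptotic contour the absorbed
    flux I(r) * cos(phi) equals the threshold i_th, so the local wall angle
    is phi(r) = arccos(i_th / I(r)). Depth is integrated inward from the
    hole edge r_max, where I(r_max) = i_th. Single reflection only."""
    r_max = w * math.sqrt(0.5 * math.log(i0 / i_th))
    dr = r_max / n
    depth = 0.0
    for k in range(n, 0, -1):  # from the hole edge toward the beam axis
        r = (k - 0.5) * dr
        intensity = i0 * math.exp(-2.0 * r * r / (w * w))
        phi = math.acos(min(1.0, i_th / intensity))
        depth += math.tan(phi) * dr
    return r_max, depth

r_max, centre_depth = asymptotic_profile(i0=4.0, i_th=1.0, w=1.0)
print(round(r_max, 3))  # hole radius where I(r) drops to the threshold
```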

Keywords: asymptotic hole shape, intensity threshold, long pulse laser drilling, robust process

Procedia PDF Downloads 213
12605 Early Recognition and Grading of Cataract Using a Combined Log Gabor/Discrete Wavelet Transform with ANN and SVM

Authors: Hadeer R. M. Tawfik, Rania A. K. Birry, Amani A. Saad

Abstract:

Eyes are considered to be the most sensitive and important organs of the human body; thus, any eye disorder affects the patient in all aspects of life. Cataract is one of those eye disorders that lead to blindness if not treated correctly and quickly. This paper demonstrates a model for the automatic detection, classification, and grading of cataracts based on image processing techniques and artificial intelligence. The proposed system is developed to ease the cataract diagnosis process for both ophthalmologists and patients. The wavelet transform combined with the 2D Log Gabor wavelet transform was used as the feature extraction technique for a dataset of 120 eye images, followed by a classification process that classified the image set into three classes: normal, early stage, and advanced stage. A comparison between the two classifiers used, the support vector machine (SVM) and the artificial neural network (ANN), was carried out on the same dataset of 120 eye images. It was concluded that SVM gave better results than ANN: SVM achieved 96.8% accuracy, whereas ANN achieved 92.3% accuracy.
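
As a stand-in for the 2D transforms used in the paper, the wavelet feature-extraction idea can be illustrated with one level of a 1D Haar transform, whose detail coefficients respond to sharp intensity edges; the sample row of pixel values is invented:

```python
import math

def haar_dwt(signal):
    """One level of the 1-D Haar wavelet transform: returns (approximation,
    detail) coefficients. A simplified stand-in for the 2-D Log Gabor and
    wavelet transforms in the paper; assumes an even-length input."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2.0)
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2.0)
              for i in range(0, len(signal), 2)]
    return approx, detail

# Toy intensity row from an eye image (hypothetical pixel values)
row = [10.0, 12.0, 9.0, 9.0, 30.0, 2.0, 5.0, 5.0]
approx, detail = haar_dwt(row)
print(len(approx), round(detail[2], 3))  # large detail marks a sharp edge
```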

Keywords: cataract, classification, detection, feature extraction, grading, log-gabor, neural networks, support vector machines, wavelet

Procedia PDF Downloads 332
12604 Mango (Mangifera indica L.) Lyophilization Using Vacuum-Induced Freezing

Authors: Natalia A. Salazar, Erika K. Méndez, Catalina Álvarez, Carlos E. Orrego

Abstract:

Lyophilization, also called freeze-drying, is an important dehydration technique mainly used for pharmaceuticals. The food industry also uses lyophilization when it is important to retain most of the nutritional quality, taste, shape, and size of dried products and to extend their shelf life. Vacuum induction during the freezing cycle (VI) has been used to control ice nucleation and, consequently, to reduce the primary drying time of pharmaceuticals while preserving the quality properties of the final product. This procedure has not previously been applied in the freeze-drying of foods. The present work investigates the effect of VI on the lyophilization drying time, final moisture content, density, and reconstitutional properties of mango (Mangifera indica L.) slices (MS) and mango pulp-maltodextrin dispersions (MPM) (30% total solids). Control samples were run at each freezing rate without induced vacuum. The lyophilization endpoint was the same for all treatments (a constant difference between the capacitance and Pirani vacuum gauges). From the experimental results it can be concluded that the higher freezing rate (0.4°C/min) with VI reduced the overall process time by up to 30% compared with the process time required for the control and for VI at the lower freezing rate (0.1°C/min), without affecting the quality characteristics of the dried product, which yields a reduction in costs and energy consumption for MS and MPM freeze-drying. Controls and samples treated with VI at a freezing rate of 0.4°C/min showed similar moisture and density values for MS. Furthermore, the results for the MPM dispersion were favorable when VI was applied, because a dried product with low moisture content and low density was obtained in a shorter process time compared with the control. No significant differences were found between the reconstitutional properties (rehydration for MS and solubility for MPM) of the freeze-dried mango resulting from the controls and the VI treatments.

Keywords: drying time, lyophilization, mango, vacuum induced freezing

Procedia PDF Downloads 410
12603 Study and Improvement of the Quality of a Production Line

Authors: S. Bouchami, M.N. Lakhoua

Abstract:

The automotive market is a dynamic market that continues to grow. That is why several companies belonging to this sector adopt a quality improvement approach. Wanting to be competitive and successful in the environment in which they operate, these companies are dedicated to establishing a quality management system to ensure the achievement of their quality objectives, the improvement of products and processes, and the satisfaction of their customers. In this paper, the management of quality and the improvement of a production line in an industrial company are presented. The project is divided into two essential parts: the creation of the technical line documentation and the quality assurance documentation, and the resolution of defects at the line as well as those claimed by the customer. The creation of the documents required a deep understanding of the manufacturing process. The analysis and problem solving were done through the implementation of PDCA (Plan Do Check Act) and FTA (Fault Tree Analysis). As a perspective, in order to better optimize production and improve the efficiency of the production line, a study of the problems associated with the supply of raw materials should be made to solve the stock-outs, which cause delays that penalize the industrial company.
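
The FTA part of such an analysis ultimately combines basic-event probabilities through AND/OR gates; a minimal sketch under an independence assumption, with hypothetical defect probabilities (not figures from the company), is:

```python
import math

def or_gate(probs):
    """Top-event probability when any basic event triggers it
    (assumes independent basic events)."""
    return 1.0 - math.prod(1.0 - p for p in probs)

def and_gate(probs):
    """Top-event probability when all basic events must occur
    (assumes independent basic events)."""
    return math.prod(probs)

# Hypothetical line-defect causes: two direct causes, plus one cause that
# requires both a material flaw (0.1) and a machine drift (0.3)
p_top = or_gate([0.02, 0.05, and_gate([0.1, 0.3])])
print(round(p_top, 4))  # probability of the top defect event
```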

Keywords: quality management, documentary system, Plan Do Check Act (PDCA), fault tree analysis (FTA) method

Procedia PDF Downloads 142
12602 “Octopub”: Geographical Sentiment Analysis Using Named Entity Recognition from Social Networks for Geo-Targeted Billboard Advertising

Authors: Oussama Hafferssas, Hiba Benyahia, Amina Madani, Nassima Zeriri

Abstract:

Although data nowadays takes multiple forms, from text to images and from audio to video, text is still the most widely used at the public level. At the academic and research level, and unlike other forms, text can be considered the easiest form to process. Therefore, a branch of data mining research has always lived under its shadow, called "Text Mining". Its concept is just like data mining's: finding valuable patterns in large collections and tremendous volumes of data, in this case text. Named entity recognition (NER) is one of text mining's disciplines; it aims to extract and classify references such as proper names, locations, expressions of time and dates, organizations, and more in a given text. Our approach, "Octopub", does not aim to find new ways to improve the named entity recognition process itself; rather, it is about finding a new, and yet smart, way to use NER so that we can extract the sentiments of millions of people, using social networks as a limitless information source, with marketing for product promotion as the main domain of application.
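As a minimal sketch of the idea, location mentions can be extracted and post sentiment aggregated per place. The gazetteer, polarity lexicon, and sample posts below are hypothetical illustrations; a real system would use a trained NER model and a full sentiment dictionary over social-network data.

```python
import re
from collections import defaultdict

# Hypothetical mini-gazetteer and polarity lexicon (illustrative only).
LOCATIONS = {"Algiers", "Oran", "Paris"}
POLARITY = {"love": 1, "great": 1, "awful": -1, "hate": -1}

def extract_locations(text):
    """Naive NER: tag capitalized tokens found in the gazetteer as LOC entities."""
    return [tok for tok in re.findall(r"[A-Z][a-z]+", text) if tok in LOCATIONS]

def sentiment(text):
    """Sum word polarities as a crude sentiment score."""
    return sum(POLARITY.get(w, 0) for w in re.findall(r"[a-z]+", text.lower()))

def geo_sentiment(posts):
    """Aggregate post sentiment per recognized location (the 'Octopub' idea)."""
    scores = defaultdict(list)
    for post in posts:
        s = sentiment(post)
        for loc in extract_locations(post):
            scores[loc].append(s)
    return {loc: sum(v) / len(v) for loc, v in scores.items()}

posts = ["I love the new phone, great launch in Algiers",
         "Awful service in Oran, hate it"]
print(geo_sentiment(posts))  # {'Algiers': 2.0, 'Oran': -2.0}
```

The per-location averages would then drive the choice of billboard locations for geo-targeted advertising.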

Keywords: text mining, named entity recognition (NER), sentiment analysis, social media networks (SN, SMN), business intelligence (BI), marketing

Procedia PDF Downloads 589
12601 Finite Element Analysis and Design Optimization of Stent and Balloon System

Authors: V. Hashim, P. N. Dileep

Abstract:

Stent implantation is seen as the most successful method to treat coronary artery diseases. Different types of stents are available in the market these days, and the success of a stent implantation greatly depends on the proper selection of a suitable stent for a patient. Computer numerical simulation is a cost-effective way to choose a compatible stent. Studies confirm that the design characteristics of a stent have great importance with regard to the pressure it can sustain, the maximum displacement it can produce, the developed stress concentration, and so on. In this paper, different stent designs were analyzed together with the balloon in order to optimize the stent and balloon system. The commercially available Palmaz-Schatz stent was selected for analysis. Abaqus software was used to simulate the system. This work is a finite element analysis of the artery stent implant to find out the design factors affecting stress and strain. The work consists of two phases. In the first phase, the stress distributions of three models were compared: stent without balloon, stent with a balloon of equal length, and stent with a balloon longer than the stent. In the second phase, three different design models of the Palmaz-Schatz stent were compared while keeping the balloon length constant. The results obtained from the analysis show that the design of the strut has a strong effect on the stress distribution. A design with chamfered slots gave better results. The length of the balloon also influences the stress concentration in the stent. Increasing the length of the balloon reduces stress but increases the dog-boning effect.

Keywords: coronary stent, finite element analysis, restenosis, stress concentration

Procedia PDF Downloads 623
12600 The Study of Thai Consumer Behavior toward Buying Goods on the Internet

Authors: Pichamon Chansuchai

Abstract:

The study of Thai consumer behavior toward buying goods on the Internet is a survey research project. A five-level rating scale and an open-ended questionnaire were applied, with a random sample of more than 400 Thai people aged between 15 and 40 years. The summary findings are as follows. The respondent profile was 55.3% female and 44.8% male; 35.3% were aged between 20 and 30 years old; 29.5% were employed; 50.2% had an average income of up to 11,000 baht/month; and 29.3% spent more than 11,000 baht per month. Regarding internet usage behavior, the main objective of internet usage was communication (93.3%), and the most used category of websites was trading (42.8%). The marketing mix affected trading behavior via the internet, which can be analyzed in terms of the following marketing factors: for product, product quality was the most influential factor, with an average value of 4.75; a price cheaper than the overall market was the most effective factor for internet shopping, with a mean value of 4.53; for place, availability that reduces shopping time scored an average value of 4.67; and the "buy 1 get 1" promotion was the most stimulating factor for internet shopping, with a mean value of 4.60. For hypothesis testing, sex was related to the buying decision: males and females differed in their purchasing decisions via the internet at the 0.05 level of significance. Furthermore, the respondents' occupations were related to the type of website used: different occupations were associated with different website selections at the 0.05 level of significance.
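The sex-versus-buying-decision hypothesis test described above is a standard chi-square test of independence on a 2x2 contingency table. The sketch below implements it from scratch; the counts are hypothetical illustrations, since the abstract reports only the 0.05 significance level, not the table itself.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic of independence for a 2x2 contingency table."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    n = sum(row)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n          # expected count under independence
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical counts: rows = male/female, columns = bought online / did not.
table = [[100, 79],
         [150, 71]]
chi2 = chi_square_2x2(table)
print(round(chi2, 2), chi2 > 3.841)  # 3.841 = chi-square critical value, df=1, alpha=0.05
```

A statistic above the critical value, as here, would lead to rejecting independence at the 0.05 level, matching the paper's conclusion that sex relates to the buying decision.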

Keywords: behavior, internet, consumer, goods

Procedia PDF Downloads 249
12599 Evaluation of Nitrogen Fixation Capabilities of Selected Pea Lines Grown under Different Environmental Conditions in Canadian Prairie

Authors: Chao Yang, Rosalind Bueckert, Jeff Schoenau, Axel Diederichsen, Hossein Zakeri, Tom Warkentin

Abstract:

Pea is a very popular pulse crop widely grown in the Western Canadian prairie. However, the N fixation capabilities of pea lines have not been well evaluated under local environmental conditions. In this study, two supernodulating mutants, Frisson P64 Sym29 and Frisson P88 Sym28, along with their wild parent Frisson; one hypernodulating mutant, Rondo-nod3 (fix+), along with its wild parent Rondo; one non-nodulating mutant, Frisson P56 (nod-); and two commercial pea cultivars, CDC Meadow and CDC Dakota, which are widely planted in Western Canada, were selected to evaluate their BNF, biomass, and yield production in symbiosis with R. leguminosarum bv. viciae. Our results showed that both environmental conditions and variation among pea lines could significantly affect days to flowering (DTF), days to podding (DTP), biomass, and yield of the tested pea lines (P < 0.0001), suggesting that environmental factors should be considered when selecting pea cultivars for local farming under the different soil zones of Western Canada. Significant interaction effects between environmental conditions and pea lines were also found for N fixation (P = 0.001), indicating that the N fixation capability of the same pea cultivar changes when grown under different environmental conditions. Our results provide useful information for farming and a better opportunity for selecting pea cultivars with higher N-fixing capacity in breeding programs in Western Canada.

Keywords: Canadian prairie, environmental condition, N fixation, pea cultivar

Procedia PDF Downloads 344
12598 Biochar as a Strong Adsorbent for Multiple-Metal Removal from Contaminated Water

Authors: Eman H. El-Gamal, Mai E. Khedr, Randa Ghonim, Mohamed Rashad

Abstract:

In the past few years, biochar, a highly carbon-rich material produced from agro-wastes by pyrolysis, has been used as an effective adsorbent for heavy metal removal from polluted water. In this study, different types of biochar (rice straw 'RSB', corn cob 'CCB', and Jatropha shell 'JSB') were used to evaluate the adsorption capacity for heavy metal removal from multiple-metal solutions (Cu, Mn, Zn, and Cd). Kinetic modeling was examined to illustrate potential adsorption mechanisms. The results showed that the removal potential depends on both the metal and the biochar type. The adsorption capacity of the biochars followed the order RSB > JSB > CCB. In general, RSB and JSB biochars showed high removal of heavy metals from polluted water, higher than 90% and 80%, respectively, after 2 h of contact time for all metals. According to the kinetic data, the pseudo-second-order model agreed strongly with Cu, Mn, Zn, and Cd adsorption onto the biochars (R² ≥ 0.97), indicating the dominance of a specific adsorption process, i.e., chemisorption. In conclusion, this study revealed that RSB and JSB biochars have the potential to be strong adsorbents for multiple-metal removal from wastewater.
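The pseudo-second-order fit mentioned above is commonly done through the linearized form t/qt = 1/(k2·qe²) + t/qe, where qe is the equilibrium uptake and k2 the rate constant. The sketch below fits that line by least squares; the qe and k2 values are assumed illustrations, not the study's measured parameters.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

# Synthetic kinetics generated from the model itself with assumed values:
# qe = 25 mg/g, k2 = 0.01 g/(mg.min).
qe_true, k2_true = 25.0, 0.01
t = [5, 10, 20, 30, 60, 90, 120]                                   # min
qt = [k2_true * qe_true**2 * ti / (1 + k2_true * qe_true * ti) for ti in t]

# Linearized pseudo-second-order form: t/qt = 1/(k2*qe^2) + t/qe
slope, intercept = linear_fit(t, [ti / qi for ti, qi in zip(t, qt)])
qe_fit = 1 / slope
k2_fit = 1 / (intercept * qe_fit**2)
print(round(qe_fit, 2), round(k2_fit, 4))  # recovers 25.0 and 0.01
```

With real data, the R² of this regression is the value the abstract reports (≥ 0.97) as evidence of chemisorption-dominated uptake.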

Keywords: adsorption, biochar, chemisorption, polluted water

Procedia PDF Downloads 150
12597 Design, Analysis and Optimization of Space Frame for BAJA SAE Chassis

Authors: Manoj Malviya, Shubham Shinde

Abstract:

The present study focuses on the determination of the torsional stiffness of a space frame chassis and a comparison of the elements used in the finite element analysis of the frame. The study also discusses various concepts and design aspects of a space frame chassis, with emphasis on their applicability to BAJA SAE vehicles. Torsional stiffness is a very important factor that determines chassis strength, vehicle control, and handling. Therefore, it is very important to determine the torsional stiffness of the vehicle before designing an optimum chassis so that it does not fail under extreme conditions. This study determines the torsional stiffness of the frame with respect to suspension shocks, roll stiffness, and anti-roll bar rates. A spring model is developed to study the effects of suspension parameters. The engine contributes greatly to torsional stiffness, and therefore its effects need to be considered. Deflections in the tires have not been considered in the present study. The proper element shape should be selected to analyze the effects of various loadings on the chassis when implementing finite element methods. The study compares the accuracy of results and computational time for different element types; the shape functions of these elements are also discussed. A modelling methodology is presented for the multibody analysis of the chassis integrated with the suspension arms and engine, with proper boundary conditions chosen to replicate real-life conditions.
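Torsional stiffness is typically extracted from an FEA twist test as K = T/θ: equal and opposite vertical forces applied at the front suspension pickups with the rear fixed, and θ inferred from the corner deflections. The numbers below are hypothetical illustrations, not the study's results.

```python
import math

def torsional_stiffness(force, track_width, dz_left, dz_right):
    """Torsional stiffness K = T/theta from a twist test: a force couple at the
    front corners (rear constrained), with corner vertical deflections in metres."""
    torque = force * track_width                                   # N.m
    theta = math.degrees(math.atan((dz_left - dz_right) / track_width))
    return torque / theta                                          # N.m per degree

# Hypothetical case: 500 N couple over a 1.2 m track,
# measured corner deflections of +6 mm and -6 mm.
k = torsional_stiffness(500.0, 1.2, 0.006, -0.006)
print(round(k))  # ~1047 N.m/deg
```

The same formula applies whatever element type is used in the FEA; only the computed deflections (and hence θ) change between element choices.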

Keywords: space frame chassis, torsional stiffness, multi-body analysis of chassis, element selection

Procedia PDF Downloads 354
12596 From Binary Solutions to Real Bio-Oils: A Multi-Step Extraction Story of Phenolic Compounds with Ionic Liquid

Authors: L. Cesari, L. Canabady-Rochelle, F. Mutelet

Abstract:

The thermal conversion of lignin produces bio-oils that contain many compounds with high added value, such as phenolic compounds. In order to extract these compounds efficiently, the possible use of the ionic liquid choline bis(trifluoromethylsulfonyl)imide [Choline][NTf2] was explored. To this end, a multistep approach was implemented. First, binary (phenolic compound and solvent) and ternary (phenolic compound, solvent, and ionic liquid) solutions were investigated. Eight binary systems of phenolic compound and water were studied at atmospheric pressure and quantified using the turbidity method and UV spectroscopy. Ternary systems (phenolic compound, water, and [Choline][NTf2]) were investigated at room temperature and atmospheric pressure. After stirring, the solutions were left to settle, and a sample of each phase was collected. The phases were analyzed by gas chromatography with an internal standard. These results were used to quantify the interaction parameters of thermodynamic models. Then, extractions were performed on synthetic solutions to determine the influence of several operating conditions (temperature, kinetics, amount of [Choline][NTf2]). With this knowledge, it was possible to design and simulate an extraction process composed of one extraction column and one flash. Finally, the extraction efficiency of [Choline][NTf2] was quantified with real bio-oils from lignin pyrolysis. Qualitative and quantitative analyses were performed using gas chromatography coupled with mass spectrometry and flame ionization detection. The experimental measurements show that the extraction of phenolic compounds is efficient at room temperature, is quick, and does not require a large amount of [Choline][NTf2]. Moreover, the simulations of the extraction process demonstrate that the [Choline][NTf2] process requires less energy than one based on an organic solvent. Finally, the efficiency of [Choline][NTf2] was confirmed in real situations by the experiments on lignin pyrolysis bio-oils.
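The observation that little ionic liquid is needed follows from basic liquid-liquid extraction arithmetic: for a solute with distribution coefficient K, a single stage with solvent/feed volume ratio R removes a fraction K·R/(1 + K·R). The distribution coefficient below is an assumed, illustrative value, not one measured in this work.

```python
def fraction_extracted(K, v_ratio, stages=1):
    """Fraction of solute removed after n cross-current extraction stages,
    each with fresh solvent at solvent/feed volume ratio v_ratio.
    K = distribution coefficient (solute in IL phase / aqueous phase)."""
    remaining = 1.0
    for _ in range(stages):
        remaining *= 1.0 / (1.0 + K * v_ratio)    # Kremser-type stage balance
    return 1.0 - remaining

# Hypothetical distribution coefficient for a phenolic compound in [Choline][NTf2].
K = 20.0
print(round(fraction_extracted(K, 0.1, stages=1), 3))  # 0.667
print(round(fraction_extracted(K, 0.1, stages=3), 3))  # 0.963
```

With a strongly partitioning solvent, even a 10% volume ratio removes most of the solute, and a few cross-current stages approach complete recovery.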

Keywords: bio-oils, extraction, lignin, phenolic compounds

Procedia PDF Downloads 110
12595 Bayes Estimation of Parameters of Binomial Type Rayleigh Class Software Reliability Growth Model using Non-informative Priors

Authors: Rajesh Singh, Kailash Kale

Abstract:

In this paper, a binomial-process-type occurrence of software failures is considered, and the failure intensity is characterized by a one-parameter Rayleigh class Software Reliability Growth Model (SRGM). The proposed SRGM is a mathematical function of two parameters: the total number of failures η0 and the scale parameter η1. It is assumed that very little or no information is available about these parameters; considering non-informative priors for both, the Bayes estimators of η0 and η1 are obtained under the squared error loss function. The proposed Bayes estimators are compared with the corresponding maximum likelihood estimators on the basis of risk efficiencies obtained by the Monte Carlo simulation technique. It is concluded that both proposed Bayes estimators, of the total number of failures and of the scale parameter, perform well for a proper choice of execution time.
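The risk-efficiency comparison by Monte Carlo can be illustrated on a deliberately simplified model: estimating a binomial proportion under squared error loss, with a uniform (non-informative) prior. This is a sketch of the comparison technique only, not the paper's Rayleigh class SRGM.

```python
import random

def monte_carlo_risks(p, n, sims=50_000, seed=42):
    """Monte Carlo squared-error risk of the MLE x/n versus the Bayes
    estimator (x+1)/(n+2), i.e. the posterior mean under a uniform prior."""
    rng = random.Random(seed)
    se_mle = se_bayes = 0.0
    for _ in range(sims):
        x = sum(rng.random() < p for _ in range(n))   # binomial draw
        se_mle += (x / n - p) ** 2
        se_bayes += ((x + 1) / (n + 2) - p) ** 2
    return se_mle / sims, se_bayes / sims

risk_mle, risk_bayes = monte_carlo_risks(p=0.3, n=20)
efficiency = risk_mle / risk_bayes   # > 1 means the Bayes rule is preferable
print(efficiency > 1.0)
```

The ratio of simulated risks is the "risk efficiency" used to judge the estimators; in the paper the same comparison is run on η0 and η1 of the Rayleigh class SRGM.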

Keywords: binomial process, non-informative prior, maximum likelihood estimator (MLE), Rayleigh class, software reliability growth model (SRGM)

Procedia PDF Downloads 389
12594 The Use of a Miniature Bioreactor as Research Tool for Biotechnology Process Development

Authors: Muhammad Zainuddin Arriafdi, Hamudah Hakimah Abdullah, Mohd Helmi Sani, Wan Azlina Ahmad, Muhd Nazrul Hisham Zainal Alam

Abstract:

Biotechnology process development demands numerous experimental works. In a laboratory environment, this is typically carried out using a shake flask platform. This paper presents the design and fabrication of a miniature bioreactor system as an alternative research tool for bioprocessing. The working volume of the reactor is 100 mL, and it is made of plastic. The main features of the reactor include stirring control, temperature control via an electrical heater, an aeration strategy based on a miniature air compressor, and online optical cell density (OD) sensing. All sensors and actuators integrated into the reactor were controlled using an Arduino microcontroller platform. In order to demonstrate the functionality of the miniature bioreactor concept, a series of batch Saccharomyces cerevisiae fermentation experiments was performed under various glucose concentrations. The results of the fermentation experiments were used to estimate the constants of the Monod equation, namely the saturation constant Ks and the maximum specific growth rate μmax, to further highlight the usefulness of the device. The mixing capacity of the reactor was also evaluated. Results from the miniature bioreactor prototype were found comparable to those achieved with a shake flask. The unique features of the device compared to a shake flask platform are that the mixing condition is much closer to that of a lab-scale bioreactor setup and that the prototype integrates an online OD sensor, so no sampling is needed to monitor the progress of the reaction. Operating cost and medium consumption are also low, making it much more economical for biotechnology process development than lab-scale bioreactors.
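Estimating the Monod constants from batch growth data is often done via the Lineweaver-Burk linearization 1/μ = (Ks/μmax)(1/S) + 1/μmax. The sketch below recovers μmax and Ks from synthetic data; the parameter values are assumed illustrations, not the paper's measurements.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

# Assumed illustrative values: mu_max = 0.45 1/h, Ks = 0.8 g/L.
mu_max_true, ks_true = 0.45, 0.8
S = [0.5, 1.0, 2.0, 5.0, 10.0, 20.0]                 # glucose, g/L
mu = [mu_max_true * s / (ks_true + s) for s in S]    # specific growth rate, 1/h

# Lineweaver-Burk linearization: 1/mu = (Ks/mu_max)*(1/S) + 1/mu_max
slope, intercept = linear_fit([1 / s for s in S], [1 / m for m in mu])
mu_max_fit = 1 / intercept
ks_fit = slope * mu_max_fit
print(round(mu_max_fit, 3), round(ks_fit, 3))  # recovers 0.45 and 0.8
```

In practice, each μ value comes from the exponential-phase slope of the online OD signal at one glucose concentration, which is exactly the dataset the miniature bioreactor experiments provide.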

Keywords: biotechnology, miniature bioreactor, research tools, Saccharomyces cerevisiae

Procedia PDF Downloads 117
12593 Classification of Random Doppler-Radar Targets during the Surveillance Operations

Authors: G. C. Tikkiwal, Mukesh Upadhyay

Abstract:

During surveillance operations in war or peacetime, the radar operator sees a scatter of targets on the screen. A target may be a tracked vehicle such as a tank (T72, BMP, etc.), a wheeled vehicle (ALS, TATRA, 2.5-tonne, Shaktiman), or moving troops, convoys, etc. The radar operator selects one of the promising targets for single target tracking (STT) mode. Once the target is locked, the operator hears a characteristic audible signal in his headphones. Drawing on experience and training gained over time, the operator then identifies the random target. But this process is cumbersome and depends solely on the skills of the operator, and thus may lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the fast Fourier transform (FFT) and principal component analysis (PCA), to identify random objects. The classification process is based on transforming the audible signature of the target into musical octave-notes. The whole methodology is then automated by developing suitable software. This automation increases the efficiency of identifying a random target by reducing the chances of misclassification. The whole study is based on live data.
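The FFT-plus-PCA pipeline can be sketched on synthetic audio signatures. The Doppler tone frequencies, class labels, and nearest-centroid classifier below are illustrative assumptions, not the paper's exact procedure or live data.

```python
import numpy as np

rng = np.random.default_rng(0)

def signature(doppler_hz, n=256, fs=1000.0):
    """Synthetic audio signature of a locked target: a Doppler tone in noise."""
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * doppler_hz * t) + 0.3 * rng.standard_normal(n)

def features(sig):
    """FFT magnitude spectrum as a feature vector."""
    return np.abs(np.fft.rfft(sig))

# Hypothetical classes: tracked vehicles ~100 Hz tone, wheeled ~250 Hz tone.
X = np.array([features(signature(f)) for f in [100] * 20 + [250] * 20])
y = np.array([0] * 20 + [1] * 20)

# PCA via eigen-decomposition of the feature covariance matrix.
Xc = X - X.mean(axis=0)
vals, vecs = np.linalg.eigh(np.cov(Xc.T))
pcs = vecs[:, -2:]                 # two leading principal components
Z = Xc @ pcs                       # projected features

# Nearest-centroid classification in the reduced PCA space.
centroids = np.array([Z[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((Z[:, None, :] - centroids) ** 2).sum(axis=2), axis=1)
print((pred == y).mean())  # accuracy; clean separation expected on this synthetic data
```

The dimensionality reduction is the key step: the raw spectrum is high-dimensional, but a few principal components capture the class-discriminating Doppler structure, which is what makes automated classification tractable.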

Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP

Procedia PDF Downloads 394
12592 Effect of Phonological Complexity in Children with Specific Language Impairment

Authors: Irfana M., Priyandi Kabasi

Abstract:

Children with specific language impairment (SLI) have difficulty acquiring and using language despite having all the cognitive prerequisites to support language acquisition. These children have normal non-verbal intelligence, hearing, and oral-motor skills, with no history of social/emotional problems or significant neurological impairment. Nevertheless, their language acquisition lags behind that of their peers. Phonological complexity can be considered a major factor causing inaccurate speech production in this population. A range of phonologically complex stimuli should therefore be included in SLI treatment sessions for a better prognosis of speech accuracy; hence the need to study levels of phonological complexity. The present study included 7 individuals diagnosed with SLI and 10 typically developing children. All were Hindi speakers, of both genders, aged 4 to 5 years. There were 4 sets of stimuli: minimal contrast vs. maximal contrast nonwords, minimal coarticulation vs. maximal coarticulation nonwords, minimal contrast vs. maximal contrast words, and minimal coarticulation vs. maximal coarticulation words. Each set contained 10 stimuli, and participants were asked to repeat each stimulus. Results showed that the production of maximal contrast stimuli was significantly the most accurate, followed by minimal coarticulation, minimal contrast, and maximal coarticulation. A similar trend was shown for both word and nonword categories. The phonological complexity effect was evident for each participant group. The present findings can be applied to the management of SLI, specifically to the selection of stimuli.

Keywords: coarticulation, minimal contrast, phonological complexity, specific language impairment

Procedia PDF Downloads 142
12591 Heat and Mass Transfer Modelling of Industrial Sludge Drying at Different Pressures and Temperatures

Authors: L. Al Ahmad, C. Latrille, D. Hainos, D. Blanc, M. Clausse

Abstract:

A two-dimensional finite volume axisymmetric model was developed to predict the simultaneous heat and mass transfers during the drying of industrial sludge. The simulations were run using COMSOL Multiphysics 3.5a. The input parameters of the numerical model were acquired from preliminary experimental work. The results permit establishing correlations describing the evolution of the various parameters as functions of the drying temperature and the sludge water content. The selection and coupling of the equations were validated against drying kinetics acquired experimentally over a temperature range of 45-65 °C and an absolute pressure range of 200-1000 mbar. The model, incorporating the heat and mass transfer mechanisms at different operating conditions, yields simulated temperatures and water contents. The simulated results agree with the experimental values only in the first and last drying stages, where sludge shrinkage is insignificant. Simulated and experimental results show that sludge drying is favored at high temperature and low pressure. As observed experimentally, the drying time is reduced by 68% for drying at 65 °C compared to 45 °C under 1 atm. At 65 °C, a 200 mbar absolute pressure vacuum leads to an additional reduction in drying time estimated at 61%. However, the drying rate is underestimated in the intermediate stage. This underestimation could be corrected in the model by considering the shrinkage phenomena that occur during sludge drying.

Keywords: industrial sludge drying, heat transfer, mass transfer, mathematical modelling

Procedia PDF Downloads 134
12590 Ergonomics Management and Sustainability: An Exploratory Study Applied to Automaker Industry in South of Brazil

Authors: Giles Balbinotti, Lucas Balbinotti, Paula Hembecker

Abstract:

The management of production process design activities, both for the conception of future work and for the financial health of companies, is an important condition in an organizational model that supports the management of the human aspects of work and their variability. It is important to seek, at all levels of the organization, understanding and the consequent cultural change, so that factors associated with human aspects are considered and prioritized in projects. In this scenario, the central research question of this study arises from the work context in which managers and project coordinators operate: how is top management convinced, during the design stages, to adopt ergonomics as a strategy for the performance and sustainability of the business? In this perspective, the general objective of this research is to analyze how the management of human aspects is applied in a real production process project in the automotive industry, including the activity of the project manager and coordinator as well as the strategies used to argue for ergonomics in design. To this end, a sociotechnical and ergonomic approach is adopted, given its anthropocentric premise of acting on the social system simultaneously with the technical system, supported by the Modapts system, which measures non-value-added times and their correlation with critical postures. The methodological approach adopted in this study is based on a literature review and an analysis of the activity of the project coordinators of an industrial company, including the management of human aspects in the context of work variability and the strategies applied in project activities. The study observed that the performance loss of serial production lines reaches on the order of 30%, which can render the operation non-value-added; one of the causes of this loss is the ergonomic problems present in the professional activity.

Keywords: human aspects in production process project, ergonomics in design, sociotechnical project management, sociotechnical, ergonomic principles, sustainability

Procedia PDF Downloads 251
12589 Association of Selected Polymorphisms of BER Pathway with the Risk of Colorectal Cancer in the Polish Population

Authors: Jacek Kabzinski, Karolina Przybylowska, Lukasz Dziki, Adam Dziki, Ireneusz Majsterek

Abstract:

The incidence of colorectal cancer (CRC) is increasing year by year. Despite intensive research, the etiology of CRC remains unknown. Studies suggest that reduced efficiency of DNA repair mechanisms, often caused by polymorphisms in DNA repair genes, may underlie the carcinogenesis process. The aim of the study was to determine the relationship between the Pro242Arg polymorphism of the PolB gene and the Arg780His polymorphism of the Lig3 gene and the modulation of colorectal cancer risk in the Polish population. Determining the molecular basis of the carcinogenesis process and predicting increased risk will allow patients to be assigned to an increased-risk group and included in a preventive program. We used blood collected from 110 patients diagnosed with colorectal cancer; the control group consisted of an equal number of healthy people. Genotyping was performed by the TaqMan method. The results obtained indicate that the Arg/His genotype at position 780 of the Lig3 gene is associated with an increased risk of colorectal cancer. On the basis of these results, we conclude that the Lig3 Arg780His polymorphism may be associated with an increased risk of colorectal cancer.
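In a case-control design like this, the association between a genotype and disease risk is usually expressed as an odds ratio with a confidence interval. The sketch below computes both; the genotype counts are hypothetical illustrations, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI (Woolf's log method) for a
    case-control table: a,b = cases with/without the risk genotype,
    c,d = controls with/without it."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for the Lig3 Arg780His Arg/His genotype
# (illustrative only; n = 110 per group as in the study design).
or_, lo, hi = odds_ratio_ci(a=40, b=70, c=24, d=86)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A confidence interval whose lower bound stays above 1, as in this illustration, is the usual statistical criterion behind a conclusion that a genotype is associated with increased risk.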

Keywords: BER, colorectal cancer, PolB, Lig3, polymorphisms

Procedia PDF Downloads 454