Search results for: Bologna process
12399 Historical Development of Negative Emotive Intensifiers in Hungarian
Authors: Martina Katalin Szabó, Bernadett Lipóczi, Csenge Guba, István Uveges
Abstract:
In this study, an exhaustive analysis of the historical development of negative emotive intensifiers in the Hungarian language was carried out via NLP methods. Intensifiers are linguistic elements that modify or reinforce a variable character in the lexical unit they apply to. Accordingly, intensifiers appear with other lexical items, such as adverbs, adjectives, verbs, and, infrequently, nouns. Due to the complexity of this phenomenon (a set of sociolinguistic, semantic, and historical aspects), many lexical items can operate as intensifiers. The group of intensifiers is admittedly one of the most rapidly changing elements in the language. From a linguistic point of view, a special group of intensifiers is particularly interesting: the so-called negative emotive intensifiers, which, on their own and without context, have semantic content associated with negative emotion, but in particular cases function as intensifiers (e.g., borzasztóan jó 'awfully good', which means 'excellent'). Despite their special semantic features, negative emotive intensifiers are scarcely examined in the literature using large historical corpora and NLP methods. In order to become better acquainted with trends over time concerning these intensifiers, we exhaustively analysed a specific historical corpus, namely the Magyar Történeti Szövegtár (Hungarian Historical Corpus). This corpus (containing 3 million text words) is a collection of texts of various genres and styles produced between 1772 and 2010. Since the corpus consists of raw texts and does not contain any additional information about the language features of the data (such as stemming or morphological analysis), a large amount of manual work was required to process the data. Thus, based on a lexicon of negative emotive intensifiers compiled in a previous phase of the research, every occurrence of each intensifier was queried, and the results were stored in a separate data frame.
Then, basic linguistic processing (POS-tagging, lemmatization, etc.) was carried out automatically with the 'magyarlanc' NLP toolkit. Finally, the frequency and collocation features of all the negative emotive words were automatically analyzed in the corpus. The outcomes of the research revealed in detail how these words have proceeded through grammaticalization over time, i.e., they change from lexical elements into grammatical ones and slowly go through a delexicalization process (their negative content diminishes over time). It was also shown which negative emotive intensifiers are at the same stage of this process in the same time period. Taking a closer look at the different domains of the analysed corpus, it also became clear that during this process the importance of the pragmatic role increases: the newer use expresses the speaker's subjective, evaluative opinion to a certain degree.
Keywords: historical corpus analysis, historical linguistics, negative emotive intensifiers, semantic changes over time
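The querying and collocation-counting step described above can be sketched as follows. This is a minimal illustration: the four-sentence mini-corpus, the two-item intensifier lexicon, and the 50-year bucketing are all invented for the example, whereas the study works on the Magyar Történeti Szövegtár after processing with magyarlanc.

```python
from collections import Counter

# Hypothetical mini-corpus: (year, lemmatized tokens) pairs standing in for the
# POS-tagged historical corpus used in the study.
corpus = [
    (1820, ["a", "borzasztóan", "rossz", "idő"]),
    (1890, ["borzasztóan", "jó", "előadás"]),
    (1950, ["rettenetesen", "jó", "hír"]),
    (1990, ["borzasztóan", "jó", "film"]),
]
# Stand-in for the lexicon compiled in a previous phase of the research.
intensifiers = {"borzasztóan", "rettenetesen"}

def collocations_by_period(corpus, intensifiers, period=50):
    """Count (intensifier, right neighbour) collocations per time period."""
    counts = {}
    for year, tokens in corpus:
        bucket = (year // period) * period
        for left, right in zip(tokens, tokens[1:]):
            if left in intensifiers:
                counts.setdefault(bucket, Counter())[(left, right)] += 1
    return counts

result = collocations_by_period(corpus, intensifiers)
```

Tracking how often an intensifier collocates with positively valenced adjectives per period is one simple way to make the delexicalization trend visible.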
Procedia PDF Downloads 233
12398 Analysing Modern City Heritage through Modernization Transformation: A Case of Wuhan, China
Authors: Ziwei Guo, Liangping Hong, Zhiguo Ye
Abstract:
The exogenous modernization process in China and other late-coming countries did not result from a gradual growth of their own modernity features, but was a conscious response to external challenges. In this context, it was equally important for Chinese cities to make themselves 'Chinese' as well as 'modern'. Wuhan was the first inland treaty port opened in the late Qing Dynasty. In the following one hundred years, Wuhan transformed from a feudal town into a modern industrial city. It is a good example for illustrating urban construction and cultural heritage through the process and impact of social transformation. An overall perspective on transformation will contribute to developing the city's uniqueness and enhancing its inclusive development. The study takes the history of Wuhan from 1861 to 1957 as its study period. The whole transformation process is divided into four typical periods based on key historical events, and the paper analyzes the changes in urban structure and construction activities in each period. Numerous examples are then used to compare the features of Wuhan's modern city heritage across the four periods. In this way, three characteristics of Wuhan's modern city heritage are summarized. The paper finds that globalization and localization worked together to shape the urban physical space environment. For Wuhan, social transformation has had a profound and comprehensive impact on urban construction, which can be analyzed in terms of main construction, architectural style, location, and actors. Moreover, the three towns of Wuhan have disparate cityscapes, reflected in the varied heritages and architectural features of the different transformation periods. Lastly, the protection regulations and conservation planning of heritage in Wuhan are discussed, and suggestions about the conservation of Wuhan's modern heritage are drawn.
The study provides a new perspective on modern city heritage for cities like Wuhan, and future local planning systems and heritage conservation policies can take into consideration the 'Modern Cultural Transformation Route' proposed in this paper.
Keywords: modern city heritage, transformation, identity, Wuhan
Procedia PDF Downloads 131
12397 Treatment of Low-Grade Iron Ore Using Two Stage Wet High-Intensity Magnetic Separation Technique
Authors: Moses C. Siame, Kazutoshi Haga, Atsushi Shibayama
Abstract:
This study investigates the removal of silica, alumina, and phosphorus as impurities from Sanje iron ore using wet high-intensity magnetic separation (WHIMS). Sanje iron ore is a low-grade hematite ore found in the Nampundwe area of Zambia, intended as feed for the steelmaking process. Chemical composition analysis using an X-ray fluorescence spectrometer showed that the Sanje low-grade ore contains 48.90 mass% hematite (Fe2O3), corresponding to an iron grade of 34.18 mass%. The ore also contains silica (SiO2) and alumina (Al2O3) at 31.10 mass% and 7.65 mass%, respectively. Mineralogical analysis using an X-ray diffraction spectrometer showed hematite and silica as the major mineral components of the ore, while magnetite and alumina exist as minor components. Mineral particle distribution analysis was done using a scanning electron microscope with X-ray energy dispersive spectrometry (SEM-EDS), and the images showed that the average size of the alumina-silicate gangue particles is on the order of 100 μm and that they exist as iron-bearing interlocked particles. Magnetic separation was done using a series L model 4 magnetic separator. The effects of various magnetic separation parameters, such as magnetic flux density, particle size, and pulp density of the feed, were studied during the magnetic separation experiments. The ore, with an average particle size of 25 µm and a pulp density of 2.5%, was concentrated using a pulp flow of 7 L/min. The results showed that 10 T was the optimal magnetic flux density, which enhanced the recovery to 93.08% of iron at a grade of 53.22 mass%. Gangue mineral particles containing 12 mass% silica and 3.94 mass% alumina remained in the concentrate; therefore, the concentrate was further treated in a second WHIMS stage using the same parameters as the first stage. The second stage recovered 83.41% of iron at a grade of 67.07 mass%. Silica was reduced to 2.14 mass% and alumina to 1.30 mass%.
Phosphorus was likewise reduced to 0.02 mass%. Based on these results, the two-stage magnetic separation process was established.
Keywords: Sanje iron ore, magnetic separation, silica, alumina, recovery
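The recovery and grade figures above follow the standard metallurgical mass balance; the sketch below shows the generic textbook relations (the `recovery` helper and the assumption that stage recoveries in series multiply are standard definitions, not code from the study).

```python
def recovery(feed_mass, feed_grade, conc_mass, conc_grade):
    """Metallurgical recovery (%): fraction of the metal reporting to the concentrate."""
    return 100.0 * (conc_mass * conc_grade) / (feed_mass * feed_grade)

def overall_recovery(stage_recoveries):
    """Stages in series: stage recoveries (in %) multiply."""
    total = 100.0
    for r in stage_recoveries:
        total *= r / 100.0
    return total

# Stage recoveries reported in the abstract: 93.08% (first WHIMS pass) and
# 83.41% (second pass), giving roughly 77.6% overall iron recovery.
overall = overall_recovery([93.08, 83.41])
```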
Procedia PDF Downloads 258
12396 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Current real-estate value estimation, difficult for laymen, is usually performed by specialists. This paper presents an automated estimation process, based on big data and machine-learning technology, that calculates the influence of building conditions on real-estate price measurement. The present study analyzed actual building sales data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Building on that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for deriving machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyses and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
Keywords: apartment complex, big data, life-cycle building value analysis, machine learning
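The weighting idea above (learning from previous examples, then predicting for new ones) can be sketched with ordinary least squares on a single building condition. The study itself used RapidMiner Studio; the floor areas and prices below are invented for illustration.

```python
def fit_ols(xs, ys):
    """Return (slope, intercept) minimizing squared error for one predictor."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Hypothetical training examples: (floor area in m^2, sale price in million KRW).
areas = [60, 85, 100, 120]
prices = [600, 850, 1000, 1200]

slope, intercept = fit_ols(areas, prices)   # the slope plays the role of a weight
predicted = slope * 90 + intercept          # predict the price of a new 90 m^2 unit
```

In the actual model, many building conditions are weighted at once; the single-feature fit above only shows the mechanism.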
Procedia PDF Downloads 374
12395 Rational Allocation of Resources in Water Infrastructure Development Projects
Authors: M. Macchiaroli, V. Pellecchia, L. Dolores
Abstract:
Within any European or global model of management of the integrated water service (in Italy, regulated only since 2012 by a national authority, ARERA), a significant share of activity is the development of assets in terms of hydraulic networks and wastewater collection networks, including all related building works. The process of selecting the investments to be made starts from a preventive analysis of the critical issues (water losses, unserved areas, low service standards, etc.) that occur in the territory managed by the Operator. Through the Program of Interventions (Provision by ARERA n. 580/2019/R/idr), the Operator programs the projects that can meet the identified needs in order to improve water service levels. This phase (analyzed and solved by the author in a work published in 2019) involves the use of evaluation techniques (cost-benefit analysis, multi-criteria and multi-objective techniques, neural networks, etc.) useful for selecting the most appropriate design answers to the different criticalities. However, at this point, the problem of establishing time priorities among the various works deemed necessary remains open; that is, it is necessary to hierarchize the investments. In this decision-making moment, the interests of the private Operator, which favors investments capable of generating high profitability, are often opposed to those of the public regulator (ARERA), which favors investments with greater social impact. To support the concertation between these two actors, the protocol set out in this research has been developed; it is based on the Analytic Hierarchy Process (AHP) and borrows from the programmatic documents an orientation path for the settlement of the conflict.
The protocol is demonstrated on a case study in the Campania Region of Italy and has been used professionally in the shared decision process between the Operator and the local authority.
Keywords: analytic hierarchy process, decision making, economic evaluation of projects, integrated water service
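The AHP step at the core of the protocol can be sketched with the geometric-mean approximation of the principal eigenvector. The 3x3 comparison matrix below (say, profitability vs. social impact vs. service level) is purely illustrative and is not taken from the paper.

```python
def prod(xs):
    """Product of a sequence of numbers."""
    r = 1.0
    for x in xs:
        r *= x
    return r

def ahp_priorities(matrix):
    """Geometric-mean approximation of the AHP priority vector."""
    n = len(matrix)
    geo = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

# Reciprocal pairwise-comparison matrix on Saaty's 1-9 scale (hypothetical
# judgments): criterion 1 moderately dominates 2 and strongly dominates 3.
M = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     3.0],
    [1 / 5.0, 1 / 3.0, 1.0],
]
weights = ahp_priorities(M)
```

The resulting weights rank the investment criteria; the protocol then uses such priorities to hierarchize the works agreed between Operator and regulator.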
Procedia PDF Downloads 123
12394 On the Use of Reliability Factors to Reduce Conflict between Information Sources in Dempster-Shafer Theory
Authors: A. Alem, Y. Dahmani, A. Hadjali, A. Boualem
Abstract:
Managing conflict between information sources, whether within Dempster-Shafer theory or in the fusion process generally, has pushed researchers in recent years to find ways of making better decisions, especially in information systems, vision, robotics, and wireless sensor networks. In this paper, we take the conflict into account at the combination step and manage it so that it does not influence the decision step, even when the conflict comes from reliable sources. According to [1], conflict leads to erroneous decisions when it is strong between information sources; if the conflict exceeds the maximum of the belief mass functions, K > max_{i=1..n} m_i(A), the decision becomes impossible. We demonstrate in this paper that multiplying the mass functions by reliability coefficients is a decreasing operation: it reduces the conflict and leads to a sound decision. Reliability coefficients are defined precisely and multiplied by the mass functions of each information source to resolve the conflict and allow a decision whatever the degree of conflict. The technique is evaluated on a use case comparing the combination of sources under maximum conflict, with and without reliability coefficients.
Keywords: Dempster-Shafer theory, fusion process, conflict managing, reliability factors, decision
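The mechanism described above can be sketched with classical Shafer discounting followed by Dempster's rule on a two-hypothesis frame. The mass values and the reliability factor of 0.5 are illustrative, not taken from the paper's use case.

```python
# Frame of discernment with two hypotheses.
THETA = frozenset({"a", "b"})

def discount(m, alpha):
    """Shafer discounting: scale masses by reliability alpha in [0, 1] and
    transfer the remaining 1 - alpha to total ignorance (THETA)."""
    md = {A: alpha * v for A, v in m.items()}
    md[THETA] = md.get(THETA, 0.0) + (1.0 - alpha)
    return md

def combine(m1, m2):
    """Dempster's rule of combination; returns (combined mass, conflict K)."""
    raw, k = {}, 0.0
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            inter = A & B
            if inter:
                raw[inter] = raw.get(inter, 0.0) + v1 * v2
            else:
                k += v1 * v2          # mass assigned to the empty set
    return {A: v / (1.0 - k) for A, v in raw.items()}, k

# Two highly conflicting sources.
m1 = {frozenset({"a"}): 0.9, THETA: 0.1}
m2 = {frozenset({"b"}): 0.9, THETA: 0.1}

_, k_plain = combine(m1, m2)                              # raw conflict
_, k_disc = combine(discount(m1, 0.5), discount(m2, 0.5))  # after discounting
```

Multiplying each mass by a reliability coefficient below 1 shifts mass toward ignorance, so the products feeding the conflict K shrink, which is exactly the decreasing behaviour the paper exploits.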
Procedia PDF Downloads 426
12393 Design and Development of an Algorithm for Prioritizing Test Cases Using a Neural Network as Classifier
Authors: Amit Verma, Simranjeet Kaur, Sandeep Kaur
Abstract:
Test Case Prioritization (TCP) has gained widespread acceptance, as it often results in good-quality software free from defects. Due to the increasing fault rate in software, traditional prioritization techniques result in increased cost and time. The main challenge in TCP is the difficulty of manually validating the priorities of different test cases, given the large size of test suites, and little emphasis has been placed on automating the TCP process. The objective of this paper is to determine the priorities of different test cases using an artificial neural network, which helps to predict the correct priorities with the help of the back-propagation algorithm. In our proposed work, one such method is implemented in which priorities are assigned to different test cases based on their frequency. After the priorities are assigned, the ANN predicts whether the correct priority has been assigned to each test case and raises an interrupt when a wrong priority is detected. Classifiers are used to classify the test cases of different priorities. The proposed algorithm is effective, as it reduces complexity with robust efficiency and automates the process of prioritizing test cases.
Keywords: test case prioritization, classification, artificial neural networks, TF-IDF
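The frequency-based priority assignment step can be sketched as follows; the execution history (which test case detected a fault on which run) is invented, and the ANN validation stage described in the paper is not reproduced here.

```python
from collections import Counter

# Hypothetical execution history: (test case id, fault detected?) pairs.
executions = [
    ("tc1", True), ("tc2", False), ("tc1", True),
    ("tc3", True), ("tc2", True), ("tc1", False),
]

def prioritize(executions):
    """Assign priority 1, 2, ... by descending fault-detection frequency."""
    fault_hits = Counter(tc for tc, detected in executions if detected)
    ranked = sorted(fault_hits, key=lambda tc: -fault_hits[tc])
    return {tc: rank + 1 for rank, tc in enumerate(ranked)}

priorities = prioritize(executions)
```

In the full scheme these frequency-derived priorities form the training targets, and the back-propagation network flags any test case whose predicted priority disagrees with its assigned one.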
Procedia PDF Downloads 397
12392 Understanding Team Member Autonomy and Team Collaboration: A Qualitative Study
Authors: Ayşen Bakioğlu, Gökçen Seyra Çakır
Abstract:
This study aims to explore how research assistants who work in project teams experience team member autonomy and how they reconcile team member autonomy with team collaboration. The study utilizes snowball sampling; 20 research assistants who work in the faculties of education at Marmara University and Yıldız Technical University were interviewed. Data were analysed through content analysis, with MAXQDA Plus 11, a qualitative data analysis software package, used as the analysis tool. According to the findings of this study, the emerging themes include team norm formation, team coordination management, the role of individual tasks in team collaboration, and leadership distribution. Interviewees experience the team norm formation process in terms of processes pertaining to task fulfillment and processes pertaining to the regulation of team dynamics. The team norm formation process instills a sense of responsibility among individual team members. Apart from that, the interviewees' responses indicate that the realization of the obligation to work in a team contributes to the team norm formation process. The participants indicate that individual expectations are taken into consideration during the coordination of the team. The supervisor of the project team also has a crucial role in maintaining team collaboration. Coordination problems arise when an individual team member does not relate his or her academic field to the research topic of the project team. The findings indicate that leadership distribution in the project teams involves two processes: leadership distribution based on processes that focus on individual team members, and leadership distribution based on processes that focus on team interaction. Apart from that, individual tasks serve as a facilitator of collaboration among team members.
Interviewees also indicate that individual tasks facilitate the expression of individuality.
Keywords: project teams in higher education, research assistant teams, team collaboration, team member autonomy
Procedia PDF Downloads 362
12391 The Asymptotic Hole Shape in Long Pulse Laser Drilling: The Influence of Multiple Reflections
Authors: Torsten Hermanns, You Wang, Stefan Janssen, Markus Niessen, Christoph Schoeler, Ulrich Thombansen, Wolfgang Schulz
Abstract:
In long pulse laser drilling of metals, it can be demonstrated that the ablation shape approaches a so-called asymptotic shape, such that it changes only slightly or not at all with further irradiation. Such findings are already known from ultrashort pulse (USP) ablation of dielectric and semiconducting materials. The explanation for the occurrence of an asymptotic shape in long pulse drilling of metals is identified; a model for the description of the asymptotic hole shape is numerically implemented, tested, and clearly confirmed by comparison with experimental data. The model assumes a robust process, in the sense that the characteristics of the melt flow inside the arising melt film do not change qualitatively when the laser or processing parameters are changed. Only robust processes are technically controllable and thus of industrial interest. The condition for a robust process is identified as a threshold for the mass flow density of the assist gas at the hole entrance, which has to be exceeded. Within a robust process regime, the melt flow characteristics can be captured by only one model parameter, namely the intensity threshold. In analogy to USP ablation (where it has long been known that the resulting hole shape follows from a threshold for the absorbed laser fluence), it is demonstrated that, in the case of robust long pulse ablation, the asymptotic shape forms such that along the whole contour the absorbed heat flux density is equal to the intensity threshold. The intensity threshold depends on the specific material and radiation properties and has to be calibrated by one reference experiment. The model is implemented in a numerical simulation called AsymptoticDrill, which requires so few resources that it can run on common desktop PCs, laptops, or even smart devices.
Resulting hole shapes can be calculated within seconds, which is a clear advantage over other simulations presented in the literature in the context of everyday industrial usage. Against this background, the software is additionally equipped with a user-friendly GUI that allows intuitive usage: individual parameters can be adjusted using sliders, while the simulation result appears immediately in an adjacent window. Platform-independent development allows flexible usage: an operator can conveniently adjust the process on a tablet, while a developer can execute the tool in the office to design new processes. Furthermore, to the best knowledge of the authors, AsymptoticDrill is the first simulation that allows the import of measured real beam distributions and thus calculates the asymptotic hole shape on the basis of the real state of the specific manufacturing system. In this paper, the emphasis is placed on investigating the effect of multiple reflections on the asymptotic hole shape, which gains in importance when drilling holes with large aspect ratios.
Keywords: asymptotic hole shape, intensity threshold, long pulse laser drilling, robust process
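The threshold condition can be illustrated with a deliberately simplified single-reflection reading: on a wall inclined by angle θ to the beam axis, the absorbed flux is I(r)·cos θ with tan θ = dz/dr, so setting the absorbed flux equal to the threshold I_th gives dz/dr = sqrt((I(r)/I_th)² - 1). The Gaussian beam, the parameter values, and the omission of multiple reflections are all assumptions of this sketch; it is not the AsymptoticDrill model.

```python
import math

# Illustrative parameters (arbitrary units): peak intensity, beam radius, threshold.
I0, w, I_th = 5.0, 1.0, 1.0

def intensity(r):
    """Gaussian beam intensity profile (assumed for this sketch)."""
    return I0 * math.exp(-2.0 * r * r / (w * w))

def hole_depth_profile(n=200):
    """Integrate dz/dr = sqrt((I/I_th)^2 - 1) inward from the hole edge,
    where the edge radius satisfies I(r_edge) = I_th."""
    r_edge = w * math.sqrt(math.log(I0 / I_th) / 2.0)
    dr = r_edge / n
    depth = 0.0
    for i in range(n, 0, -1):
        r = (i - 0.5) * dr                     # midpoint rule
        ratio = intensity(r) / I_th
        depth += math.sqrt(max(ratio * ratio - 1.0, 0.0)) * dr
    return r_edge, depth

r_edge, max_depth = hole_depth_profile()
```

Multiple reflections, the subject of the paper, redistribute absorbed flux deeper into the hole and therefore modify exactly this profile for large aspect ratios.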
Procedia PDF Downloads 213
12390 Early Recognition and Grading of Cataract Using a Combined Log Gabor/Discrete Wavelet Transform with ANN and SVM
Authors: Hadeer R. M. Tawfik, Rania A. K. Birry, Amani A. Saad
Abstract:
Eyes are considered to be the most sensitive and important organs of the human body; thus, any eye disorder affects the patient in all aspects of life. Cataract is one of those eye disorders that lead to blindness if not treated correctly and quickly. This paper demonstrates a model for automatic detection, classification, and grading of cataracts based on image processing techniques and artificial intelligence. The proposed system is developed to ease the cataract diagnosis process for both ophthalmologists and patients. The wavelet transform combined with the 2D Log Gabor wavelet transform was used as the feature extraction technique for a dataset of 120 eye images, followed by a classification process that assigned the images to three classes: normal, early stage, and advanced stage. A comparison between the two classifiers, the support vector machine (SVM) and the artificial neural network (ANN), was made on the same dataset of 120 eye images. It was concluded that SVM gave better results than ANN: SVM achieved 96.8% accuracy, whereas ANN achieved 92.3% accuracy.
Keywords: cataract, classification, detection, feature extraction, grading, log-gabor, neural networks, support vector machines, wavelet
Procedia PDF Downloads 332
12389 Mango (Mangifera indica L.) Lyophilization Using Vacuum-Induced Freezing
Authors: Natalia A. Salazar, Erika K. Méndez, Catalina Álvarez, Carlos E. Orrego
Abstract:
Lyophilization, also called freeze-drying, is an important dehydration technique mainly used for pharmaceuticals. The food industry also uses lyophilization when it is important to retain most of the nutritional quality, taste, shape, and size of dried products and to extend their shelf life. Vacuum induction during the freezing cycle (VI) has been used to control ice nucleation and, consequently, to reduce the primary drying time of pharmaceuticals while preserving the quality properties of the final product. This procedure has not previously been applied to the freeze-drying of foods. The present work aims to investigate the effect of VI on the lyophilization drying time, final moisture content, density, and reconstitutional properties of mango (Mangifera indica L.) slices (MS) and mango pulp-maltodextrin dispersions (MPM) (30% total solids concentration). Control samples were run at each freezing rate without induced vacuum. The lyophilization endpoint was the same for all treatments (a constant difference between the capacitance and Pirani vacuum gauges). From the experimental results, it can be concluded that VI at the high freezing rate (0.4 °C/min) reduced the overall process time by up to 30% compared with the control and with VI at the lower freezing rate (0.1 °C/min), without affecting the quality characteristics of the dried product, which yields a reduction in costs and energy consumption for MS and MPM freeze-drying. Controls and samples treated with VI at a freezing rate of 0.4 °C/min in MS showed similar moisture and density results. Furthermore, the MPM dispersion showed favorable values when VI was applied, because a dried product with low moisture content and low density was obtained in a shorter process time compared with the control.
No significant differences were found in the reconstitutional properties (rehydration for MS and solubility for MPM) of the freeze-dried mango between the controls and the VI treatments.
Keywords: drying time, lyophilization, mango, vacuum induced freezing
Procedia PDF Downloads 410
12388 Study and Improvement of the Quality of a Production Line
Authors: S. Bouchami, M.N. Lakhoua
Abstract:
The automotive market is a dynamic market that continues to grow, which is why several companies in this sector adopt a quality improvement approach. Wanting to be competitive and successful in the environment in which they operate, these companies are dedicated to establishing a quality management system to ensure that quality objectives are achieved, improving products and processes as well as customer satisfaction. In this paper, the quality management and improvement of a production line in an industrial company are presented. The project is divided into two essential parts: the creation of the technical line documentation and the quality assurance documentation, and the resolution of defects at the line as well as those claimed by the customer. The creation of the documents required a deep understanding of the manufacturing process. The analysis and problem solving were done through the implementation of PDCA (Plan Do Check Act) and FTA (Fault Tree Analysis). As a perspective, in order to better optimize production and improve the efficiency of the production line, a study of the problems associated with the supply of raw materials should be carried out to address stock-outs, which cause delays that penalize the company.
Keywords: quality management, documentary system, Plan Do Check Act (PDCA), fault tree analysis (FTA) method
Procedia PDF Downloads 142
12387 “Octopub”: Geographical Sentiment Analysis Using Named Entity Recognition from Social Networks for Geo-Targeted Billboard Advertising
Authors: Oussama Hafferssas, Hiba Benyahia, Amina Madani, Nassima Zeriri
Abstract:
Although data nowadays takes multiple forms, from text to images and from audio to video, text is still the most widely used form at a public level. At an academic and research level, and unlike the other forms, text can be considered the easiest form to process. Therefore, a branch of data mining research, called "text mining", has always operated in its shadow. Its concept is just like data mining's: finding valuable patterns in large collections and tremendous volumes of data, in this case text. Named entity recognition (NER) is one of text mining's disciplines; it aims to extract and classify references such as proper names, locations, expressions of time and dates, organizations, and more in a given text. Our approach, "Octopub", does not aim to find new ways to improve the NER process itself; rather, it is about finding a new and smart way to use NER so that we can extract the sentiments of millions of people, using social networks as a limitless information source and marketing for product promotion as the main application domain.
Keywords: text mining, named entity recognition (NER), sentiment analysis, social media networks (SN, SMN), business intelligence (BI), marketing
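The combination described above (NER for locations, then sentiment aggregation per location) can be sketched with toy stand-ins: a gazetteer lookup in place of a real NER system, and a word-list score in place of a real sentiment analyzer. The gazetteer, lexicon, and posts are all invented.

```python
# Toy stand-ins for the two components Octopub combines.
GAZETTEER = {"algiers", "oran", "paris"}        # location recognizer (NER stand-in)
POSITIVE, NEGATIVE = {"love", "great"}, {"hate", "bad"}  # sentiment lexicon

def geo_sentiment(posts):
    """Aggregate a sentiment score per recognized location across posts."""
    scores = {}
    for post in posts:
        tokens = post.lower().split()
        score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
        for t in tokens:
            if t in GAZETTEER:
                scores[t] = scores.get(t, 0) + score
    return scores

posts = ["I love the new phone here in Algiers", "bad service in Oran"]
scores = geo_sentiment(posts)
```

In a geo-targeted billboard setting, positive aggregate scores for a product in a city would then inform where the advertising is placed.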
Procedia PDF Downloads 589
12386 Biochar as a Strong Adsorbent for Multiple-Metal Removal from Contaminated Water
Authors: Eman H. El-Gamal, Mai E. Khedr, Randa Ghonim, Mohamed Rashad
Abstract:
In the past few years, biochar, a highly carbon-rich material produced from agro-wastes by pyrolysis, has been used as an effective adsorbent for removing heavy metals from polluted water. In this study, different types of biochar (rice straw 'RSB', corn cob 'CCB', and Jatropha shell 'JSB') were used to evaluate the adsorption capacity for heavy metal removal from multiple-metal solutions (Cu, Mn, Zn, and Cd). Kinetic modeling was examined to illustrate potential adsorption mechanisms. The results showed that the removal potential depends on both the metal and the biochar type. The adsorption capacity of the biochars followed the order RSB > JSB > CCB. In general, RSB and JSB presented high removal of heavy metals from polluted water, exceeding 90% and 80%, respectively, after 2 h of contact time for all metals. According to the kinetic data, the pseudo-second-order model agreed strongly with Cu, Mn, Zn, and Cd adsorption onto the biochars (R² ≥ 0.97), indicating the dominance of a specific adsorption process, i.e., chemisorption. In conclusion, this study revealed that RSB and JSB biochars have the potential to be strong adsorbents for multiple-metal removal from wastewater.
Keywords: adsorption, biochar, chemisorption, polluted water
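The pseudo-second-order kinetic fit mentioned above is commonly done with the linearized form t/q_t = 1/(k·q_e²) + t/q_e, regressing t/q against t. The sketch below uses synthetic data generated exactly from the model (q_e = 2.0 mg/g, k = 0.5 g/(mg·min)); these values are illustrative, not measurements from the study.

```python
def pso_q(t, qe, k):
    """Pseudo-second-order uptake at time t: q_t = k*qe^2*t / (1 + k*qe*t)."""
    return (k * qe * qe * t) / (1.0 + k * qe * t)

def fit_pso(ts, qs):
    """Linear regression of t/q against t: slope = 1/qe, intercept = 1/(k*qe^2)."""
    xs, ys = ts, [t / q for t, q in zip(ts, qs)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    qe = 1.0 / slope
    k = 1.0 / (intercept * qe * qe)
    return qe, k

# Synthetic sorption data (time in min, uptake in mg/g) following the model.
ts = [5.0, 15.0, 30.0, 60.0, 120.0]
qs = [pso_q(t, 2.0, 0.5) for t in ts]
qe_fit, k_fit = fit_pso(ts, qs)
```

With real data, the R² of this regression is the statistic the study reports (≥ 0.97) as evidence for chemisorption.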
Procedia PDF Downloads 150
12385 From Binary Solutions to Real Bio-Oils: A Multi-Step Extraction Story of Phenolic Compounds with Ionic Liquid
Authors: L. Cesari, L. Canabady-Rochelle, F. Mutelet
Abstract:
The thermal conversion of lignin produces bio-oils that contain many compounds with high added value, such as phenolic compounds. In order to extract these compounds efficiently, the possible use of the ionic liquid choline bis(trifluoromethylsulfonyl)imide [Choline][NTf2] was explored. To this end, a multistep approach was implemented. First, binary (phenolic compound and solvent) and ternary (phenolic compound, solvent, and ionic liquid) solutions were investigated. Eight binary systems of phenolic compound and water were investigated at atmospheric pressure and quantified using the turbidity method and UV spectroscopy. Ternary systems (phenolic compound, water, and [Choline][NTf2]) were investigated at room temperature and atmospheric pressure. After stirring, the solutions were allowed to settle, and a sample of each phase was collected. The phases were analyzed using gas chromatography with an internal standard. These results were used to quantify the interaction parameters of thermodynamic models. Then, extractions were performed on synthetic solutions to determine the influence of several operating conditions (temperature, kinetics, amount of [Choline][NTf2]). With this knowledge, it was possible to design and simulate an extraction process composed of one extraction column and one flash. Finally, the extraction efficiency of [Choline][NTf2] was quantified with real bio-oils from lignin pyrolysis. Qualitative and quantitative analyses were performed using gas chromatography coupled with mass spectrometry and flame ionization detection. The experimental measurements show that the extraction of phenolic compounds is efficient at room temperature, is quick, and does not require a large amount of [Choline][NTf2]. Moreover, the simulations of the extraction process demonstrate that the [Choline][NTf2] process requires less energy than an organic-solvent one.
Finally, the efficiency of [Choline][NTf2] was confirmed under real conditions with experiments on lignin pyrolysis bio-oils.
Keywords: bio-oils, extraction, lignin, phenolic compounds
Procedia PDF Downloads 110
12384 Bayes Estimation of Parameters of Binomial Type Rayleigh Class Software Reliability Growth Model using Non-informative Priors
Authors: Rajesh Singh, Kailash Kale
Abstract:
In this paper, a binomial-process-type occurrence of software failures is considered, and the failure intensity is characterized by a one-parameter Rayleigh-class Software Reliability Growth Model (SRGM). The proposed SRGM is a mathematical function of two parameters, namely the total number of failures η0 and the scale parameter η1. It is assumed that very little or no information is available about these parameters; considering non-informative priors for both, the Bayes estimators for η0 and η1 are obtained under a squared error loss function. The proposed Bayes estimators are compared with the corresponding maximum likelihood estimators on the basis of risk efficiencies obtained by the Monte Carlo simulation technique. It is concluded that both proposed Bayes estimators, of the total number of failures and of the scale parameter, perform well for a proper choice of execution time.
Keywords: binomial process, non-informative prior, maximum likelihood estimator (MLE), Rayleigh class, software reliability growth model (SRGM)
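A Rayleigh-class SRGM is often written with the mean value function m(t) = η0·(1 - exp(-t²/(2η1²))); this is a common parameterization and the paper's exact functional form may differ, and the parameter values below are purely illustrative.

```python
import math

def mean_failures(t, eta0, eta1):
    """Expected cumulative number of failures by time t (Rayleigh-class form)."""
    return eta0 * (1.0 - math.exp(-t * t / (2.0 * eta1 * eta1)))

def failure_intensity(t, eta0, eta1):
    """Failure intensity: derivative of the mean value function, peaking at t = eta1."""
    return eta0 * (t / (eta1 * eta1)) * math.exp(-t * t / (2.0 * eta1 * eta1))

# Illustrative parameters: eta0 total failures, eta1 scale (execution time units).
eta0, eta1 = 100.0, 10.0
expected_by_20 = mean_failures(20.0, eta0, eta1)  # most failures observed by 2*eta1
```

It is against functions of this kind that the Bayes and maximum likelihood estimators of η0 and η1 are compared via their Monte Carlo risk efficiencies.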
Procedia PDF Downloads 389
12383 The Use of a Miniature Bioreactor as Research Tool for Biotechnology Process Development
Authors: Muhammad Zainuddin Arriafdi, Hamudah Hakimah Abdullah, Mohd Helmi Sani, Wan Azlina Ahmad, Muhd Nazrul Hisham Zainal Alam
Abstract:
Biotechnology process development demands numerous experiments. In the laboratory, these are typically carried out on a shake flask platform. This paper presents the design and fabrication of a miniature bioreactor system as an alternative research tool for bioprocessing. The working volume of the reactor is 100 ml, and it is made of plastic. The main features of the reactor include stirring control, temperature control via an electrical heater, an aeration strategy based on a miniature air compressor, and online optical cell density (OD) sensing. All sensors and actuators integrated into the reactor were controlled by an Arduino microcontroller platform. To demonstrate the functionality of the miniature bioreactor concept, a series of batch Saccharomyces cerevisiae fermentation experiments was performed under various glucose concentrations. Results from the fermentation experiments were used to estimate the Monod equation constants, namely the saturation constant Ks and the maximum specific growth rate μmax, to further highlight the usefulness of the device. The mixing capacity of the reactor was also evaluated. It was found that the results obtained with the miniature bioreactor prototype were comparable to results achieved using a shake flask. A key advantage of the device over the shake flask platform is that its mixing conditions are much closer to those of a lab-scale bioreactor setup. The prototype is also integrated with an online OD sensor, so no sampling is needed to monitor the progress of the reaction. Operating cost and medium consumption are also low, making it much more economical for biotechnology process development than lab-scale bioreactors.
Keywords: biotechnology, miniature bioreactor, research tools, Saccharomyces cerevisiae
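The Monod-constant estimation described above can be sketched with a classical Lineweaver-Burk linearization. The substrate concentrations, growth rates, and "true" constants below are invented illustrative values, not the authors' data; in practice the rates would come from the OD sensor readings.

```python
import numpy as np

# Hypothetical true kinetic constants and substrate concentrations (g/L)
mu_max_true, Ks_true = 0.45, 1.2     # 1/h and g/L, illustrative only
S = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
mu = mu_max_true * S / (Ks_true + S)  # noise-free Monod growth rates

# Lineweaver-Burk linearization: 1/mu = (Ks/mu_max) * (1/S) + 1/mu_max,
# so a straight-line fit of 1/mu against 1/S recovers both constants.
slope, intercept = np.polyfit(1.0 / S, 1.0 / mu, 1)
mu_max_est = 1.0 / intercept
Ks_est = slope * mu_max_est
print(mu_max_est, Ks_est)
```

With real (noisy) fermentation data, a nonlinear least-squares fit of the Monod equation itself is usually preferred over the linearization.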
Procedia PDF Downloads 117
12382 Classification of Random Doppler-Radar Targets during the Surveillance Operations
Authors: G. C. Tikkiwal, Mukesh Upadhyay
Abstract:
During surveillance operations in wartime or peacetime, the radar operator sees a scatter of targets on the screen. A target may be a tracked vehicle such as a T72 or BMP, a wheeled vehicle such as an ALS, TATRA, 2.5-tonne, or Shaktiman truck, or moving troops, moving convoys, etc. The radar operator selects one of the promising targets and switches to single target tracking (STT) mode. Once the target is locked, the operator hears a characteristic audible signal in his headphones. Drawing on experience and training gained over time, the operator then identifies the random target. This process is cumbersome, however, and depends solely on the skills of the operator, and may therefore lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the fast Fourier transform (FFT) and principal component analysis (PCA), to identify random objects. The classification is based on transforming the audible signature of the target into musical octave notes. The whole methodology is then automated in suitable software. This automation increases the efficiency of identification of the random target by reducing the chance of misclassification. The whole study is based on live data.
Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP
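The FFT-plus-PCA pipeline described above can be sketched on synthetic data. The two "target classes" below are stand-in audible Doppler signatures (a dominant tone per class plus noise) invented for illustration; the sampling rate, tone frequencies, and sample counts are assumptions, not the paper's live data.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 8000                      # assumed audio sampling rate (Hz)
t = np.arange(0, 0.25, 1 / fs)

def doppler_signature(f0, n=40):
    # Synthetic headphone signals: one dominant tone per target class + noise
    return np.array([np.sin(2 * np.pi * f0 * t) + 0.3 * rng.standard_normal(t.size)
                     for _ in range(n)])

# FFT magnitude spectra as feature vectors, one row per recording
tracked = np.abs(np.fft.rfft(doppler_signature(440.0)))   # e.g. tracked vehicle
wheeled = np.abs(np.fft.rfft(doppler_signature(880.0)))   # e.g. wheeled vehicle
X = np.vstack([tracked, wheeled])

# PCA via eigen-decomposition of the feature covariance matrix
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
scores = Xc @ eigvecs[:, -2:]   # project onto the top-2 principal components

# The two classes separate cleanly in the principal-component space
sep = float(np.linalg.norm(scores[:40].mean(axis=0) - scores[40:].mean(axis=0)))
print(sep)
```

A simple classifier (e.g. nearest centroid) on the PCA scores would then complete the automated identification step.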
Procedia PDF Downloads 394
12381 Ergonomics Management and Sustainability: An Exploratory Study Applied to Automaker Industry in South of Brazil
Authors: Giles Balbinotti, Lucas Balbinotti, Paula Hembecker
Abstract:
The management of production process design activities, both for the conception of future work and for the financial health of companies, is an important condition in an organizational model that supports the management of the human aspects of work and their inherent variability. It is important to seek, at all levels of the organization, understanding and the consequent cultural change, so that factors associated with human aspects are considered and prioritized in projects. In this scenario, the central research question of this study arises from the work context in which managers and project coordinators operate: how is top management convinced, at the design stage, to adopt ergonomics as a strategy for business performance and sustainability? From this perspective, the general objective of this research is to analyze the application of the management of human aspects in a real production process project in the automotive industry, including the activity of the project manager and coordinator, as well as the strategies used to argue for ergonomics in design. To this end, a sociotechnical and ergonomic approach is adopted, given its anthropocentric premise of acting on the social system simultaneously with the technical system, with the support of the Modapts system, which measures non-value-added times and their correlation with critical workstations. The methodological approach adopted in this study is based on a literature review and an analysis of the activity of the project coordinators of an automaker, including the management of human aspects in the context of work variability and the strategies applied in project activities.
It was observed in the study that the performance loss of serial production lines reaches the significant figure of roughly 30%, which can render the operation non-value-added; one of the causes of this loss is the ergonomic problems present in the work activity.
Keywords: human aspects in production process project, ergonomics in design, sociotechnical project management, sociotechnical, ergonomic principles, sustainability
Procedia PDF Downloads 251
12380 Association of Selected Polymorphisms of BER Pathway with the Risk of Colorectal Cancer in the Polish Population
Authors: Jacek Kabzinski, Karolina Przybylowska, Lukasz Dziki, Adam Dziki, Ireneusz Majsterek
Abstract:
The incidence of colorectal cancer (CRC) is increasing year by year. Despite intensive research, CRC etiology remains unknown. Studies suggest that carcinogenesis may be underpinned by the reduced efficiency of DNA repair mechanisms, often caused by polymorphisms in DNA repair genes. The aim of the study was to determine the relationship between the Pro242Arg polymorphism of the PolB gene and the Arg780His polymorphism of the Lig3 gene and the modulation of colorectal cancer risk in the Polish population. Determining the molecular basis of carcinogenesis and predicting elevated risk will make it possible to assign patients to an increased-risk group and include them in a preventive program. We used blood collected from 110 patients diagnosed with colorectal cancer; the control group consisted of an equal number of healthy people. Genotyping was performed by the TaqMan method. The results indicate that the 780Arg/His genotype of the Lig3 gene is associated with an increased risk of colorectal cancer. On the basis of these results, we conclude that the Arg780His polymorphism of the Lig3 gene may be associated with an increased risk of colorectal cancer.
Keywords: BER, colorectal cancer, PolB, Lig3, polymorphisms
Procedia PDF Downloads 454
12379 Procedure for Monitoring the Process of Behavior of Thermal Cracking in Concrete Gravity Dams: A Case Study
Authors: Adriana de Paula Lacerda Santos, Bruna Godke, Mauro Lacerda Santos Filho
Abstract:
Several dams around the world have already collapsed, causing environmental, social, and economic damage. The concern to avoid future disasters has stimulated the creation of a great number of laws and rules in many countries. In Brazil, Law 12.334/2010 was created, establishing the National Policy on Dam Safety. Overall, this policy requires dam owners to invest in the maintenance of their structures and to improve their monitoring systems in order to provide faster and more straightforward responses when risks increase. As monitoring tools, visual inspections provide a comprehensive assessment of structural performance, while auscultation instrumentation adds specific information on operational or behavioral changes, providing an alarm when a performance indicator exceeds acceptable limits. These limits can be set using statistical methods based on the relationship between instrument measurements and other variables, such as reservoir level, time of year, or readings from other instruments. Besides the design parameters (uplift of the foundation, displacements, etc.), dam instrumentation can also be used to monitor the behavior of defects and damage manifestations. In concrete gravity dams specifically, one of the main causes of cracking is the concrete volumetric changes generated by phenomena of thermal origin, which are associated with the construction process of these structures. Based on this, the goal of this research is to propose a process for monitoring thermal cracking behavior in concrete gravity dams through instrumentation data analysis and the establishment of control values. The case study selected was Block B-11 of the Governor José Richa Dam and Power Plant, which presents a cracking process identified even before the filling of the reservoir in August 1998, and where crack meters and surface thermometers were installed for monitoring.
Although these instruments were installed in May 2004, the research was restricted to the last 4.5 years (June 2010 to November 2014), when all the instruments were calibrated and producing reliable data. The adopted method is based on simple linear correlation procedures to understand the interactions among the instruments' time series and to verify the response times between them. Scatter plots were drafted from the best correlations, which supported the definition of the control limit values. Among the conclusions, it is shown that there is a strong or very strong correlation between the ambient temperature and the crack meter and flow meter measurements. Based on the results of the statistical analysis, it was possible to develop a tool for monitoring the behavior of the cracks in the case study, thus fulfilling the research goal of developing a process for monitoring the behavior of thermal cracking in concrete gravity dams.
Keywords: concrete gravity dam, dam safety, instrumentation, simple linear correlation
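The lagged-correlation and control-limit procedure described above can be sketched on synthetic series. The seasonal temperature curve, the crack-meter response, the 7-day lag, and the noise level below are invented for illustration; the real analysis used the dam's instrument readings.

```python
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(365 * 4)  # roughly four years of daily readings (illustrative)

# Hypothetical series: seasonal ambient temperature and a crack-meter
# opening that follows temperature with a small lag, plus sensor noise.
temp = 18 + 8 * np.sin(2 * np.pi * days / 365)
crack = 0.5 - 0.03 * np.roll(temp, 7) + 0.02 * rng.standard_normal(days.size)

def pearson_r(x, y):
    xc, yc = x - x.mean(), y - y.mean()
    return float(np.sum(xc * yc) / np.sqrt(np.sum(xc**2) * np.sum(yc**2)))

# Scan candidate lags to estimate the response time between the two series
best_lag = max(range(1, 30), key=lambda k: abs(pearson_r(temp[:-k], crack[k:])))

# Control limits from the best-lag linear fit: fitted response +/- 3 sigma
x, y = temp[:-best_lag], crack[best_lag:]
b, a = np.polyfit(x, y, 1)
resid = y - (a + b * x)
upper, lower = (a + b * x) + 3 * resid.std(), (a + b * x) - 3 * resid.std()
print(best_lag, pearson_r(x, y))
```

A new crack-meter reading falling outside the [lower, upper] band for its concurrent temperature would then trigger the monitoring alarm.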
Procedia PDF Downloads 292
12378 Reduction Behavior of Medium Grade Manganese Ore from Karangnunggal during a Sintering Process in Methane Gas
Authors: H. Aripin, I. Made Joni, Edvin Priatna, Nundang Busaeri, Svilen Sabchevski
Abstract:
In this investigation, manganese has been produced from medium grade manganese ore from the Karangnunggal mine (West Java, Indonesia). The ore was ground in a jar mill to pass through a 150-mesh sieve. The effect of holding the ore at a temperature of 1200 °C in methane gas on its structural properties has been studied. The material's properties have been characterized on the basis of experimental data obtained using X-ray fluorescence (XRF), X-ray diffraction (XRD), scanning electron microscopy (SEM), and Fourier transform infrared (FTIR) spectroscopy. It has been found that the ore contains MnO₂ as the main constituent, at about 46.80 wt.%. It can also be observed that the ore particles are agglomerated, forming dense grains with different textures and morphologies. The irregular-shaped grains with dark contrast, the large brighter grains, and the smaller grains with bright texture and smooth surfaces are associated with the presence of manganese, calcium, and quartz, respectively. From the XRD patterns, MnO₂ is reduced to hausmannite (Mn₃O₄), manganosite (MnO), and manganese carbide (Mn₇C₃). At 1200 °C, the holding time has no effect on the formation of crystals, and the crystalline phases remain almost unchanged over the time range from 15 to 90 minutes. An increase of the holding time up to 45 minutes during the sintering process leads to an increase of the MnO concentration, while at 90 minutes the concentration decreases. At longer holding times, the excess reaction between the methane gas and the manganese oxide in the ore increases carbon deposition, which blocks the particle surface and thereby hinders the reduction of the manganese oxide. The appearance of a C=O stretching mode in the FTIR spectrum is attributed to the interaction of the methane gas with the manganese oxide of the ore.
The intensity of this band increases with increasing holding time, indicating an increase of carbon deposition on the surface of the manganese oxide.
Keywords: manganese, medium grade manganese ore, structural properties, holding time, carbon deposition
Procedia PDF Downloads 155
12377 The Implementation of the Javanese Lettered-Manuscript Image Preprocessing Stage Model on the Batak Lettered-Manuscript Image
Authors: Anastasia Rita Widiarti, Agus Harjoko, Marsono, Sri Hartati
Abstract:
This paper presents the results of a study testing whether the image preprocessing model for Javanese character manuscripts, which has been widely applied, can also be applied to segment Batak character manuscripts. The process begins by converting the input image into a binary image. After the binary image is cleaned of noise, line segmentation using projection profiles is conducted. If the projection histogram is unclear, a smoothing step is applied before the line segment indexes are produced. For each line image produced, character segmentation within the line is then applied, taking into account the connectivity between the pixels making up the letters so that no characters are truncated. Testing the manuscript preprocessing system prototype on pieces of the Pustaka Batak Podani Ma Aji Mamisinon manuscript yielded accuracy values ranging from 65% to 87.68% with a confidence level of 95%. These accuracy values indicate that the preprocessing model for Javanese character manuscript images can also be applied to images of Batak character manuscripts.
Keywords: connected component, preprocessing, manuscript image, projection profiles
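The binarize-then-project pipeline described above can be sketched on a tiny synthetic page. The page dimensions, threshold, and smoothing window below are assumptions for illustration (in practice an adaptive threshold such as Otsu's method and a wider smoothing window would be used).

```python
import numpy as np

# Synthetic grayscale "manuscript": a white page with two dark text lines
page = np.full((60, 40), 255, dtype=np.uint8)
page[10:18, 5:35] = 30   # first text line
page[35:44, 5:35] = 40   # second text line

# 1) Binarize with a fixed threshold (illustrative; Otsu in practice)
binary = (page < 128).astype(np.uint8)   # 1 = ink, 0 = background

# 2) Horizontal projection profile: count of ink pixels per row
profile = binary.sum(axis=1)

# 3) Smooth the profile, then segment lines where it is non-zero
smoothed = np.convolve(profile, np.ones(3) / 3, mode="same")
ink = (smoothed > 0).astype(np.int8)

# Find runs of consecutive ink rows -> one (top, bottom) pair per text line
edges = np.flatnonzero(np.diff(np.concatenate(([0], ink, [0]))))
lines = list(zip(edges[::2], edges[1::2] - 1))
print(lines)
```

Character segmentation within each detected line would then repeat the same idea with a vertical projection profile, checking pixel connectivity so no character is split.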
Procedia PDF Downloads 400
12376 An Internet of Things-Based Weight Monitoring System for Honey
Authors: Zheng-Yan Ruan, Chien-Hao Wang, Hong-Jen Lin, Chien-Peng Huang, Ying-Hao Chen, En-Cheng Yang, Chwan-Lu Tseng, Joe-Air Jiang
Abstract:
Bees play a vital role in pollination. This paper focuses on the weighing of honey. Honey is usually stored in the comb in a hive. Bee farmers brush bees away from the comb, collect the honey, and weigh it afterward. However, such a process strongly disturbs the bees and can even lead to their death. This paper therefore presents an Internet of Things-based weight monitoring system that uses weight sensors to measure the weight of honey and simplifies the whole weighing procedure. To verify the system, the weight measured by the system was compared to standard calibration weights using a linear regression model. The R² of the regression model is 0.9788, which suggests that the weighing system is highly reliable and can be applied to obtain the actual weight of honey. In the future, the honey weight data can be used to find the relationship between honey production and different ecological parameters, such as the bees' foraging behavior and weather conditions. The findings are expected to serve as critical information for improving honey production.
Keywords: internet of things, weight, honey, bee
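The calibration-by-linear-regression step described above can be sketched as follows. The standard weights and raw sensor readings below are invented illustrative values, not the paper's measurements (the paper reports R² = 0.9788 for its own calibration).

```python
import numpy as np

# Hypothetical calibration run: standard weights (g) vs raw sensor readings
standards = np.array([0, 100, 200, 500, 1000, 2000, 5000], dtype=float)
readings = np.array([3, 105, 201, 507, 1012, 2018, 5031], dtype=float)

# Fit reading = a * weight + b, then invert the line to convert readings to grams
a, b = np.polyfit(standards, readings, 1)
predicted = a * standards + b

# Coefficient of determination R^2 for the calibration line
ss_res = np.sum((readings - predicted) ** 2)
ss_tot = np.sum((readings - readings.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

def to_grams(raw):
    # Convert a raw sensor reading back to a calibrated weight in grams
    return (raw - b) / a

print(round(r2, 4))
```

Once calibrated, `to_grams` would be applied to each live sensor reading before the weight is logged by the IoT system.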
Procedia PDF Downloads 459
12375 Effects of Convective Momentum Transport on the Cyclones Intensity: A Case Study
Authors: José Davi Oliveira De Moura, Chou Sin Chan
Abstract:
In this study, the effect of convective momentum transport (CMT) on the life cycle and organization of cyclone systems is analyzed. A case of strong precipitation in southeastern Brazil was simulated using the Eta model with two kinds of convective parameterization: Kain-Fritsch without CMT and Kain-Fritsch with CMT. Reanalysis data from CFSR were used for comparison with the Eta model simulations. Wind, mean sea level pressure, rain, and temperature are included in the analysis. The rain was evaluated with the Equitable Threat Score (ETS) and the Bias index, and the simulations were compared with each other to detect the influence of CMT on the systems. The results show that the CMT process decreases the intensity of mesocyclones (higher pressure values at their nuclei) and changes the position and production of rain. The decrease in mesocyclone intensity should be caused by the redistribution of momentum between lower and upper levels. Rain production and distribution were altered because the displacement of the larger-scale systems was changed. In addition, the inclusion of the CMT process is very important for improving the simulation of the lifetime of meteorological systems.
Keywords: convection, Kain-Fritsch, momentum, parameterization
Procedia PDF Downloads 325
12374 Methodology of Automation and Supervisory Control and Data Acquisition for Restructuring Industrial Systems
Authors: Lakhoua Najeh
Abstract:
Introduction: In most situations, an existing industrial system, conditioned by its history, its culture, and its context, faces difficulty restructuring itself in an organizational and technological environment in perpetual evolution. This is why every restructuring operation first requires a diagnosis based on a functional analysis. After a presentation of the functionality of a supervisory system for complex processes, we present the concepts of industrial automation and supervisory control and data acquisition (SCADA). Methods: This global analysis exploits the various available documents on the one hand and, on the other hand, takes into consideration various testimonies gathered through investigations, interviews, and collective workshops; it is also based on observations made during site visits and on specific operations. Exploiting this diagnosis then enables us to elaborate the restructuring project. Starting from the system analysis for the restructuring of industrial systems, and after a technical diagnosis based on visits, an analysis of the various technical and management documents, and targeted interviews, a detailed breakdown of the various levels of analysis was established according to a general methodology. Results: Thanks to its participative and systemic character, and drawing on extensive consultation of both human and documentary resources, the methodology adopted to contribute to the restructuring of industrial systems led to the proposal of various innovative actions. These actions fall within a TQM approach requiring the quantification of applicable parameters and a treatment that valorizes information. The new management environment will enable us to institute an information and communication system with the possibility of migration toward an ERP system.
Conclusion: Technological advancements in process monitoring, control, and industrial automation over the past decades have contributed greatly to improving the productivity of virtually all industrial systems throughout the world. This paper attempts to identify the principal characteristics of process monitoring, control, and industrial automation in order to provide tools that help in the decision-making process.
Keywords: automation, supervision, SCADA, TQM
Procedia PDF Downloads 177
12373 Effect of Plasma Radiation on Keratinocyte Cells Involved in the Wound Healing Process
Authors: B. Fazekas, I. Korolov, K. Kutasi
Abstract:
Plasma medicine, which involves the use of gas discharge plasmas for medical applications, is a rapidly growing research field. The use of non-thermal atmospheric pressure plasmas in dermatology to assist tissue regeneration by improving the healing of infected and/or chronic wounds is a promising application. It is believed that plasma can activate the cells involved in wound closure. Non-thermal atmospheric plasmas are rich in chemically active species (such as O and N atoms and O₂(a) molecules) and radiative species such as excited NO, N₂⁺, and N₂ molecules, which radiate predominantly in the 200-500 nm spectral range. In order to understand the effect of plasma species, both chemically active and radiative, on the wound healing process, the interaction of physical plasma with human skin cells must be studied. To clarify the effect of plasma radiation on the wound healing process, we treated keratinocyte cells, one of the main cell types in the human epidermis, covered with a layer of phosphate-buffered saline (PBS), with a low-power atmospheric pressure plasma. For the generation of such plasma, we applied a plasma needle, in which the plasma is ignited at the tip of the needle in flowing helium gas in contact with the ambient air. To isolate the effect of the radiation, we used a plasma needle configuration in which the plasma species, the chemically active radicals and charged species, could not reach the treated cells, only the radiation. For comparison purposes, we also irradiated the cells using a UV-B light source (an FS20 lamp) with doses of 20 and 40 mJ cm⁻² at 312 nm. After treatment, the viability and proliferation of the cells were examined. The proliferation of the cells was monitored with a real-time monitoring system called xCELLigence. The results indicated that the 20 mJ cm⁻² dose did not affect cell viability, whereas the 40 mJ cm⁻² dose resulted in a decrease in cell viability.
The results showed that the plasma radiation had no quantifiable effect on cell proliferation compared to the non-treated cells.
Keywords: UV radiation, non-equilibrium gas discharges (non-thermal plasmas), plasma emission, keratinocyte cells
Procedia PDF Downloads 602
12372 Ultrasonic Agglomeration of Protein Matrices and Its Effect on Thermophysical, Macro- and Microstructural Properties
Authors: Daniela Rivera-Tobar, Mario Perez-Won, Roberto Lemus-Mondaca, Gipsy Tabilo-Munizaga
Abstract:
Different dietary trends worldwide seek foods with anti-inflammatory properties that are rich in antioxidants, proteins, and unsaturated fatty acids, leading to better metabolic, intestinal, mental, and cardiac health. In this sense, food matrices with high protein content based on macro- and microalgae are an excellent alternative for meeting the new needs of consumers. An emerging and environmentally friendly technology for producing protein matrices is ultrasonic agglomeration. It consists of the formation of permanent bonds between particles, improving the agglomeration of the matrix compared to conventionally agglomerated (compressed) products. Among the advantages of this process are the reduction of nutrient loss and the avoidance of binding agents. The objective of this research was to optimize the ultrasonic agglomeration process in matrices composed of Spirulina (Arthrospira platensis) powder and cochayuyo (Durvillaea antarctica) flour, with Young's modulus as the response variable and the process conditions as the independent variables: ultrasonic amplitude (70, 80, and 90%), agglomeration time (20, 25, and 30 seconds), and number of cycles (3, 4, and 5). The process was evaluated using a central composite design and analyzed using response surface methodology. In addition, the effects of agglomeration on thermophysical and microstructural properties were evaluated. It was determined that ultrasonic compression at 80 and 90% amplitude caused conformational changes according to Fourier transform infrared (FTIR) spectroscopy analysis; the best condition with respect to the observed microstructure images (SEM) and the differential scanning calorimetry (DSC) analysis was 90% amplitude with 25 and 30 seconds and 3 and 4 cycles of ultrasound.
In conclusion, the agglomerated matrices present good macro- and microstructural properties, which would allow the design of food systems with better nutritional and functional properties.
Keywords: ultrasonic agglomeration, physical properties of food, protein matrices, macro and microalgae
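The central-composite-design-plus-response-surface step described above can be sketched as follows. The factor coding, the synthetic "Young's modulus" response (with an assumed optimum), and the noise level are all invented for illustration; only the factor ranges (70-90% amplitude, 20-30 s, 3-5 cycles) come from the abstract, and the design below is a simple full factorial stand-in for the actual central composite design.

```python
import numpy as np

rng = np.random.default_rng(3)
# Coded factor levels (-1, 0, +1) for amplitude, time, and cycles,
# mapped onto the reported ranges (70-90 %, 20-30 s, 3-5 cycles)
levels = np.array([-1.0, 0.0, 1.0])
grid = np.array([(a, b, c) for a in levels for b in levels for c in levels])

# Hypothetical response: Young's modulus with a maximum near (1, 0.5, 0)
def true_response(x):
    a, b, c = x
    return 10 - (a - 1) ** 2 - (b - 0.5) ** 2 - c ** 2

y = np.array([true_response(x) for x in grid]) + 0.05 * rng.standard_normal(len(grid))

# Second-order response-surface model: intercept, linear, square, interaction
def design(X):
    a, b, c = X.T
    return np.column_stack([np.ones(len(X)), a, b, c,
                            a * a, b * b, c * c, a * b, a * c, b * c])

beta, *_ = np.linalg.lstsq(design(grid), y, rcond=None)

# Evaluate the fitted surface on a finer grid and report the predicted optimum
fine = np.array([(a, b, c) for a in np.linspace(-1, 1, 21)
                            for b in np.linspace(-1, 1, 21)
                            for c in np.linspace(-1, 1, 21)])
best = fine[np.argmax(design(fine) @ beta)]
print(best)
```

Decoding `best` back to physical units would give the amplitude, time, and cycle count predicted to maximize Young's modulus.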
Procedia PDF Downloads 61
12371 Optimizing Oil Production through 30-Inch Pipeline in Abu-Attifel Field
Authors: Ahmed Belgasem, Walid Ben Hussin, Emad Krekshi, Jamal Hashad
Abstract:
Waxy crude oil, characterized by its high paraffin wax content, poses significant challenges in the oil and gas industry due to its increased viscosity and semi-solid state at reduced temperatures. The wax formation process, which includes precipitation, crystallization, and deposition, becomes problematic when crude oil temperatures fall below the wax appearance temperature (WAT), or cloud point. Addressing these issues, this paper introduces a technical solution designed to mitigate wax appearance and enhance the oil production process in the Abu-Attifel field via a 30-inch crude oil pipeline. A comprehensive flow assurance study validates the feasibility and performance of this solution across various production rates, temperatures, and operational scenarios. The study's findings indicate that maintaining the crude oil's temperature above a minimum threshold of 63 °C is achievable through the strategic placement of two heating stations along the pipeline route. This approach effectively prevents wax deposition, gelling, and the resulting mobility complications, thereby bolstering the overall efficiency, reliability, safety, and economic viability of the production process. Moreover, this solution significantly curtails the environmental repercussions traditionally associated with wax deposition, which can accumulate up to 7,500 kg. The research methodology involves a comprehensive flow assurance study considering various production rates, temperatures, and operational scenarios; it includes crude oil analysis to determine the wax appearance temperature (WAT), as well as the evaluation and comparison of operating options for the heating stations. The study's findings indicate that the proposed solution effectively prevents wax deposition, gelling, and the resulting mobility complications.
By maintaining the crude oil's temperature above the specified threshold, the solution improves the overall efficiency, reliability, safety, and economic viability of the oil production process. Additionally, the solution contributes to reducing the environmental repercussions associated with wax deposition. In conclusion, the research presents a technical solution that optimizes oil production in the Abu-Attifel field by addressing wax formation problems through the strategic placement of two heating stations. The solution effectively prevents wax deposition, improves overall operational efficiency, and contributes to environmental sustainability. Further research is suggested to validate the field data and explore a cost-benefit analysis.
Keywords: oil production, wax deposition, solar cells, heating stations
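The role of the two heating stations in keeping the oil above 63 °C can be sketched with a simple steady-state cooling model. All numbers below (inlet temperature, ambient temperature, pipeline length, station positions, and the characteristic cooling length) are illustrative assumptions, not the flow assurance study's data; the study itself used detailed simulations rather than this one-line exponential model.

```python
import math

# Illustrative parameters: inlet and ambient temperatures, the 63 degC
# threshold from the study, and an assumed exponential cooling length
T_in, T_amb, T_min = 90.0, 30.0, 63.0   # degC
L_char = 160.0                           # km, assumed cooling length
pipeline_km = 180.0                      # assumed line length
stations = [0.0, 90.0]                   # heating stations reheat oil to T_in

def temperature(x):
    # Steady-state exponential cooling from the most recent heating station
    last = max(s for s in stations if s <= x)
    return T_amb + (T_in - T_amb) * math.exp(-(x - last) / L_char)

# Temperature profile sampled every 10 km along the line
profile = [temperature(x) for x in range(0, int(pipeline_km) + 1, 10)]
print(min(profile))
```

With only one station at km 0, the same model would drop below the wax appearance threshold before the outlet, which is the motivation for the second station.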
Procedia PDF Downloads 73
12370 Vibration-Based Data-Driven Model for Road Health Monitoring
Authors: Guru Prakash, Revanth Dugalam
Abstract:
A road's condition often deteriorates due to harsh loading, such as overloading by trucks, and severe environmental conditions, such as heavy rain, snow load, and cyclic loading. In the absence of proper maintenance planning, this results in potholes, wide cracks, bumps, and increased road roughness. In this paper, a data-driven model is developed to detect such damage using vibration and image signals. The key idea of the proposed methodology is that road anomalies manifest in these signals and can be detected by training a machine learning algorithm. The use of various machine learning techniques, such as the support vector machine and the random forest method, will be investigated. The proposed model will first be trained and tested with artificially simulated data, and the model architecture will be finalized by comparing the accuracies of the various models. Once the model is fixed, a field study will be performed and data will be collected. The field data will be used to validate the proposed model and to predict the future health condition of the road. The proposed model will help to automate the road condition monitoring, repair cost estimation, and maintenance planning processes.
Keywords: SVM, data-driven, road health monitoring, pothole
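The train-on-simulated-vibration-data step described above can be sketched as follows. The vibration model (baseline noise plus a short transient for a pothole), the RMS/kurtosis features, and the nearest-centroid classifier are illustrative assumptions; the nearest-centroid rule is a lightweight stand-in for the SVM and random forest models the paper investigates.

```python
import numpy as np

rng = np.random.default_rng(7)
fs, dur = 100, 2.0               # assumed accelerometer rate (Hz) and segment length (s)
t = np.arange(0, dur, 1 / fs)

def vibration_segment(pothole: bool):
    # Baseline ride vibration; a pothole adds a short high-amplitude transient
    sig = 0.2 * rng.standard_normal(t.size)
    if pothole:
        i = rng.integers(20, t.size - 20)
        sig[i:i + 5] += 3.0 * np.hanning(5)
    return sig

def features(sig):
    # RMS and kurtosis: spiky pothole segments have much higher kurtosis
    z = (sig - sig.mean()) / sig.std()
    return np.array([np.sqrt(np.mean(sig**2)), np.mean(z**4)])

# 50 smooth-road segments and 50 pothole segments, simulated
X = np.array([features(vibration_segment(k >= 50)) for k in range(100)])
y = np.array([0] * 50 + [1] * 50)

# Nearest-centroid classifier (stand-in for SVM / random forest)
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(np.linalg.norm(X[:, None] - centroids, axis=2), axis=1)
print((pred == y).mean())
```

In the full pipeline, this simulated-data stage selects the model architecture before the field data validation step.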
Procedia PDF Downloads 86