Search results for: process corner
12443 The Influence of Students’ Learning Factor and Parents’ Involvement in Their Learning and Suspension: The Application of Big Data Analysis of Internet of Things Technology
Authors: Chih Ming Kung
Abstract:
This study is an empirical study examining the enrollment rate and dropout rate of students from the perspectives of students’ learning, parents’ involvement, and the learning process. Methods: An investigation was conducted using data collected from the entry website of the Internet of Things (IoT), parents’ participation, and the installation pattern of the exit-poll website. Results: This study discovered that, in the aspect of the degree of involvement, the attractiveness of courses, self-performance, and departmental loyalty exert significant influences on the four aspects of learning benefits: psychological, physical, social, and educational. Parents’ participation also exerts a significant influence on the learning benefits. A suitable cloud-based tool was designed to collect the dynamic big data of students’ learning process. Conclusion: This research’s results can serve as valuable references for the government when making and promoting related policies, with a more macro view and consideration. They are also expected to be of value to schools in the practical study of enrollment promotion.
Keywords: students’ learning factor, parents’ involvement, involvement, technology
Procedia PDF Downloads 147
12442 Fengqiao: An Ongoing Experiment with ‘Urban Memory’ Theory in an Ancient Town and Its Design Experience
Authors: Yibei Ye, Lei Xu, Zhenyu Cao
Abstract:
An ancient town is a unique carrier of urban culture, maintaining the core culture of a region and continuing the urban context. Fengqiao, a nearly 2000-year-old town, was on the brink of dilapidation in the past few decades. The town faced such problems as poor construction quality, environmental degeneration, inadequate open space, and fading cultural character and industrial vitality. Therefore, the research upholds the principle of ‘organic renewal’ and puts forward three practical renewal strategies: ‘Repair Old as Ever’, ‘Activate Function’, and ‘Fill in with The New’. As a participant in the renewal design, the author takes ‘keep the memory of the history and see the development of the present’ as the goal of the design and regards the process of town renewal as an experimental venue for realizing this purpose. The research sums up the innovations in the design process and the engineering progress of the past two years, and identifies the innovative experiments and the effects of their implementation at the methodological level of organic-renewal design in Fengqiao ancient town. From here, we can also observe the very characteristic development trend that China presents in the design practice of organic renewal in ancient towns.
Keywords: characteristic town, Fengqiao, organic renewal, urban memory
Procedia PDF Downloads 162
12441 Digital Technology Relevance in Archival and Digitising Practices in the Republic of South Africa
Authors: Tashinga Matindike
Abstract:
By means of definition, digital artworks encompass an array of artistic productions that are expressed in a technological form as an essential part of a creative process. Examples include illustrations, photos, videos, sculptures, and installations. Within the context of the visual arts, the process of repatriation involves the return of once-appropriated goods. Archiving denotes the preservation of a commodity for storage purposes in order to nurture its continuity. The aforementioned definitions form the foundation of the academic framework and the premise of the argument outlined in this paper. This paper aims to define, discuss, and decipher the complexities involved in digitising artworks, whilst explaining the benefits of the process, particularly within the South African context, which is rich in tangible and intangible traditional cultural material, objects, and performances. With the internet having been introduced to the African continent in the early 1990s, this new form of technology initiated a high degree of efficiency, which also resulted in the progressive transformation of computer-generated visual output. Subsequently, this exerted a revolutionary influence on the manner in which technological software was developed and utilised in art-making. Digital technology and the digitisation of creative processes then opened up new avenues of collating and recording information. One of the first visual artists to make use of digital technology software in his creative productions was the United States-based artist John Whitney. His inventive work contributed greatly to the onset and development of digital animation. Comparable in technique and originality, South African contemporary visual artists who make digital artworks, both locally and internationally, include David Goldblatt, Katherine Bull, Fritha Langerman, David Masoga, Zinhle Sethebe, Alicia Mcfadzean, Ivan Van Der Walt, Siobhan Twomey, and Fhatuwani Mukheli.
In conclusion, the main objective of this paper is to address the following questions: In which ways has the South African community of visual artists made use of and benefited from technology, in its digital form, as a means to further advance creativity? What are the positive changes in art production in South Africa since the onset and use of digital technological software? How has digitisation changed the manner in which we record, interpret, and archive both written and visual information? What is the role of South African art institutions in the development of digital technology and its use in the field of visual art? What role does digitisation play in the process of the repatriation of artworks and artefacts? The methodology of this paper takes a multifaceted form, including the analysis of data attained by means of qualitative and quantitative approaches.
Keywords: digital art, digitisation, technology, archiving, transformation and repatriation
Procedia PDF Downloads 54
12440 Analyzing the Impact of the COVID-19 Pandemic on Clinicians’ Perceptions of Resuscitation and Escalation Decision-Making Processes: Cross-Sectional Survey of Hospital Clinicians in the United Kingdom
Authors: Michelle Hartanto, Risheka Suthantirakumar
Abstract:
Introduction Staff redeployment, increased numbers of acutely unwell patients requiring resuscitation decision-making conversations, visiting restrictions, and varying guidance regarding resuscitation for patients with COVID-19 disrupted clinicians’ management of resuscitation and escalation decision-making processes. While it was generally accepted that the COVID-19 pandemic disturbed numerous aspects of the Recommended Summary Plan for Emergency Care and Treatment (ReSPECT) process in the United Kingdom, a process which establishes a patient’s CPR status and treatment escalation plans, the impact of the pandemic on clinicians’ attitudes towards these resuscitation and decision-making conversations was unknown. This was the first study to examine the impact of the COVID-19 pandemic on clinicians’ knowledge, skills, and attitudes towards the ReSPECT process. Methods A cross-sectional survey of clinicians at one acute teaching hospital in the UK was conducted. A questionnaire with a defined five-point Likert scale was distributed and clinicians were asked to recall their pre-pandemic views on ReSPECT and report their current views at the time of survey distribution (May 2020, end of the first COVID-19 wave in the UK). Responses were received from 171 clinicians, and self-reported views before and during the pandemic were compared. Results Clinicians reported they found managing ReSPECT conversations more challenging during the pandemic, especially when conducted over the telephone with relatives, and they experienced an increase in negative emotions before, during, and after conducting ReSPECT conversations. 
Our findings identified that, due to the pandemic, clinicians now need training and support in conducting resuscitation and escalation decision-making conversations over the telephone with relatives and in managing these processes.
Keywords: cardiopulmonary resuscitation, COVID-19 pandemic, DNACPR discussion, education, recommended summary plan for emergency care and treatment, resuscitation order
Procedia PDF Downloads 103
12439 Flocculation on the Treatment of Olive Oil Mill Wastewater: Pre-Treatment
Authors: G. Hodaifa, J. A. Páez, C. Agabo, E. Ramos, J. C. Gutiérrez, A. Rosal
Abstract:
Currently, the continuous two-phase decanter process used for olive oil production is the most widespread internationally. The wastewaters generated by this industry (OMW) are a real environmental problem because of their high organic load. Among the treatments proposed for these wastewaters, the advanced oxidation technologies (Fenton process, ozone, photo-Fenton, etc.) are the most favourable. The direct application of these processes is somewhat expensive. Therefore, the application of a previous stage based on a flocculation-sedimentation operation is of high importance. In this research, five commercial flocculants (three cationic and two anionic) have been used to achieve the separation of phases (clarified liquid-sludge). For each flocculant, different concentrations (0-1000 mg/L) have been studied. In these experiments, the sludge volume formed over time and the final water quality were determined. The final removal percentages of total phenols (11.3-25.1%), COD (5.6-20.4%), total carbon (2.3-26.5%), total organic carbon (1.50-23.8%), total nitrogen (1.45-24.8%), and turbidity (27.9-61.4%) were obtained. Also, the variation in the electric conductivity reduction percentage (1-8%) was determined. Finally, the best flocculants, with the highest removal percentages, were identified (QG2001 and Flocudex CS49).
Keywords: flocculants, flocculation, olive oil mill wastewater, water quality
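The removal percentages reported above follow the standard removal-efficiency definition; a minimal sketch, where the concentration figures are illustrative rather than measurements from the paper:

```python
def removal_percentage(c_in, c_out):
    """Removal efficiency of a pollutant indicator (COD, phenols, turbidity, ...)."""
    return 100.0 * (c_in - c_out) / c_in

# Illustrative values only: a flocculant lowering COD from 50 g/L to 40 g/L
print(round(removal_percentage(50.0, 40.0), 1))  # 20.0, within the reported 5.6-20.4% COD range
```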
Procedia PDF Downloads 541
12438 Historical Development of Negative Emotive Intensifiers in Hungarian
Authors: Martina Katalin Szabó, Bernadett Lipóczi, Csenge Guba, István Uveges
Abstract:
In this study, an exhaustive analysis was carried out of the historical development of negative emotive intensifiers in the Hungarian language via NLP methods. Intensifiers are linguistic elements which modify or reinforce a variable character in the lexical unit they apply to. Therefore, intensifiers appear with other lexical items, such as adverbs, adjectives, and verbs, and infrequently with nouns. Due to the complexity of this phenomenon (a set of sociolinguistic, semantic, and historical aspects), many lexical items can operate as intensifiers. The group of intensifiers is admittedly one of the most rapidly changing elements in the language. From a linguistic point of view, particularly interesting is a special group of intensifiers, the so-called negative emotive intensifiers, which, on their own, without context, have semantic content that can be associated with negative emotion, but in particular cases may function as intensifiers (e.g. borzasztóan jó ’awfully good’, which means ’excellent’). Despite their special semantic features, negative emotive intensifiers are scarcely examined in the literature on the basis of large historical corpora via NLP methods. In order to become better acquainted with trends over time concerning these intensifiers, we exhaustively analysed a specific historical corpus, namely the Magyar Történeti Szövegtár (Hungarian Historical Corpus). This corpus (containing 3 million text words) is a collection of texts of various genres and styles, produced between 1772 and 2010. Since the corpus consists of raw texts and does not contain any additional information about the language features of the data (such as stemming or morphological analysis), a large amount of manual work was required to process the data. Thus, based on a lexicon of negative emotive intensifiers compiled in a previous phase of the research, every occurrence of each intensifier was queried, and the results were stored in a separate data frame.
Then, basic linguistic processing (POS-tagging, lemmatization, etc.) was carried out automatically with the ‘magyarlanc’ NLP toolkit. Finally, the frequency and collocation features of all the negative emotive words were automatically analyzed in the corpus. Outcomes of the research revealed in detail how these words have proceeded through grammaticalization over time, i.e., they change from lexical elements to grammatical ones and slowly go through a delexicalization process (their negative content diminishes over time). What is more, it was also pointed out which negative emotive intensifiers are at the same stage in this process in the same time period. Taking a closer look at the different domains of the analysed corpus, it also became clear that during this process the importance of the pragmatic role increases: the newer use expresses the speaker's subjective, evaluative opinion at a certain level.
Keywords: historical corpus analysis, historical linguistics, negative emotive intensifiers, semantic changes over time
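The frequency and collocation step can be sketched as follows; the mini-corpus and lexicon entries below are invented stand-ins for the Magyar Történeti Szövegtár data and the lexicon compiled in the earlier research phase:

```python
from collections import Counter

# Hypothetical intensifier lexicon and tokenized mini-corpus; the study used
# the Magyar Történeti Szövegtár and the magyarlanc toolkit instead.
lexicon = {"borzasztóan", "rettenetesen", "szörnyen"}
tokens = "ez borzasztóan jó volt és rettenetesen izgalmas borzasztóan".split()

# Count every occurrence of each intensifier ...
freq = Counter(t for t in tokens if t in lexicon)

# ... and record the word immediately following it (a crude collocation profile).
collocates = Counter(
    (t, tokens[i + 1]) for i, t in enumerate(tokens[:-1]) if t in lexicon
)
print(freq["borzasztóan"])                # 2
print(collocates[("borzasztóan", "jó")])  # 1
```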
Procedia PDF Downloads 234
12437 Analysing Modern City Heritage through Modernization Transformation: A Case of Wuhan, China
Authors: Ziwei Guo, Liangping Hong, Zhiguo Ye
Abstract:
The exogenous modernization process in China and other late-coming countries did not result from a gradual growth of their own modernity features but was a conscious response to external challenges. Under this context, it was equally important for Chinese cities to make themselves ‘Chinese’ as well as ‘modern’. Wuhan was the first inland treaty port opened in the late Qing Dynasty. In the following one hundred years, Wuhan transformed from a feudal town into a modern industrial city. It is a good example with which to illustrate urban construction and cultural heritage through the process and impact of social transformation. An overall perspective on transformation will contribute to developing the city’s uniqueness and enhancing its inclusive development. The study chooses the history of Wuhan from 1861 to 1957 as the study period. The whole transformation process is divided into four typical periods based on key historical events, and the paper analyzes the changes in urban structure and construction activities in each period. Then, many examples are used to compare the features of Wuhan's modern city heritage across the four periods. In this way, three characteristics of Wuhan's modern city heritage are summarized. The paper finds that globalization and localization worked together to shape the urban physical space environment. For Wuhan, social transformation had a profound and comprehensive impact on urban construction, which can be analyzed in the aspects of main construction, architectural style, location, and actors. Moreover, the three towns of Wuhan have disparate cityscapes, reflected in the varied heritages and architectural features of the different transformation periods. Lastly, the protection regulations and conservation planning of heritage in Wuhan are discussed, and suggestions about the conservation of Wuhan's modern heritage are drawn.
The implications of the study are to provide a new perspective on modern city heritage for cities like Wuhan; future local planning systems and heritage conservation policies can take into consideration the ‘Modern Cultural Transformation Route’ presented in this paper.
Keywords: modern city heritage, transformation, identity, Wuhan
Procedia PDF Downloads 134
12436 Treatment of Low-Grade Iron Ore Using Two Stage Wet High-Intensity Magnetic Separation Technique
Authors: Moses C. Siame, Kazutoshi Haga, Atsushi Shibayama
Abstract:
This study investigates the removal of silica, alumina, and phosphorus as impurities from Sanje iron ore using wet high-intensity magnetic separation (WHIMS). Sanje iron ore contains low-grade hematite ore found in the Nampundwe area of Zambia, from which iron is to be used as the feed in the steelmaking process. Chemical composition analysis using an X-ray fluorescence spectrometer showed that Sanje low-grade ore contains 48.90 mass% hematite (Fe2O3), corresponding to an iron grade of 34.18 mass%. The ore also contains silica (SiO2) and alumina (Al2O3) of 31.10 mass% and 7.65 mass%, respectively. Mineralogical analysis using an X-ray diffraction spectrometer showed hematite and silica as the major mineral components of the ore, while magnetite and alumina exist as minor mineral components. Mineral particle distribution analysis was done using a scanning electron microscope with X-ray energy dispersive spectrometry (SEM-EDS), and images showed that the average size of the alumina-silicate gangue particles is on the order of 100 μm and that they exist as iron-bearing interlocked particles. Magnetic separation was done using a series L model 4 Magnetic Separator. The effect of various magnetic separation parameters, such as magnetic flux density, particle size, and pulp density of the feed, was studied during the magnetic separation experiments. The ore, with an average particle size of 25 µm and a pulp density of 2.5%, was concentrated using a pulp flow of 7 L/min. The results showed that 10 T was the optimal magnetic flux density, which enhanced the recovery of 93.08% of the iron at a grade of 53.22 mass%. Gangue mineral particles containing 12 mass% silica and 3.94 mass% alumina remained in the concentrate; therefore, the concentrate was further treated in a second WHIMS stage using the same parameters as the first stage. The second stage recovered 83.41% of the iron at a grade of 67.07 mass%. Silica was reduced to 2.14 mass% and alumina to 1.30 mass%.
Accordingly, phosphorus was also reduced to 0.02 mass%. Therefore, the two-stage magnetic separation process was established using these results.
Keywords: Sanje iron ore, magnetic separation, silica, alumina, recovery
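The recovery figures quoted above follow the usual metallurgical definition of recovery, i.e., the share of the iron in the feed that reports to the concentrate. A minimal sketch, with invented feed and concentrate masses (only the grades echo the paper):

```python
def fe_recovery(feed_mass, feed_grade, conc_mass, conc_grade):
    """Iron recovery: fraction of the iron in the feed that reports to the
    concentrate. Grades are mass fractions of Fe (e.g. 0.3418 for 34.18 mass%)."""
    return 100.0 * (conc_mass * conc_grade) / (feed_mass * feed_grade)

# Illustrative masses only (not measured values from the paper):
# 100 kg of feed at 34.18% Fe, of which 60 kg reports to concentrate at 53.22% Fe.
print(round(fe_recovery(100, 0.3418, 60, 0.5322), 2))
```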
Procedia PDF Downloads 260
12435 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Current real-estate value estimation, difficult for laymen, usually is performed by specialists. This paper presents an automated estimation process based on big data and machine-learning technology that calculates the influence of building conditions on real-estate price measurement. The present study analyzed actual building sales sample data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Following that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for the derivation of machine-learning prototypes. The prediction model is formulated by reference to previous examples. When new examples are applied, it analyses and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
Keywords: apartment complex, big data, life-cycle building value analysis, machine learning
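The weighted-value idea behind the prediction model can be sketched with a plain least-squares fit; the study itself used RapidMiner Studio, and the features and figures below are invented for illustration:

```python
# Fit price ≈ w0 + w1*floor_area + w2*building_age on toy data by solving
# the 3x3 normal equations with Gaussian elimination (stdlib only).

def fit_ols(rows, y):
    X = [[1.0] + list(r) for r in rows]          # prepend intercept column
    n, p = len(X), len(X[0])
    # Normal equations: (X^T X) w = X^T y
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)] for i in range(p)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    # Gaussian elimination with partial pivoting
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * p
    for i in reversed(range(p)):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, p))) / A[i][i]
    return w

# (floor_area_m2, age_years) -> price; exact relation price = 100 + 5*area - 2*age
data = [(50, 10), (80, 5), (60, 20), (100, 2)]
prices = [100 + 5 * a - 2 * g for a, g in data]
w0, w_area, w_age = fit_ols(data, prices)
print(round(w_area, 2), round(w_age, 2))  # weights recovered: 5.0 -2.0
```

The fitted weights play the role of the "weighted values" the abstract mentions: the larger a factor's weight, the more it drives the predicted price.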
Procedia PDF Downloads 375
12434 Rational Allocation of Resources in Water Infrastructure Development Projects
Authors: M. Macchiaroli, V. Pellecchia, L. Dolores
Abstract:
Within any European and world model of management of the integrated water service (regulated in Italy only since 2012 by a national authority, ARERA), a significant part is covered by the development of assets in terms of hydraulic networks and wastewater collection networks, including all their related building works. The process of selecting the investments to be made starts from a preventive analysis of the critical issues (water losses, unserved areas, low service standards, etc.) that occur in the territory managed by the Operator. Through the Program of Interventions (Provision by ARERA n. 580/2019/R/idr), the Operator programs the projects that can meet the emerged needs in order to improve water service levels. This phase (analyzed and solved by the author in a work published in 2019) involves the use of evaluation techniques (cost-benefit analysis, multi-criteria and multi-objective techniques, neural networks, etc.) useful in selecting the most appropriate design answers to the different criticalities. However, at this point, the problem of establishing the time priorities between the various works deemed necessary remains open. That is, it is necessary to hierarchize the investments. In this decision-making moment, the interests of the private Operator, which favors investments capable of generating high profitability, are often opposed to those of the public controller (ARERA), which favors investments with greater social impact. In support of the concertation between these two actors, the protocol set out in this research has been developed, based on the AHP and capable of borrowing from the programmatic documents an orientation path for the settlement of the conflict.
The protocol is applied to a case study in the Campania Region of Italy and has been professionally applied in the shared decision process between the manager and the local Authority.
Keywords: analytic hierarchy process, decision making, economic evaluation of projects, integrated water service
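A minimal sketch of the AHP step at the core of such a protocol, assuming a toy pairwise-comparison matrix over three candidate works; the geometric-mean approximation stands in for the full eigenvector method:

```python
# Toy AHP: three candidate works compared pairwise on one criterion.
# Priorities via the geometric-mean (approximate eigenvector) method.
import math

# pairwise[i][j] = how much more important work i is than work j
# (a reciprocal matrix on Saaty's 1-9 scale; values are illustrative).
pairwise = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

# Geometric mean of each row, normalized to sum to 1.
gm = [math.prod(row) ** (1 / len(row)) for row in pairwise]
total = sum(gm)
priorities = [g / total for g in gm]
print([round(p, 3) for p in priorities])  # highest-priority work first
```

In the protocol described above, weights like these would rank the works in time, reconciling the Operator's profitability criteria with ARERA's social-impact criteria.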
Procedia PDF Downloads 12712433 On the Use of Reliability Factors to Reduce Conflict between Information Sources in Dempster-Shafer Theory
Authors: A. Alem, Y. Dahmani, A. Hadjali, A. Boualem
Abstract:
Managing the problem of conflict, whether by using Dempster-Shafer theory or by applying the fusion process, has pushed researchers in recent years to find ways to make the best decisions, especially for information systems, vision, robotics, and wireless sensor networks. In this paper, we are interested in taking account of the conflict in the combination step and in managing it in such a way that it does not influence the decision step, even when the conflict comes from reliable sources. According to [1], conflict leads to erroneous decisions in cases where it is strong between sources of information; if the conflict exceeds the maximum of the belief mass functions, K > max_{i=1..n}(m_i(A)), then the decision becomes impossible. We demonstrate in this paper that multiplying the mass functions by reliability coefficients is a decreasing operation; it leads to the reduction of conflict and a good decision. We define the reliability coefficients accurately and multiply them by the mass functions of each information source to resolve the conflict and allow a decision whatever the degree of conflict. The evaluation of this technique is done through a use case: a comparison of the combination of sources with maximum conflict, without and with reliability coefficients.
Keywords: Dempster-Shafer theory, fusion process, conflict managing, reliability factors, decision
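The reliability-discounting idea can be sketched on a two-hypothesis frame; the masses and the reliability coefficient below are illustrative, not values from the paper's use case:

```python
from itertools import product

def discount(m, alpha):
    """Classical discounting: scale each mass by reliability alpha and move
    the remainder to total ignorance (the full frame, labelled 'AB')."""
    d = {fs: alpha * v for fs, v in m.items()}
    d["AB"] = d.get("AB", 0.0) + (1 - alpha)
    return d

def combine(m1, m2):
    """Dempster's rule on a two-hypothesis frame {A, B}; 'AB' = full frame."""
    inter = {("A", "A"): "A", ("B", "B"): "B", ("AB", "AB"): "AB",
             ("A", "AB"): "A", ("AB", "A"): "A", ("B", "AB"): "B",
             ("AB", "B"): "B", ("A", "B"): None, ("B", "A"): None}
    out, k = {}, 0.0
    for (f1, v1), (f2, v2) in product(m1.items(), m2.items()):
        fs = inter[(f1, f2)]
        if fs is None:
            k += v1 * v2            # mass falling on the empty set = conflict K
        else:
            out[fs] = out.get(fs, 0.0) + v1 * v2
    return {fs: v / (1 - k) for fs, v in out.items()}, k

# Two highly conflicting sources:
m1 = {"A": 0.9, "B": 0.1}
m2 = {"A": 0.1, "B": 0.9}
_, k_raw = combine(m1, m2)
_, k_disc = combine(discount(m1, 0.7), discount(m2, 0.7))
print(round(k_raw, 2), round(k_disc, 2))  # discounting lowers the conflict mass
```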
Procedia PDF Downloads 42712432 Design an Development of an Agorithm for Prioritizing the Test Cases Using Neural Network as Classifier
Authors: Amit Verma, Simranjeet Kaur, Sandeep Kaur
Abstract:
Test Case Prioritization (TCP) has gained widespread acceptance as it often results in good-quality software free from defects. Due to the increasing rate of faults in software, traditional prioritization techniques result in increased cost and time. The main challenge in TCP is the difficulty of manually validating the priorities of different test cases, due to the large size of test suites, and little emphasis has been placed on automating the TCP process. The objective of this paper is to detect the priorities of different test cases using an artificial neural network, which helps to predict the correct priorities with the help of the backpropagation algorithm. In our proposed work, one such method is implemented in which priorities are assigned to different test cases based on their frequency. After the priorities are assigned, the ANN predicts whether the correct priority is assigned to every test case, generating an interrupt when a wrong priority is assigned. In order to classify the different-priority test cases, classifiers are used. The proposed algorithm is very effective as it reduces complexity with robust efficiency and automates the process of prioritizing the test cases.
Keywords: test case prioritization, classification, artificial neural networks, TF-IDF
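The frequency-based priority assignment can be sketched as follows; the execution history is hypothetical, and the ANN validation step described in the abstract is not shown:

```python
from collections import Counter

# Hypothetical execution history: which test cases exposed faults in past runs.
# The approach above assigns initial priorities from test-case frequency,
# before the ANN checks each assignment.
history = ["t3", "t1", "t3", "t2", "t3", "t1"]

freq = Counter(history)
# Higher fault-exposing frequency -> higher priority (1 = run first).
ranked = sorted(freq, key=freq.get, reverse=True)
priority = {tc: rank for rank, tc in enumerate(ranked, start=1)}
print(priority)  # {'t3': 1, 't1': 2, 't2': 3}
```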
Procedia PDF Downloads 39812431 Understanding Team Member Autonomy and Team Collaboration: A Qualitative Study
Authors: Ayşen Bakioğlu, Gökçen Seyra Çakır
Abstract:
This study aims to explore how research assistants who work in project teams experience team member autonomy and how they reconcile team member autonomy with team collaboration. The study utilizes snowball sampling: 20 research assistants who work in the faculties of education at Marmara University and Yıldız Technical University were interviewed. The analysis of data involves content analysis; MAXQDA Plus 11, a qualitative data analysis software package, was used as the analysis tool. According to the findings of this study, emerging themes include team norm formation, team coordination management, the role of individual tasks in team collaboration, and leadership distribution. According to the findings, interviewees experience the team norm formation process in terms of processes that pertain to task fulfillment and processes that pertain to the regulation of team dynamics. The team norm formation process instills a sense of responsibility amongst individual team members. Apart from that, the interviewees’ responses indicate that the realization of the obligation to work in a team contributes to the team norm formation process. The participants indicate that individual expectations are taken into consideration during the coordination of the team. The supervisor of the project team also has a crucial role in maintaining team collaboration. Coordination problems arise when an individual team member does not relate his/her academic field to the research topic of the project team. The findings indicate that leadership distribution in the project teams involves two leadership processes: leadership distribution based on processes that focus on individual team members, and leadership distribution based on processes that focus on team interaction. Apart from that, individual tasks serve as a facilitator of collaboration amongst team members.
Interviewees also indicate that individual tasks facilitate the expression of individuality.
Keywords: project teams in higher education, research assistant teams, team collaboration, team member autonomy
Procedia PDF Downloads 363
12430 The Asymptotic Hole Shape in Long Pulse Laser Drilling: The Influence of Multiple Reflections
Authors: Torsten Hermanns, You Wang, Stefan Janssen, Markus Niessen, Christoph Schoeler, Ulrich Thombansen, Wolfgang Schulz
Abstract:
In long pulse laser drilling of metals, it can be demonstrated that the ablation shape approaches a so-called asymptotic shape, such that it changes only slightly, or not at all, with further irradiation. These findings are already known from ultra-short pulse (USP) ablation of dielectric and semiconducting materials. The explanation for the occurrence of an asymptotic shape in long pulse drilling of metals is identified, and a model for the description of the asymptotic hole shape is numerically implemented, tested, and clearly confirmed by comparison with experimental data. The model assumes a robust process, in the sense that the characteristics of the melt flow inside the arising melt film do not change qualitatively when the laser or processing parameters are changed. Only robust processes are technically controllable and thus of industrial interest. The condition for a robust process is identified as a threshold for the mass flow density of the assist gas at the hole entrance, which has to be exceeded. Within a robust process regime, the melt flow characteristics can be captured by only one model parameter, namely the intensity threshold. In analogy to USP ablation (where it has long been known that the resulting hole shape results from a threshold for the absorbed laser fluence), it is demonstrated that in the case of robust long pulse ablation the asymptotic shape forms in such a way that, along the whole contour, the absorbed heat flux density is equal to the intensity threshold. The intensity threshold depends on the specific material and radiation properties and has to be calibrated by one reference experiment. The model is implemented in a numerical simulation called AsymptoticDrill, which requires so few resources that it can run on common desktop PCs, laptops, or even smart devices.
Resulting hole shapes can be calculated within seconds, which is a clear advantage over other simulations presented in the literature in the context of everyday industrial usage. Against this background, the software is additionally equipped with a user-friendly GUI which allows intuitive usage. Individual parameters can be adjusted using sliders, while the simulation result appears immediately in an adjacent window. Platform-independent development allows flexible usage: an operator can use the tool to conveniently adjust the process on a tablet, while a developer can execute the tool in the office in order to design new processes. Furthermore, to the best knowledge of the authors, AsymptoticDrill is the first simulation which allows the import of measured real beam distributions and thus calculates the asymptotic hole shape on the basis of the real state of the specific manufacturing system. In this paper, the emphasis is placed on investigating the effect of multiple reflections on the asymptotic hole shape, which gain in importance when drilling holes with large aspect ratios.
Keywords: asymptotic hole shape, intensity threshold, long pulse laser drilling, robust process
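Under the stated asymptotic-shape condition, a minimal single-reflection sketch for a Gaussian beam can be written down; it deliberately ignores the multiple reflections that are the focus of this paper, and all parameter values and the absorption geometry are illustrative assumptions, not the AsymptoticDrill model itself:

```python
# Single-reflection sketch of the asymptotic-shape condition: along the hole
# wall the absorbed flux equals the intensity threshold,
#     I(r) * sin(alpha) = I_th,
# where alpha is the (assumed) angle between the wall and the beam axis.
# The wall slope then follows from alpha, integrated from the rim inward.
import math

I0, w, I_th = 1.0, 1.0, 0.2          # peak intensity, beam radius, threshold
I = lambda r: I0 * math.exp(-2 * (r / w) ** 2)

r_edge = w * math.sqrt(0.5 * math.log(I0 / I_th))   # rim: where I(r) = I_th
n = 400
dr = r_edge / n
depth, profile = 0.0, []
for i in range(n, 0, -1):            # integrate from the rim toward the axis
    r = (i - 0.5) * dr
    s = min(1.0, I_th / I(r))        # sin(alpha) on the asymptotic wall
    depth += dr / math.tan(math.asin(s))
    profile.append((r, depth))
print(round(r_edge, 3), round(depth, 3))  # hole radius and centre depth
```

Raising I_th shrinks both the hole radius and the depth, mirroring the calibration role the threshold plays in the model described above.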
Procedia PDF Downloads 215
12429 Early Recognition and Grading of Cataract Using a Combined Log Gabor/Discrete Wavelet Transform with ANN and SVM
Authors: Hadeer R. M. Tawfik, Rania A. K. Birry, Amani A. Saad
Abstract:
Eyes are considered to be the most sensitive and important organ for human beings. Thus, any eye disorder affects the patient in all aspects of life. Cataract is one of those eye disorders that lead to blindness if not treated correctly and quickly. This paper demonstrates a model for the automatic detection, classification, and grading of cataracts based on image processing techniques and artificial intelligence. The proposed system is developed to ease the cataract diagnosis process for both ophthalmologists and patients. The wavelet transform combined with the 2D Log Gabor wavelet transform was used as the feature extraction technique for a dataset of 120 eye images, followed by a classification process that classified the image set into three classes: normal, early, and advanced stage. A comparison between the two classifiers used, the support vector machine (SVM) and the artificial neural network (ANN), was done on the same dataset of 120 eye images. It was concluded that SVM gave better results than ANN: SVM achieved 96.8% accuracy, whereas ANN achieved 92.3% accuracy.
Keywords: cataract, classification, detection, feature extraction, grading, log-gabor, neural networks, support vector machines, wavelet
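The classification stage can be sketched with a deliberately simple stand-in: a nearest-centroid classifier on toy 2-D feature vectors replaces the SVM/ANN classifiers and the Log Gabor/wavelet features of the actual system, so everything below is illustrative only:

```python
# Illustrative stand-in for the three-class grading step (normal/early/advanced).
def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

def classify(x, centroids):
    """Assign x to the class whose centroid is nearest (squared Euclidean)."""
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(x, centroids[c])))

# Toy feature vectors; real features would come from the wavelet/Log Gabor stage.
train = {
    "normal":   [(0.1, 0.2), (0.2, 0.1)],
    "early":    [(0.5, 0.5), (0.6, 0.4)],
    "advanced": [(0.9, 0.9), (1.0, 0.8)],
}
centroids = {label: centroid(vs) for label, vs in train.items()}
print(classify((0.55, 0.45), centroids))  # early
```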
Procedia PDF Downloads 336
12428 Mango (Mangifera indica L.) Lyophilization Using Vacuum-Induced Freezing
Authors: Natalia A. Salazar, Erika K. Méndez, Catalina Álvarez, Carlos E. Orrego
Abstract:
Lyophilization, also called freeze-drying, is an important dehydration technique mainly used for pharmaceuticals. The food industry also uses lyophilization when it is important to retain most of the nutritional quality, taste, shape and size of dried products and to extend their shelf life. Vacuum induction during the freezing cycle (VI) has been used in order to control ice nucleation and, consequently, to reduce the time of the primary drying cycle of pharmaceuticals while preserving the quality properties of the final product. This procedure has not been applied in the freeze-drying of foods. The present work aims to investigate the effect of VI on the lyophilization drying time, final moisture content, density and reconstitution properties of mango (Mangifera indica L.) slices (MS) and mango pulp-maltodextrin dispersions (MPM) (30% concentration of total solids). Control samples were run at each freezing rate without induced vacuum. The lyophilization endpoint was the same for all treatments (a constant difference between the capacitance and Pirani vacuum gauges). From the experimental results it can be concluded that the high freezing rate (0.4°C/min) reduced the overall process time by up to 30% compared with the process time required for the control and for VI at the lower freezing rate (0.1°C/min), without affecting the quality characteristics of the dried product, which yields a reduction in costs and energy consumption for MS and MPM freeze-drying. Controls and samples treated with VI at a freezing rate of 0.4°C/min in MS showed similar results in moisture and density parameters. Furthermore, results from the MPM dispersion showed favorable values when VI was applied, because a dried product with low moisture content and low density was obtained in a shorter process time compared with the control. No significant differences were found between the reconstitution properties (rehydration for MS and solubility for MPM) of freeze-dried mango resulting from the controls and the VI treatments.
Keywords: drying time, lyophilization, mango, vacuum induced freezing
Procedia PDF Downloads 411
12427 Study and Improvement of the Quality of a Production Line
Authors: S. Bouchami, M.N. Lakhoua
Abstract:
The automotive market is a dynamic market that continues to grow. That is why several companies belonging to this sector adopt a quality improvement approach. Wanting to be competitive and successful in the environment in which they operate, these companies are dedicated to establishing a quality management system to ensure the achievement of quality objectives, the improvement of products and processes, and the satisfaction of customers. In this paper, the management of quality and the improvement of a production line in an industrial company are presented. The project is divided into two essential parts: the creation of the technical line documentation and the quality assurance documentation, and the resolution of defects on the line, as well as those claimed by the customer. The creation of the documents required a deep understanding of the manufacturing process. The analysis and problem solving were done through the implementation of PDCA (Plan-Do-Check-Act) and FTA (Fault Tree Analysis). As a perspective, in order to better optimize production and improve the efficiency of the production line, a study of the problems associated with the supply of raw materials should be carried out to solve the stock-out problems that cause penalizing delays for the industrial company.
Keywords: quality management, documentary system, Plan Do Check Act (PDCA), fault tree analysis (FTA) method
Procedia PDF Downloads 144
12426 “Octopub”: Geographical Sentiment Analysis Using Named Entity Recognition from Social Networks for Geo-Targeted Billboard Advertising
Authors: Oussama Hafferssas, Hiba Benyahia, Amina Madani, Nassima Zeriri
Abstract:
Although data nowadays has multiple forms, from text to images and from audio to video, text is still the most used one at the public level. At an academic and research level, and unlike other forms, text can be considered the easiest form to process. Therefore, a branch of data mining research has always been under its shadow, called "Text Mining". Its concept is just like data mining's: finding valuable patterns in data, from large collections and tremendous volumes of data, in this case text. Named entity recognition (NER) is one of Text Mining's disciplines; it aims to extract and classify references such as proper names, locations, expressions of time and dates, organizations and more in a given text. Our approach, "Octopub", does not aim to find new ways to improve the named entity recognition process; rather, it is about finding a new, and yet smart, way to use NER so that we can extract the sentiments of millions of people, using social networks as a limitless information source and marketing for product promotion as the main domain of application.
Keywords: text mining, named entity recognition (NER), sentiment analysis, social media networks (SN, SMN), business intelligence (BI), marketing
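The idea of attaching a sentiment score to recognized location entities can be sketched in a few lines. The sketch below is purely illustrative: it uses toy lexicons and a tiny hypothetical gazetteer in place of a trained NER model and a real sentiment classifier, which is what an approach like Octopub would actually rely on:

```python
import re
from collections import defaultdict

# Toy lexicons; a real system would use a trained NER model and
# a proper sentiment classifier instead of word lists.
POSITIVE = {"love", "great", "amazing"}
NEGATIVE = {"hate", "awful", "broken"}
KNOWN_PLACES = {"Algiers", "Oran"}  # hypothetical gazetteer

def entity_sentiment(posts):
    """Aggregate a crude sentiment score per recognized location."""
    scores = defaultdict(int)
    for text in posts:
        tokens = re.findall(r"[A-Za-z]+", text)
        score = sum(t.lower() in POSITIVE for t in tokens) \
              - sum(t.lower() in NEGATIVE for t in tokens)
        for t in tokens:
            if t in KNOWN_PLACES:   # gazetteer-based stand-in for NER
                scores[t] += score
    return dict(scores)

posts = ["I love the new billboard in Algiers",
         "Traffic in Oran is awful today"]
print(entity_sentiment(posts))  # → {'Algiers': 1, 'Oran': -1}
```

Aggregated per-location scores like these are what would feed a geo-targeted billboard placement decision.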
Procedia PDF Downloads 590
12425 Biochar as a Strong Adsorbent for Multiple-Metal Removal from Contaminated Water
Authors: Eman H. El-Gamal, Mai E. Khedr, Randa Ghonim, Mohamed Rashad
Abstract:
In the past few years, biochar, a highly carbon-rich material produced from agro-wastes by a pyrolysis process, has been used as an effective adsorbent for heavy metal removal from polluted water. In this study, different types of biochar (rice straw 'RSB', corn cob 'CCB', and Jatropha shell 'JSB') were used to evaluate the adsorption capacity for heavy metal removal from multiple-metal solutions (Cu, Mn, Zn, and Cd). Kinetic modeling has been examined to illustrate potential adsorption mechanisms. The results showed that the potential removal of a metal depends on the metal and biochar types. The adsorption capacity of the biochars followed the order RSB > JSB > CCB. In general, the RSB and JSB biochars presented a high potential removal of heavy metals from polluted water, higher than 90% and 80%, respectively, after 2 hours of contact time for all metals. According to the kinetic data, the pseudo-second-order model agreed strongly with the Cu, Mn, Zn, and Cd adsorption onto the biochars (R² ≥ 0.97), indicating the dominance of a specific adsorption process, i.e., chemisorption. In conclusion, this study revealed that RSB and JSB biochars have the potential to be strong adsorbents for multiple-metal removal from wastewater.
Keywords: adsorption, biochar, chemisorption, polluted water
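The pseudo-second-order fit behind an R² ≥ 0.97 conclusion is commonly done via the linearized form t/qt = 1/(k·qe²) + t/qe. A minimal sketch with assumed parameter values (qe = 25 mg/g, k = 0.01 g/(mg·min), not the study's data) shows the recovery of qe and k from a kinetics series:

```python
import numpy as np

# Pseudo-second-order kinetics: qt = k*qe^2*t / (1 + k*qe*t).
# Fit via the linearized form: t/qt = 1/(k*qe^2) + t/qe.
# Synthetic illustration with assumed qe = 25 mg/g, k = 0.01 g/(mg*min).
qe_true, k_true = 25.0, 0.01
t = np.array([5, 10, 20, 40, 60, 90, 120], dtype=float)   # minutes
qt = k_true * qe_true**2 * t / (1 + k_true * qe_true * t)

slope, intercept = np.polyfit(t, t / qt, 1)
qe_fit = 1.0 / slope              # slope = 1/qe
k_fit = slope**2 / intercept      # intercept = 1/(k*qe^2)

print(f"qe = {qe_fit:.2f} mg/g, k = {k_fit:.4f} g/(mg min)")
```

With measured qt data the same two regression coefficients give qe and k, and the R² of the t/qt-versus-t line is the statistic quoted in the abstract.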
Procedia PDF Downloads 151
12424 From Binary Solutions to Real Bio-Oils: A Multi-Step Extraction Story of Phenolic Compounds with Ionic Liquid
Authors: L. Cesari, L. Canabady-Rochelle, F. Mutelet
Abstract:
The thermal conversion of lignin produces bio-oils that contain many compounds with high added value, such as phenolic compounds. In order to efficiently extract these compounds, the possible use of the choline bis(trifluoromethylsulfonyl)imide [Choline][NTf2] ionic liquid was explored. To this end, a multistep approach was implemented. First, binary (phenolic compound and solvent) and ternary (phenolic compound, solvent and ionic liquid) solutions were investigated. Eight binary systems of phenolic compound and water were investigated at atmospheric pressure. These systems were quantified using the turbidity method and UV spectroscopy. Ternary systems (phenolic compound, water and [Choline][NTf2]) were investigated at room temperature and atmospheric pressure. After stirring, the solutions were left to settle, and a sample of each phase was collected. The analysis of the phases was performed using gas chromatography with an internal standard. These results were used to quantify the interaction parameters of thermodynamic models. Then, extractions were performed on synthetic solutions to determine the influence of several operating conditions (temperature, kinetics, amount of [Choline][NTf2]). With this knowledge, it was possible to design and simulate an extraction process composed of one extraction column and one flash. Finally, the extraction efficiency of [Choline][NTf2] was quantified with real bio-oils from lignin pyrolysis. Qualitative and quantitative analyses were performed using gas chromatography coupled with mass spectrometry and a flame ionization detector. The experimental measurements show that the extraction of phenolic compounds is efficient at room temperature, is quick and does not require a large amount of [Choline][NTf2]. Moreover, the simulations of the extraction process demonstrate that the [Choline][NTf2] process requires less energy than an organic one. Finally, the efficiency of [Choline][NTf2] was confirmed in real situations with the experiments on lignin pyrolysis bio-oils.
Keywords: bio-oils, extraction, lignin, phenolic compounds
Procedia PDF Downloads 111
12423 Bayes Estimation of Parameters of Binomial Type Rayleigh Class Software Reliability Growth Model using Non-informative Priors
Authors: Rajesh Singh, Kailash Kale
Abstract:
In this paper, the binomial process type occurrence of software failures is considered, and the failure intensity has been characterized by a one-parameter Rayleigh class Software Reliability Growth Model (SRGM). The proposed SRGM is a mathematical function of two parameters, namely the total number of failures η₀ and the scale parameter η₁. It is assumed that very little or no information is available about both these parameters; considering non-informative priors for both, the Bayes estimators for the parameters η₀ and η₁ have been obtained under the squared error loss function. The proposed Bayes estimators are compared with their corresponding maximum likelihood estimators on the basis of risk efficiencies obtained by the Monte Carlo simulation technique. It is concluded that both proposed Bayes estimators, of the total number of failures and of the scale parameter, perform well for a proper choice of execution time.
Keywords: binomial process, non-informative prior, maximum likelihood estimator (MLE), Rayleigh class, software reliability growth model (SRGM)
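The Monte Carlo comparison of risks under squared error loss can be illustrated on a plain Rayleigh scale parameter. The sketch below is not the paper's SRGM; it uses a simple Rayleigh sample with a Jeffreys-type prior, for which the posterior of t = σ² is inverse-gamma and both estimators have closed forms:

```python
import numpy as np

rng = np.random.default_rng(42)

# Rayleigh model: f(x|t) = (x/t) * exp(-x^2 / (2t)), with t = sigma^2.
# MLE of t:   t_hat = S / (2n),  where S = sum(x^2).
# Bayes (Jeffreys prior ~ 1/t, squared error loss): posterior is
# inverse-gamma(n, S/2), so the posterior mean is S / (2(n - 1)).
t_true, n, reps = 4.0, 20, 5000
mse_mle = mse_bayes = 0.0
for _ in range(reps):
    x = rng.rayleigh(scale=np.sqrt(t_true), size=n)
    S = np.sum(x**2)
    mse_mle += (S / (2 * n) - t_true) ** 2
    mse_bayes += (S / (2 * (n - 1)) - t_true) ** 2
print(f"risk(MLE) = {mse_mle/reps:.3f}, risk(Bayes) = {mse_bayes/reps:.3f}")
```

The ratio of the two averaged squared-error risks is the risk efficiency the abstract refers to; in the paper this is computed for the SRGM's η₀ and η₁ rather than for this toy model.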
Procedia PDF Downloads 390
12422 The Use of a Miniature Bioreactor as Research Tool for Biotechnology Process Development
Authors: Muhammad Zainuddin Arriafdi, Hamudah Hakimah Abdullah, Mohd Helmi Sani, Wan Azlina Ahmad, Muhd Nazrul Hisham Zainal Alam
Abstract:
Biotechnology process development demands numerous experimental works. In a laboratory environment, this is typically carried out using a shake flask platform. This paper presents the design and fabrication of a miniature bioreactor system as an alternative research tool for bioprocessing. The working volume of the reactor is 100 ml, and it is made of plastic. The main features of the reactor include stirring control, temperature control via an electrical heater, an aeration strategy through a miniature air compressor, and online optical cell density (OD) sensing. All sensors and actuators integrated into the reactor were controlled using an Arduino microcontroller platform. In order to demonstrate the functionality of the miniature bioreactor concept, a series of batch Saccharomyces cerevisiae fermentation experiments was performed under various glucose concentrations. The results attained from the fermentation experiments were used to solve for the Monod equation constants, namely the saturation constant Ks and the maximum growth rate of the cells μmax, to further highlight the usefulness of the device. The mixing capacity of the reactor was also evaluated. It was found that the results attained with the miniature bioreactor prototype were comparable to results achieved using a shake flask. The unique feature of the device compared to the shake flask platform is that the reactor mixing condition is much more comparable to a lab-scale bioreactor setup. The prototype is also integrated with an online OD sensor, so no sampling was needed to monitor the progress of the reaction performed. Operating cost and medium consumption are also low, making it much more economical for biotechnology process development compared to lab-scale bioreactors.
Keywords: biotechnology, miniature bioreactor, research tools, Saccharomyces cerevisiae
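Solving for the Monod constants from growth-rate data is a small nonlinear fit of μ = μmax·S/(Ks + S). The sketch below uses hypothetical (substrate, growth rate) pairs, not the paper's measurements, standing in for values derived from the online OD readings:

```python
import numpy as np
from scipy.optimize import curve_fit

def monod(S, mu_max, Ks):
    """Monod growth law: specific growth rate vs. substrate concentration."""
    return mu_max * S / (Ks + S)

# Hypothetical (glucose concentration [g/L], growth rate [1/h]) pairs,
# generated from assumed values mu_max = 0.45 1/h, Ks = 1.2 g/L plus noise.
S = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
mu = monod(S, 0.45, 1.2) + np.array([0.01, -0.01, 0.005, -0.005, 0.01, -0.01])

(mu_max_fit, Ks_fit), _ = curve_fit(monod, S, mu, p0=[0.5, 1.0])
print(f"mu_max = {mu_max_fit:.3f} 1/h, Ks = {Ks_fit:.3f} g/L")
```

With real OD-derived growth rates the same two fitted constants characterize the strain under the tested glucose range.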
Procedia PDF Downloads 118
12421 Classification of Random Doppler-Radar Targets during the Surveillance Operations
Authors: G. C. Tikkiwal, Mukesh Upadhyay
Abstract:
During surveillance operations in war or peace time, the radar operator gets a scatter of targets over the screen. A target may be a tracked vehicle such as a tank, e.g. the T72 or BMP, a wheeled vehicle like the ALS, TATRA, 2.5-tonne or Shaktiman trucks, or moving troops, moving convoys, etc. The radar operator selects one of the promising targets in single target tracking (STT) mode. Once the target is locked, the operator gets a typical audible signal in his headphones. With reference to the experience and training gained over time, the operator then identifies the random target. But this process is cumbersome and is solely dependent on the skills of the operator, and thus may lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the fast Fourier transform (FFT) and principal component analysis (PCA), to identify random objects. The process of classification is based on transforming the audible signature of the target into musical octave notes. The whole methodology is then automated by developing suitable software. This automation increases the efficiency of identification of the random target by reducing the chances of misclassification. The whole study is based on live data.
Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP
Procedia PDF Downloads 394
12420 Ergonomics Management and Sustainability: An Exploratory Study Applied to Automaker Industry in South of Brazil
Authors: Giles Balbinotti, Lucas Balbinotti, Paula Hembecker
Abstract:
The management of production process design activities, both for the conception of future work and for the financial health of companies, is an important condition in an organizational model that supports the management of human aspects and the variabilities existing in work. It is important to seek, at all levels of the organization, understanding and the consequent cultural change, so that factors associated with human aspects are considered and prioritized in projects. In this scenario, the central research question of this study arises from the work context in which managers and project coordinators operate: how is top management convinced, in the design stages, to take ergonomics as a strategy for the performance and sustainability of the business? In this perspective, the general objective of this research is to analyze how the management of human aspects is applied in a real production process project in the automotive industry, including the activity of the project manager and coordinator, as well as the strategies used to argue for ergonomics in design. For this, a sociotechnical and ergonomic approach is adopted, given its anthropocentric premise of acting on the social system simultaneously with the technical system, with the support of the Modapts system, which measures non-value-added times and their correlation with critical workstations. The methodological approach adopted in this study is based on a literature review and an analysis of the activity of the project coordinators of an industrial plant, including the management of human aspects in the context of work variability and the strategies applied in project activities. The study observed that the performance loss of the serial production lines reaches the significant figure of about 30%, which can render the operation non-value-added; one of the causes of this loss is the ergonomic problems present in the professional activity.
Keywords: human aspects in production process project, ergonomics in design, sociotechnical project management, sociotechnical, ergonomic principles, sustainability
Procedia PDF Downloads 253
12419 Association of Selected Polymorphisms of BER Pathway with the Risk of Colorectal Cancer in the Polish Population
Authors: Jacek Kabzinski, Karolina Przybylowska, Lukasz Dziki, Adam Dziki, Ireneusz Majsterek
Abstract:
The incidence of colorectal cancer (CRC) is increasing from year to year. Despite intensive research, CRC etiology remains unknown. Studies suggest that the process of carcinogenesis may be based on a reduced efficiency of DNA repair mechanisms, often caused by polymorphisms in DNA repair genes. The aim of the study was to determine the relationship between the Pro242Arg polymorphism of the PolB gene and the Arg780His polymorphism of the Lig3 gene and the modulation of colorectal cancer risk in the Polish population. Determining the molecular basis of the carcinogenesis process and predicting increased risk will allow patients to be assigned to an increased-risk group and included in a preventive program. We used blood collected from 110 patients diagnosed with colorectal cancer. The control group consisted of an equal number of healthy people. Genotyping was performed by the TaqMan method. The obtained results indicate that the 780Arg/His genotype of the Lig3 gene is associated with an increased risk of colorectal cancer. On the basis of these results, we conclude that the Arg780His polymorphism of the Lig3 gene may be associated with an increased risk of colorectal cancer.
Keywords: BER, colorectal cancer, PolB, Lig3, polymorphisms
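Case-control associations of this kind are typically summarized with an odds ratio and its confidence interval. A minimal sketch with hypothetical genotype counts (the study's actual counts are not given in the abstract) could look like:

```python
import math

# Hypothetical 2x2 genotype counts (NOT the study's actual data):
# carriers of Lig3 780Arg/His vs. non-carriers, cases vs. controls.
a, b = 45, 65    # cases:    carriers, non-carriers
c, d = 28, 82    # controls: carriers, non-carriers

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf's method
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

An odds ratio above 1 with a confidence interval excluding 1 is what supports the "increased risk" conclusion for a genotype.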
Procedia PDF Downloads 455
12418 Procedure for Monitoring the Process of Behavior of Thermal Cracking in Concrete Gravity Dams: A Case Study
Authors: Adriana de Paula Lacerda Santos, Bruna Godke, Mauro Lacerda Santos Filho
Abstract:
Several dams around the world have already collapsed, causing environmental, social and economic damage. The concern to avoid future disasters has stimulated the creation of a great number of laws and rules in many countries. In Brazil, Law 12.334/2010 was created, establishing the National Policy on Dam Safety. Overall, this policy requires dam owners to invest in the maintenance of their structures and to improve their monitoring systems in order to provide faster and more straightforward responses in case of an increase in risk. As monitoring tools, visual inspections provide a comprehensive assessment of structural performance, while auscultation instrumentation adds specific information on operational or behavioral changes, providing an alarm when a performance indicator exceeds acceptable limits. These limits can be set using statistical methods based on the relationship between instrument measurements and other variables, such as reservoir level, time of year or other instrument readings. Besides the design parameters (foundation uplift, displacements, etc.), dam instrumentation can also be used to monitor the behavior of defects and damage manifestations. Specifically in concrete gravity dams, one of the main causes of cracking is the concrete volumetric changes generated by phenomena of thermal origin, which are associated with the construction process of these structures. Based on this, the goal of this research is to propose a process for monitoring the behavior of thermal cracking in concrete gravity dams through the analysis of instrumentation data and the establishment of control values. As a case study, Block B-11 of the Governor José Richa Dam Power Plant was selected, which presents a cracking process that was identified even before the filling of the reservoir in August 1998, and where crack meters and surface thermometers were installed for its monitoring. Although these instruments were installed in May 2004, the research was restricted to the last 4.5 years (June 2010 to November 2014), when all the instruments were calibrated and producing reliable data. The adopted method is based on simple linear correlation procedures to understand the interactions among the instrument time series, verifying the response times between them. Scatter plots were drafted from the best correlations, which supported the definition of the control limit values. Among the conclusions, it is shown that there is a strong or very strong correlation between the ambient temperature and the crack meter and flowmeter measurements. Based on the results of the statistical analysis, it was possible to develop a tool for monitoring the behavior of the cracks in the case study. Thus, the goal of developing a proposal for a process for monitoring the behavior of thermal cracking in concrete gravity dams was fulfilled.
Keywords: concrete gravity dam, dams safety, instrumentation, simple linear correlation
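As an illustration of the simple linear correlation procedure described above, the sketch below builds a hypothetical temperature/crack-opening series (not the dam's actual records), computes the Pearson correlation, and derives control limits from the regression residuals:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical daily series: ambient temperature [degC] and crack
# opening [mm], standing in for the dam's instrumentation records.
temp = 20 + 8 * np.sin(np.linspace(0, 4 * np.pi, 365))
crack = 0.5 - 0.01 * temp + 0.005 * rng.standard_normal(365)

r = np.corrcoef(temp, crack)[0, 1]          # Pearson correlation

# Control limits: regression line +/- 3 residual standard deviations.
slope, intercept = np.polyfit(temp, crack, 1)
resid = crack - (slope * temp + intercept)
sigma = resid.std(ddof=2)
upper = slope * temp + intercept + 3 * sigma
lower = slope * temp + intercept - 3 * sigma
alarms = np.sum((crack > upper) | (crack < lower))
print(f"r = {r:.3f}, alarms = {alarms}")
```

A crack-meter reading that falls outside the band for its concurrent temperature is the kind of event that would trigger an alarm in the proposed monitoring process.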
Procedia PDF Downloads 292
12417 Reduction Behavior of Medium Grade Manganese Ore from Karangnunggal during a Sintering Process in Methane Gas
Authors: H. Aripin, I. Made Joni, Edvin Priatna, Nundang Busaeri, Svilen Sabchevski
Abstract:
In this investigation, manganese has been produced from medium-grade manganese ore from the Karangnunggal mine (West Java, Indonesia). The ores were ground using a jar mill to pass through a 150 mesh sieve. The effects of the holding time at a temperature of 1200°C in methane gas on the structural properties have been studied. The material properties have been characterized on the basis of experimental data obtained using X-ray fluorescence (XRF), X-ray diffraction (XRD), scanning electron microscopy (SEM), and Fourier transform infrared (FTIR) spectroscopy. It has been found that the ore contains MnO₂ as the main constituent, at about 46.80 wt.%. It can also be observed that the ore particles are agglomerated, forming dense grains with different textures and morphologies. The irregularly shaped grains with dark contrast, the large brighter grains, and the smaller grains with bright texture and smooth surfaces are associated with the presence of manganese, calcium, and quartz, respectively. From the XRD patterns, MnO₂ is reduced to hausmannite (Mn₃O₄), manganosite (MnO) and manganese carbide (Mn₇C₃). At a temperature of 1200°C, the holding time does not have any effect on the formation of crystals, and the crystalline phases remain almost unchanged in the time range from 15 to 90 minutes. An increase of the holding time up to 45 minutes during the sintering process leads to an increase of the MnO concentration, while at 90 minutes the concentration decreases. At longer holding times, the excess reaction of the methane gas with the manganese oxide in the ore causes an increase in carbon deposition. As a result, the carbon blocks the particle surface and hinders the reduction of the manganese oxide. The FTIR spectrum explains the appearance of a C=O stretching mode arising from the absorption of atmospheric methane on the manganese oxide of the ore. The intensity of this band increases with increasing holding time, indicating an increase of carbon deposition on the surface of the manganese oxide.
Keywords: manganese, medium grade manganese ore, structural properties, holding time, carbon deposition
Procedia PDF Downloads 157
12416 The Implementation of the Javanese Lettered-Manuscript Image Preprocessing Stage Model on the Batak Lettered-Manuscript Image
Authors: Anastasia Rita Widiarti, Agus Harjoko, Marsono, Sri Hartati
Abstract:
This paper presents the results of a study testing whether the Javanese character manuscript image preprocessing model, which has been more widely applied, can also be applied to segment Batak character manuscripts. The treatment process begins by converting the input image into a binary image. After the binary image is cleaned of noise, line segmentation using the projection profile is conducted. If an unclear histogram projection is found, a smoothing process is conducted before the line segment indexes are produced. For each line image that has been produced, character segmentation within the line is then applied, taking into account the connectivity between the pixels making up the letters, so that no characters are truncated. From the testing of the manuscript preprocessing system prototype, the system accuracy on pieces of the Pustaka Batak Podani Ma AjiMamisinon manuscript ranged from 65% to 87.68%, with a confidence level of 95%. This accuracy indicates that the initial processing model for Javanese character manuscript images can also be applied to images of Batak character manuscripts.
Keywords: connected component, preprocessing, manuscript image, projection profiles
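The projection-profile line segmentation step described above reduces to finding runs of rows that contain ink. A minimal sketch on a tiny synthetic binary "page" (not a real manuscript image) could look like:

```python
import numpy as np

def segment_lines(binary_img):
    """Split a binary page image (1 = ink) into text-line row ranges
    using the horizontal projection profile."""
    profile = binary_img.sum(axis=1)        # ink pixels per row
    rows = profile > 0
    lines, start = [], None
    for i, has_ink in enumerate(rows):
        if has_ink and start is None:
            start = i                       # line begins
        elif not has_ink and start is not None:
            lines.append((start, i))        # line ends at blank row
            start = None
    if start is not None:
        lines.append((start, len(rows)))
    return lines

# Tiny synthetic page: two "text lines" separated by blank rows.
page = np.zeros((10, 8), dtype=int)
page[1:3, :] = 1     # first line occupies rows 1-2
page[6:9, 2:6] = 1   # second line occupies rows 6-8
print(segment_lines(page))  # → [(1, 3), (6, 9)]
```

The smoothing mentioned in the abstract would be applied to `profile` before thresholding, to avoid spurious breaks inside a line; character segmentation within each line then works on the vertical profile with a connected-component check.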
Procedia PDF Downloads 401
12415 An Internet of Things-Based Weight Monitoring System for Honey
Authors: Zheng-Yan Ruan, Chien-Hao Wang, Hong-Jen Lin, Chien-Peng Huang, Ying-Hao Chen, En-Cheng Yang, Chwan-Lu Tseng, Joe-Air Jiang
Abstract:
Bees play a vital role in pollination. This paper focuses on the honey weighing process. Honey is usually stored in the comb in a hive. Bee farmers brush the bees away from the comb, collect the honey, and weigh it afterward. However, such a process strongly disturbs the bees and can even lead to their death. This paper therefore presents an Internet of Things-based weight monitoring system that uses weight sensors to measure the weight of honey and simplifies the whole weighing procedure. To verify the system, the weight measured by the system was compared to the weight of standard weights used for calibration by employing a linear regression model. The R² of the regression model is 0.9788, which suggests that the weighing system is highly reliable and can be applied to obtain the actual weight of honey. In the future, the honey weight data can be used to find the relationship between honey production and different ecological parameters, such as the bees' foraging behavior and weather conditions. It is expected that these findings can serve as critical information for improving honey production.
Keywords: internet of things, weight, honey, bee
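The calibration check described above is a linear regression of sensor readings against standard weights, summarized by R². A minimal sketch with hypothetical readings (the paper's raw calibration data are not given in the abstract) could look like:

```python
import numpy as np

# Hypothetical calibration run: standard weights [g] vs. raw sensor
# readings, standing in for the hive scale's load-cell output.
standard = np.array([0, 100, 200, 500, 1000, 2000], dtype=float)
measured = np.array([3, 104, 201, 505, 1012, 2018], dtype=float)

slope, intercept = np.polyfit(standard, measured, 1)
predicted = slope * standard + intercept
ss_res = np.sum((measured - predicted) ** 2)
ss_tot = np.sum((measured - measured.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"slope = {slope:.4f}, intercept = {intercept:.2f}, R^2 = {r2:.4f}")
```

Inverting the fitted line (weight = (reading - intercept) / slope) then converts field readings into calibrated honey weights.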
Procedia PDF Downloads 459
12414 Effects of Convective Momentum Transport on the Cyclones Intensity: A Case Study
Authors: José Davi Oliveira De Moura, Chou Sin Chan
Abstract:
In this study, the effect of convective momentum transport (CMT) on the life cycle of cyclone systems and their organization is analyzed. A case of strong precipitation in southeastern Brazil was simulated using the Eta model with two kinds of convective parameterization: Kain-Fritsch without CMT and Kain-Fritsch with CMT. Reanalysis data from CFSR were used for comparison with the Eta model simulations. Wind, mean sea level pressure, rain and temperature are included in the analysis. Rain was evaluated by the Equitable Threat Score (ETS) and the Bias index; the simulations were compared with each other to detect the influence of CMT on the systems. The results show that the CMT process decreases the intensity of mesoscale cyclones (higher pressure values at their centers) and changes the positions and production of rain. The decrease of intensity in mesoscale cyclones should be caused by the transport of momentum from lower to upper levels. Rain production and distribution were altered because the displacement of the larger-scale systems was changed. In addition, the inclusion of the CMT process is very important for improving the simulation of the life cycle of meteorological systems.
Keywords: convection, Kain-Fritsch, momentum, parameterization
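The ETS and Bias scores used above are computed from a 2x2 forecast/observation contingency table. A minimal sketch with hypothetical counts (not the study's verification data) could look like:

```python
# Categorical rain verification scores from a 2x2 contingency table:
# hits (a), false alarms (b), misses (c), correct negatives (d).
# The counts are hypothetical, for illustration only.
a, b, c, d = 50, 20, 15, 215
n = a + b + c + d

hits_random = (a + b) * (a + c) / n          # expected chance hits
ets = (a - hits_random) / (a + b + c - hits_random)
bias = (a + b) / (a + c)                     # frequency bias

print(f"ETS = {ets:.3f}, Bias = {bias:.3f}")
```

An ETS near 1 indicates skill well above chance, and a frequency bias above (below) 1 indicates over- (under-) forecasting of rain occurrence, which is how the two parameterization runs are compared.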
Procedia PDF Downloads 325