Search results for: seafood processing environment
11810 The Output Fallacy: An Investigation into Input, Noticing, and Learners’ Mechanisms
Authors: Samantha Rix
Abstract:
The purpose of this research paper is to investigate the cognitive processing of learners who receive input but produce very little or no output, and who, when they do produce output, exhibit language proficiency similar to that of learners who produced output more regularly in the language classroom. Previous studies have investigated the benefits of output (with somewhat differing results); therefore, the presentation begins with an investigation of what may underlie gains in proficiency without output. A pilot study was then designed and conducted to gain insight into the cognitive processing of low-output language learners, looking, for example, at the quantity and quality of noticing. The study was carried out within the paradigm of classroom action research, observing and interviewing low-output language learners in an intensive English program at a small Midwestern university. The results of the pilot study indicated that autonomy in language learning, specifically the use of strategies such as self-monitoring, self-talk, and thinking 'out loud', was crucial in the development of language proficiency for academic-level performance. The presentation concludes with an examination of pedagogical implications for classroom use in order to aid students in their language development.
Keywords: cognitive processing, language learners, language proficiency, learning strategies
Procedia PDF Downloads 475
11809 Impact of Radiation Usage on Anti-Nutritional Compounds (Antitrypsin and Phytic Acid) of Livestock and Poultry Foods
Authors: Mohammad Khosravi, Ali Kiani, Behroz Dastar, Parvin Showrang
Abstract:
A review was carried out on important anti-nutritional compounds of livestock and poultry foods and the effect of radiation usage on them. Nowadays, with advancements in technology, different methods have been considered for the optimum usage of nutrients in livestock and poultry foods. Steaming, extruding, pelleting, and the use of chemicals are the most common and popular methods in food processing. The use of radiation in food processing research in the livestock and poultry industry is currently highly regarded. Ionizing (electron, gamma) and non-ionizing beams (microwave and infrared) are the most usable rays in animal food processing. In recent research, these beams have been used to remove or reduce anti-nutritional factors and microbial contamination and to improve the digestibility of nutrients in poultry and livestock food. The evidence presented will help researchers to recognize techniques of relevance to them. Simplification of some of these techniques, especially in developing countries, must be addressed so that they can be used more widely.
Keywords: antitrypsin, gamma rays, anti-nutritional components, phytic acid, radiation
Procedia PDF Downloads 343
11808 An Evaluation of Kahoot Application and Its Environment as a Learning Tool
Authors: Muhammad Yasir Babar, Ebrahim Panah
Abstract:
Over the past 20 years, the internet has seen continual advancement, and with the advent of online technology, various types of web-based games have been developed. Games are frequently used among different age groups, from baby boomers to Generation Z. Games are not only used for entertainment but are also utilized as a learning approach, transmitting education in a way that is more interesting and effective for students. One of the popular web-based educational games is Kahoot, with growing popularity and usage across different fields of study. However, little knowledge is available on university students’ perception of the Kahoot environment and application for learning subjects. Hence, the objective of the current study is to investigate students’ perceptions of the Kahoot application and environment as a learning tool. The study employed a survey approach, distributing a Google Forms questionnaire with a high reliability index to 62 students (11 males and 51 females). The findings show that students have positive attitudes towards the Kahoot application and its environment for learning. Regarding the application, it was indicated that activities created using Kahoot are more interesting for students, Kahoot is useful for collaborative learning, and Kahoot enhances interest in learning lessons. In terms of the Kahoot environment, it was found that using the application on a mobile device is easy for students, its design is simple and useful, Kahoot-created activities can easily be shared, and the application can easily be used on any platform. The findings of the study have implications for instructors, policymakers, and curriculum developers.
Keywords: application, environment, Kahoot, learning tool
Procedia PDF Downloads 133
11807 Analyzing the Risk-Based Approach in the General Data Protection Regulation: Basic Challenges Connected with Adapting the Regulation
Authors: Natalia Kalinowska
Abstract:
The adoption of the General Data Protection Regulation (GDPR) finished four years of work by the European Commission in this area in the European Union. Considering the far-reaching changes that GDPR will introduce, the European legislator envisaged a two-year transitional period: member states and companies had to prepare for the new regulation by 25 May 2018. The idea that represents a new attitude to data protection in the European Union is the risk-based approach. So far, as a result of the implementation of Directive 95/46/EC, many European countries (including Poland) have adopted very particular regulations specifying technical and organisational security measures; Polish implementing rules, for example, even indicate how long a password should be. Under the new approach, from May 2018 controllers and processors are obliged to apply security measures adequate to the level of risk associated with specific data processing. Risk in GDPR should be interpreted as the likelihood of a breach of the rights and freedoms of the data subject. According to Recital 76, the likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context, and purposes of the processing. GDPR does not indicate which security measures should be applied; the recitals give only examples, such as anonymisation or encryption. It is the controller’s decision which security measures it considers sufficient, and the controller will be responsible if these measures prove insufficient or if its identification of the risk level is incorrect. The regulation indicates a few levels of risk. Recital 76 mentions risk and high risk, but some lawyers argue that there is one more category: low risk/no risk. Low-risk/no-risk data processing is a situation in which the processing is unlikely to result in a risk to the rights and freedoms of natural persons.
GDPR also mentions types of data processing for which a controller does not have to evaluate the level of risk because they have already been classified as 'high risk' processing, e.g., large-scale processing of special categories of data, or processing using new technologies. The methodology includes analysis of legal regulations, e.g., GDPR and the Polish Act on the Protection of Personal Data, as well as ICO guidelines and articles concerning the risk-based approach in GDPR. The main conclusion is that an appropriate risk assessment is the key to keeping data safe and avoiding financial penalties. On the one hand, this approach seems more equitable, not only for controllers and processors but also for data subjects; on the other hand, it increases controllers’ uncertainty in the assessment, which could have a direct impact on incorrect data protection and potential responsibility for infringement of the regulation.
Keywords: general data protection regulation, personal data protection, privacy protection, risk based approach
Procedia PDF Downloads 252
11806 Knowledge Management in a Combined/Joint Environment
Authors: Cory Cannon
Abstract:
The current era of shrinking budgets, an increasing number of natural disasters worldwide, and state- and non-state-initiated conflicts has driven responses built on multinational coalitions conducting effective military operations. The need for a knowledge management strategy when developing these coalitions has been overlooked in the past; developing these accords early on will save time and help shape the way information and knowledge are transferred from the staff and action officers of the coalition to the decision-makers, so that timely decisions can be made in an ever-changing environment. The aim of this paper is to show how knowledge management has developed within the United States military and how the transformation of working within a combined/joint environment in both the Middle East and the Far East has improved relations between members of the coalitions as well as their effectiveness as a military force. These same principles could be applied to multinational corporations when dealing with differing cultures and decision-making processes.
Keywords: civil-military, culture, joint environment, knowledge management
Procedia PDF Downloads 364
11805 From Restraint to Obligation: The Protection of the Environment in Times of Armed Conflict
Authors: Aaron Walayat
Abstract:
The protection of the environment is one of the most developed areas of international law in the context of international humanitarian law. This paper examines the history of the protection of the environment in times of armed conflict, beginning with the traditional notion of restraint observed in antiquity and moving towards the obligation to protect the environment, examining the treaties and agreements, both binding and non-binding, which have contributed to environmental protection in war. The paper begins with a discussion of the ancient concept of restraint. This section examines the social norms in favor of protection of the environment as observed in the Bible, Greco-Roman mythology, and more contemporary literature. The study of the traditional rejection of total war establishes the social foundation from which the current legal regime has stemmed. The paper then studies the principle of restraint as codified in international humanitarian law. It mainly examines Additional Protocol I to the Geneva Conventions of 1949 and existing international law concerning civilian objects, including the principles of international humanitarian law governing the classification between civilian objects and military objectives. The paper then explores the environment’s classification as both a military objective and a civilian object, as well as arguments in favor of classifying the whole environment as a civilian object. The paper then discusses the current legal regime surrounding the protection of the environment, covering declarations and conventions including the 1868 Declaration of St. Petersburg, the 1907 Hague Convention No. IV, the Geneva Conventions, and the 1976 Environmental Modification Convention. The paper concludes by noting the movement from codification of the principles of restraint into the various treaties, agreements, and declarations of the current regime of international humanitarian law.
This paper provides an analysis of the history and significance of international humanitarian law's relationship with, and major contribution to, the growing field of international environmental law.
Keywords: armed conflict, environment, legal regime, restraint
Procedia PDF Downloads 204
11804 The Constitutional Rights of a Child to a Clean and Healthy Environment: A Case Study in the Vaal Triangle Region
Authors: Christiena Van Der Bank, Marjone Van Der Bank, Ronelle Prinsloo
Abstract:
The constitutional right to a healthy environment and the constitutional duty imposed on the state to actively protect the environment fulfill the specific duties to prevent pollution and ecological degradation and to promote conservation. The aim of this paper is to draw attention to the relationship between children's rights and the environment. The focus is on analysing government responses, as mandated by section 24 of the Bill of Rights, for ensuring the right to a clean and healthy environment. The principle of environmental sustainability encompasses the notion of equity, and harm to the environment affects present as well as future generations. Section 24 obliges the state to ensure that the legacy of future generations is protected, an obligation that has been said to be part of the common law. The environment is an elusive and wide concept that can mean different things to different people depending on the context in which it is used, for example, clean drinking water or safe food. An extensive interpretation of the term 'environment' would include almost everything that may positively or negatively influence the quality of human life. The analysis will include assessing policy measures, legislation, budgetary measures, and other measures taken by the government in order to progressively meet its constitutional obligation. The opportunity for a child to grow up in a healthy and safe environment is extremely unjustly distributed. Without a realignment of political, legal, and economic conditions, this situation will not fundamentally change. As a developing country that needs to meet the demands of social transformation and economic growth while at the same time expediting its ability to compete in global markets, South Africa will inevitably embark on developmental programmes as a measure for sustainable development. The courts would have to inquire into the reasonableness of those measures.
Environmental threats to children’s rights must be identified, taking into account children’s specific needs and vulnerabilities, their dependence, and their marginalisation. The obligations of states and violations of rights must be made more visible to the general public.
Keywords: environment, children's rights, pollution, healthy, violation
Procedia PDF Downloads 170
11803 Python Implementation for S1000D Applicability Depended Processing Model - SALERNO
Authors: Theresia El Khoury, Georges Badr, Amir Hajjam El Hassani, Stéphane N’Guyen Van Ky
Abstract:
The widespread adoption of machine learning and artificial intelligence across different domains can be attributed to the digitization of data over several decades, resulting in vast amounts of data of many types and structures. Data processing and preparation thus turn out to be a crucial stage. However, applying these techniques to S1000D standard-based data poses a challenge due to its complexity and the need to preserve logical information. This paper describes SALERNO, an S1000D AppLicability dEpended pRocessiNg mOdel. This Python-based model analyzes and converts XML S1000D-based files into an easier data format that can be used by machine learning techniques while preserving the different logic and relationships in the files. The model parses the files in a given folder, filters them, and extracts the required information to be saved in appropriate data frames and Excel sheets. Its main idea is to group the extracted information by applicability. In addition, it extracts the full text by replacing internal and external references while maintaining the relationships between files, as well as the necessary requirements. The resulting files can then be saved in databases and used in different models. Documents in both English and French were tested, and special characters were decoded. Updates to the technical manuals were taken into consideration as well. The model was tested on different versions of S1000D, and the results demonstrated its ability to effectively handle the applicability, requirements, references, and relationships across all files and at different levels.
Keywords: aeronautics, big data, data processing, machine learning, S1000D
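To make the grouping-by-applicability idea concrete, a minimal sketch is given below. This is not the SALERNO code; the element and attribute names (para, applicRef) and the sample fragment are hypothetical simplifications, whereas real S1000D data modules follow the full S1000D schema with different tag names.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hypothetical, simplified S1000D-like fragment for illustration only.
SAMPLE = """\
<dmodule>
  <para applicRef="A">Check the hydraulic pump.</para>
  <para applicRef="B">Check the electric pump.</para>
  <para applicRef="A">Torque the fitting.</para>
</dmodule>
"""

def group_by_applicability(xml_text):
    """Collect paragraph texts keyed by their applicability reference."""
    root = ET.fromstring(xml_text)
    groups = defaultdict(list)
    for para in root.iter("para"):
        groups[para.get("applicRef")].append(para.text)
    return dict(groups)

groups = group_by_applicability(SAMPLE)
# groups["A"] -> ['Check the hydraulic pump.', 'Torque the fitting.']
```

In the real model, the grouped records would then be written out to data frames and Excel sheets rather than kept in memory.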
Procedia PDF Downloads 156
11802 Revitalizing Coastal Ecosystems: Evaluating the Costs and Benefits of Restoring Clam Gardens for Indigenous Communities in British Columbia
Authors: Daniel Chen, Chengyi Li, Naifu Xu, Shangxuan Yang
Abstract:
Climate change has led to substantial changes in coastal ecosystems, including elevated ocean temperatures, increased acidity, and disrupted marine habitats. These environmental impacts have also resulted in the decline of traditional Indigenous food sources on the coast of British Columbia, including clams and salmon, which have been essential to the diet and cultural practices of the coastal Indigenous communities. This research evaluates and analyzes the costs and benefits of restoring and building clam gardens, an ancestral Indigenous mariculture technique in the Pacific Northwest. Clam gardens, which involve the construction of intertidal rock walls to enhance clam production, have been shown to more than triple clam yields compared to non-walled beaches. This research analyzes the costs and benefits to Indigenous individuals, including factors such as travel, equipment, time, food supply, and cultural engagement; it then discusses the potential of clam gardens as a significant food resource with additional environmental co-benefits, given the prevalence of clam gardens and coastlines in British Columbia. Moreover, the study concludes with policy recommendations to support the restoration and preservation of clam gardens, highlighting their potential to provide sustainable seafood production, environmental co-benefits, and social-environmental educational opportunities for Indigenous communities and the wider public.
Keywords: British Columbia coastline, clam garden, coastal resource management, Indigenous communities
Procedia PDF Downloads 19
11801 Wasteless Solid-Phase Method for Conversion of Iron Ores Contaminated with Silicon and Phosphorus Compounds
Authors: A. V. Panko, E. V. Ablets, I. G. Kovzun, M. A. Ilyashov
Abstract:
Based on a generalized analysis of modern know-how in the processing, concentration, and purification of iron-ore raw materials (IORM), in particular the most widespread ferrioxide-silicate materials (FOSM) containing impurities of phosphorus and other elements, the special role of nanotechnological initiatives in improving such processes is noted. Ideas on the role of nanoparticles are considered in processes of FOSM carbonization with subsequent direct reduction of the ferric oxides they contain to the metal phase, as well as in processes of alkali treatment and separation of powdered iron from phosphorus compounds. Using the obtained results, a wasteless solid-phase method was developed for processing, concentrating, and purifying IORM and FOSM from compounds of phosphorus, silicon, and other impurities, excelling known methods of direct iron reduction from iron ores and metallurgical slimes.
Keywords: iron ores, solid-phase reduction, nanoparticles in reduction and purification of iron from silicon and phosphorus, wasteless method of ores processing
Procedia PDF Downloads 486
11800 Genomic Sequence Representation Learning: An Analysis of K-Mer Vector Embedding Dimensionality
Authors: James Jr. Mashiyane, Risuna Nkolele, Stephanie J. Müller, Gciniwe S. Dlamini, Rebone L. Meraba, Darlington S. Mapiye
Abstract:
When performing language tasks in natural language processing (NLP), the dimensionality of word embeddings is chosen either ad hoc or by optimizing the Pairwise Inner Product (PIP) loss. The PIP loss is a metric that measures the dissimilarity between word embeddings, and it is obtained through matrix perturbation theory by utilizing the unitary invariance of word embeddings. Unlike in natural language processing, in genomics, and especially in genome sequence processing, there is no notion of a “word”; rather, there are sequence substrings of length k called k-mers. K-mer sizes matter, and they vary depending on the goal of the task at hand. The dimensionality of word embeddings in NLP has been studied using matrix perturbation theory and the PIP loss. In this paper, the sufficiency and reliability of applying word-embedding algorithms to various genomic sequence datasets are investigated to understand the relationship between the k-mer size and the embedding dimension. This is done by studying the scaling capability of three embedding algorithms, namely Latent Semantic Analysis (LSA), Word2Vec, and Global Vectors (GloVe), with respect to the k-mer size. Utilising the PIP loss as a metric to train embeddings on different datasets, we also show that Word2Vec outperforms LSA and GloVe in accurately computing embeddings as both the k-mer size and the vocabulary increase. Finally, the shortcomings of natural language processing embedding algorithms in performing genomic tasks are discussed.
Keywords: word embeddings, k-mer embedding, dimensionality reduction
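As a rough illustration of the PIP loss discussed above (a sketch, not the authors' implementation), the snippet below computes the Frobenius distance between the pairwise-inner-product matrices of two embedding matrices and checks the unitary invariance that the method relies on; the shapes and random data are purely illustrative.

```python
import numpy as np

def pip_loss(E1, E2):
    """Frobenius norm of the difference between PIP matrices E @ E.T.
    Because PIP(E) = E @ E.T is unchanged by any orthogonal rotation Q of
    the embedding axes, pip_loss(E, E @ Q) is zero up to float error."""
    return np.linalg.norm(E1 @ E1.T - E2 @ E2.T, ord="fro")

rng = np.random.default_rng(0)
E = rng.standard_normal((50, 8))                  # 50 k-mers, dimension 8
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))  # random orthogonal matrix
loss_rotated = pip_loss(E, E @ Q)                 # ~0: unitary invariance
loss_random = pip_loss(E, rng.standard_normal((50, 8)))  # large
```

Sweeping the embedding dimension and picking the value that minimizes this loss against a reference factorization is the usual way the PIP criterion selects dimensionality.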
Procedia PDF Downloads 137
11799 Geographic Information System and Dynamic Segmentation of Very High Resolution Images for the Semi-Automatic Extraction of Sandy Accumulation
Authors: A. Bensaid, T. Mostephaoui, R. Nedjai
Abstract:
A considerable area of Algerian land is threatened by wind erosion. For a long time, wind erosion and its associated harmful effects on the natural environment have posed a serious threat, especially in the arid regions of the country. In recent years, as a result of the increasingly irrational exploitation of natural resources (fodder) and extensive land clearing, wind erosion has become particularly accentuated. The extent of degradation in the arid region of the Algerian Mecheria department has generated a new situation characterized by the reduction of vegetation cover, the decrease of land productivity, and sand encroachment on urban development zones. In this study, we investigate the potential of remote sensing and geographic information systems for detecting the spatial dynamics of the ancient dune cordons based on the numerical processing of LANDSAT images (5, 7, and 8) of three scenes, 197/37, 198/36, and 198/37, for the year 2020. As a second step, we explore the use of geospatial techniques to monitor the progression of sand dunes on developed (urban) land as well as the formation of sandy accumulations (dunes, dune fields, nebkhas, barchans, etc.). For this purpose, the study made use of a semi-automatic processing method for the dynamic segmentation of images with very high spatial resolution (SENTINEL-2 and Google Earth). The study was able to demonstrate that, under current conditions, urban land is located in sand transit zones mobilized by winds from the northwest and southwest.
Keywords: land development, GIS, segmentation, remote sensing
Procedia PDF Downloads 155
11798 Cost Effective Real-Time Image Processing Based Optical Mark Reader
Authors: Amit Kumar, Himanshu Singal, Arnav Bhavsar
Abstract:
In this modern era of automation, most academic and competitive exams use Multiple Choice Questions (MCQs). The responses to these MCQ-based exams are recorded on Optical Mark Reader (OMR) sheets. Evaluation of an OMR sheet requires separate specialized machines for scanning and marking. The sheets used by these machines are special and cost more than a normal sheet. The available process is uneconomical and depends on paper thickness, scanning quality, paper orientation, special hardware, and customized software. This study tackles the problem of evaluating OMR sheets without any special hardware, making the whole process economical. We propose an image-processing-based algorithm which can be used to read and evaluate scanned OMR sheets with no special hardware required. It eliminates the use of special OMR sheets; responses recorded on a normal sheet are enough for evaluation. The proposed system takes care of color, brightness, rotation, and small imperfections in the OMR sheet images.
Keywords: OMR, image processing, Hough circle transform, interpolation, detection, binary thresholding
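A toy sketch of the core idea follows (the coordinates and threshold are hypothetical, and the paper's actual pipeline additionally handles rotation and uses the Hough circle transform to locate bubbles): a bubble counts as marked when the mean intensity inside its region falls below a darkness threshold.

```python
import numpy as np

def read_marks(gray, bubbles, dark_thresh=100):
    """gray: 2-D uint8 image (0 = black, 255 = white).
    bubbles: {label: (r0, r1, c0, c1)} bounding boxes of answer bubbles.
    Returns the labels whose region is dark enough to count as filled."""
    marked = []
    for label, (r0, r1, c0, c1) in bubbles.items():
        if gray[r0:r1, c0:c1].mean() < dark_thresh:
            marked.append(label)
    return marked

# Synthetic sheet: a white page with one pencil-filled (dark) bubble at 'B'.
sheet = np.full((40, 40), 255, dtype=np.uint8)
sheet[10:20, 20:30] = 20  # darkened region
bubbles = {"A": (10, 20, 5, 15), "B": (10, 20, 20, 30)}
print(read_marks(sheet, bubbles))  # -> ['B']
```

The threshold-on-mean-intensity step is essentially the binary thresholding the keywords mention; a production reader would first deskew the scan and detect bubble centers automatically.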
Procedia PDF Downloads 173
11797 Functional to Business Process Orientation in Business Schools
Authors: Sunitha Thappa
Abstract:
The business environment is a set of complex, interdependent dimensions, and corporates always have to be vigilant in identifying influential waves. Over the years, the business environment has evolved into a basket of uncertainties. Every organization strives to counter this dynamic nature of the business environment by recurrently evaluating the primary and support activities of its value chain. This has led to companies redesigning their business models and reinventing business processes and operating procedures on an unremitting basis. A few specific issues placed before present-day managers are breaking down the functional interpretation of any challenge an organization confronts, reducing organizational hierarchy, and tackling the components of the value chain to retain competitive advantage. It is how effectively managers detect changes and how swiftly they reorient themselves to those changes that define their success or failure. Given the complexity of decision-making in this dynamic environment, two important questions are placed before the B-schools of today. Firstly, are they grooming and nurturing managerial talent proficient enough to thrive in this multifaceted business environment? Secondly, are the management graduates walking through their portals able to view challenges from a cross-functional perspective, with emphasis on customer and process rather than hierarchy and functions? This paper focuses on the need for a process-oriented approach to management education.
Keywords: management education, pedagogy, functional, process
Procedia PDF Downloads 332
11796 Economic Environment and Entrepreneurial Development in Lagos and Ogun States, Nigeria
Authors: Jayeola Olabisi, T. Olawale Oladunjoye, Ademola A. Adewumi
Abstract:
The study empirically examines the relationship between the economic environment and entrepreneurial development (ED) in Nigeria. A structured questionnaire was administered, and the data collected were analysed using Analysis of Variance and regression. The following variables are the indices of determination: Interest Rate (IR) and Income Tax (IT). The results of the study show that there is a significant relationship between IR and ED in Nigeria (p < 0.05), with a positive correlation (r = 0.526, r² = 0.276). Also, there is a significant relationship between IT and ED in Nigeria (p < 0.05), with a positive association (r = 0.546; r² = 0.299). The study concludes that the emergence of a more stable economic environment is critical to entrepreneurial development in Nigeria. Therefore, government involvement in public-private partnerships for infrastructural development, the enlargement of production, the judicious and transparent use of funds collected from income tax, and affordable interest rates will galvanise the inward sourcing of raw materials that boosts entrepreneurial development in Nigeria.
Keywords: interest rate, income tax, business environment and entrepreneurial development
Procedia PDF Downloads 361
11795 Achieving Flow at Work: An Experience Sampling Study to Comprehend How Cognitive Task Characteristics and Work Environments Predict Flow Experiences
Authors: Jonas De Kerf, Rein De Cooman, Sara De Gieter
Abstract:
For many decades, scholars have aimed to understand how work can become more meaningful, both by maximizing potential and by enhancing feelings of satisfaction. One of the largest contributions to such positive psychology was made with the introduction of the concept of 'flow', which refers to a condition in which people feel intense engagement and effortless action. Since then, valuable research on work-related flow has indicated that this state of mind is related to positive outcomes for both organizations (e.g., social, supportive climates) and workers (e.g., job satisfaction). Yet scholars still do not fully comprehend how such deep involvement at work is obtained, given that flow is considered a short-term, complex, and dynamic experience. Most research neglects the notion that people who experience flow ought to be optimally challenged so that intense concentration is required. Because attention is at the core of this enjoyable state of mind, this study aims to comprehend how elements that affect workers’ cognitive functioning impact flow at work. Research on cognitive performance suggests that working on mentally demanding tasks (e.g., information-processing tasks) requires workers to concentrate deeply, which can lead to flow experiences. Based on social facilitation theory, working on such tasks in an isolated environment eases concentration. Prior research has indicated that working at home (instead of at the office) or in a closed office (rather than an open-plan office) impacts employees’ overall functioning in terms of concentration and productivity. Consequently, we advance this knowledge and propose an interaction, combining cognitive task characteristics and work environments among part-time teleworkers.
Hence, we not only aim to shed light on the relation between cognitive tasks and flow but also provide empirical evidence on whether workers performing such tasks achieve the highest states of flow while working either at home or in closed offices. In July 2022, an experience-sampling study will be conducted that uses a semi-random signal schedule to understand how task and environment predictors together impact part-time teleworkers’ flow. More precisely, about 150 knowledge workers will fill in multiple surveys a day for two consecutive workweeks to report their flow experiences, cognitive tasks, and work environments. Preliminary results from a pilot study indicate that, at the between-person level, tasks high in information processing go along with high self-reported fluent productivity (i.e., making progress). As expected, evidence was found for higher fluency in productivity for workers performing information-processing tasks both at home and in a closed office, compared to those performing the same tasks at the office or in open-plan offices. This study expands the current knowledge on work-related flow by looking at task and environmental predictors that enable workers to attain such a peak state. In doing so, our findings suggest that practitioners should strive for an ideal alignment between tasks and work locations so that people can work with both deep involvement and gratification.
Keywords: cognitive work, office lay-out, work location, work-related flow
Procedia PDF Downloads 100
11794 Mixotrophic Growth of Chlorella sp. on Raw Food Processing Industrial Wastewater: Effect of COD Tolerance
Authors: Suvidha Gupta, R. A. Pandey, Sanjay Pawar
Abstract:
The effluents from various food processing industries have high BOD, COD, suspended solids, nitrate, and phosphate. Mixotrophic growth of microalgae using food processing industrial wastewater as an organic carbon source has emerged as an effective means of nutrient removal and COD reduction. The present study details the treatment of non-sterilized, unfiltered food processing industrial wastewater by microalgae for nutrient removal, and determines the tolerance to COD by taking different dilutions of wastewater. In addition, the effect of different inoculum percentages of microalgae on the removal efficiency of nutrients at a given dilution has been studied. To see the effect of dilution and COD tolerance, the wastewater, having an initial COD of 5000 mg/L (±5), nitrate of 28 mg/L (±10), and phosphate of 24 mg/L (±10), was diluted to obtain COD values of 3000 mg/L and 1000 mg/L. The experiments were carried out in 1 L conical flasks with intermittent aeration and with different inoculum percentages (10%, 20%, and 30%) of Chlorella sp. isolated from an area near NEERI, Nagpur. The experiments were conducted for 6 days with a 12:12 light-dark period, and various parameters such as COD, TOC, NO3⁻-N, PO4³⁻-P, and total solids were determined on a daily basis. Results revealed that, for 10% and 20% inoculum, over 90% COD and TOC reduction was obtained with wastewater containing 3000 mg/L COD, whereas over 80% COD and TOC reduction was obtained with wastewater containing 1000 mg/L COD. Moreover, the microalgae were found to tolerate wastewater containing 5000 mg/L COD, with over 60% and 80% reduction in COD and TOC, respectively. Similar results were obtained with 10% and 20% inoculum at all COD dilutions, whereas for 30% inoculum over 60% COD and 70% TOC reduction was obtained. In the case of nutrient removal, over 70% nitrate removal and 45% phosphate removal was obtained with 20% inoculum at all dilutions.
The obtained results indicated that microalgae-assisted nutrient removal gives the maximum COD and TOC reduction at 3000 mg/L COD with 20% inoculum. Hence, microalgae-assisted wastewater treatment is not only effective for the removal of nutrients but can also tolerate high COD, up to 5000 mg/L, and high solid content.
Keywords: Chlorella sp., chemical oxygen demand, food processing industrial wastewater, mixotrophic growth
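The percentage reductions reported above follow from a simple removal-efficiency calculation; a minimal sketch, where the day-0 and day-6 COD values are illustrative placeholders, not the study's measurements:

```python
# Removal efficiency as percent reduction of a pollutant parameter
# (applies equally to COD, TOC, NO3-N, or PO4-P). Values are hypothetical.
def removal_efficiency(initial_mg_l, final_mg_l):
    return 100.0 * (initial_mg_l - final_mg_l) / initial_mg_l

cod_day0, cod_day6 = 3000.0, 270.0   # mg/L, illustrative day-0 / day-6 readings
eff = removal_efficiency(cod_day0, cod_day6)
print(round(eff, 1))  # 91.0, i.e. inside the "over 90%" band reported above
```

The same one-liner, applied per parameter per day, yields the daily removal curves the study tracks over its 6-day run.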
Procedia PDF Downloads 331
11793 Tool Condition Monitoring of Ceramic Inserted Tools in High Speed Machining through Image Processing
Authors: Javier A. Dominguez Caballero, Graeme A. Manson, Matthew B. Marshall
Abstract:
Cutting tools with ceramic inserts are often used in the machining of many types of superalloy, mainly due to their high strength and thermal resistance. Nevertheless, during the cutting process, the plastic flow wear generated in these inserts initiates and propagates cracks due to high temperature and high mechanical stress. This leads to highly variable failure of the cutting tool. This article explores the relationship between the continuous wear that ceramic SiAlON (solid solutions based on the Si3N4 structure) inserts experience during a high-speed machining process and the evolution of the sparks created during the same process. These sparks were analysed through pictures of the cutting process recorded using an SLR camera. Features relating to the intensity and area of the cutting sparks were extracted from the individual pictures using image processing techniques. These features were then related to the ceramic insert's crater wear area.
Keywords: ceramic cutting tools, high speed machining, image processing, tool condition monitoring, tool wear
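The spark-feature extraction described can be sketched as thresholding a grayscale frame and measuring spark area and mean intensity; the tiny synthetic "frame" and the threshold of 200 are assumptions for illustration, not the authors' pipeline:

```python
# Extract simple spark features from a grayscale frame (0-255 pixel values):
# area = count of bright pixels, mean intensity over those pixels.
def spark_features(frame, threshold=200):
    bright = [p for row in frame for p in row if p > threshold]
    if not bright:
        return 0, 0.0
    return len(bright), sum(bright) / len(bright)

frame = [                     # toy 3x4 frame; four "spark" pixels > 200
    [10, 12, 250, 245],
    [11, 220, 255, 13],
    [ 9,  10,  12, 11],
]
area, mean_intensity = spark_features(frame)
print(area, mean_intensity)  # 4 242.5
```

Per-frame features like these, tracked over the recording, are the kind of time series one would then correlate against measured crater wear area.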
Procedia PDF Downloads 298
11792 A Study of the Planning and Designing of the Built Environment under the Green Transit-Oriented Development
Authors: Wann-Ming Wey
Abstract:
In recent years, the problems of global climate change and natural disasters have drawn public attention to environmental sustainability issues. Alongside environmental planning efforts for the human environment, Transit-Oriented Development (TOD) has been widely regarded as one of the solutions for sustainable city development. To be more consistent with sustainable urban development, the built environment planning adopted here is based on the concept of Green TOD, which combines TOD with Green Urbanism. Urban development under Green TOD encompasses design for environmental protection, maximum enhancement of resource and energy-use efficiency, the use of technology to construct green buildings, and the linking of protected areas, natural ecosystems, and communities. Green TOD not only provides a solution to urban traffic problems but also directs future urban development planning and design toward more sustainable and greener considerations. In this study, we use both the TOD and Green Urbanism concepts to study built environment planning and design. The Fuzzy Delphi Technique (FDT) is utilized to screen suitable criteria for Green TOD. Furthermore, the Fuzzy Analytic Network Process (FANP) and Quality Function Deployment (QFD) were then applied to evaluate the criteria and prioritize the alternatives. The study results can serve as future guidelines for built environment planning and design under Green TOD development in Taiwan.
Keywords: green TOD, built environment, fuzzy delphi technique, quality function deployment, fuzzy analytic network process
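The Fuzzy Delphi screening step can be sketched as follows: experts rate each criterion with triangular fuzzy numbers (l, m, u), the ratings are aggregated (min of l, mean of m, max of u), defuzzified by a simple centroid, and criteria below an acceptance threshold are dropped. The criteria names, ratings, and the 0.7 threshold below are illustrative assumptions, not the study's data:

```python
# Fuzzy Delphi screening with triangular fuzzy numbers (TFNs).
def fuzzy_delphi_screen(ratings, threshold=0.7):
    result = {}
    for criterion, tfns in ratings.items():
        l = min(t[0] for t in tfns)              # most pessimistic lower bound
        m = sum(t[1] for t in tfns) / len(tfns)  # mean of the modal values
        u = max(t[2] for t in tfns)              # most optimistic upper bound
        score = (l + m + u) / 3.0                # centroid defuzzification
        result[criterion] = (round(score, 3), score >= threshold)
    return result

ratings = {  # three hypothetical expert ratings per criterion, each (l, m, u)
    "mixed land use":    [(0.5, 0.7, 0.9), (0.7, 0.9, 1.0), (0.5, 0.7, 0.9)],
    "parking abundance": [(0.1, 0.3, 0.5), (0.3, 0.5, 0.7), (0.1, 0.3, 0.5)],
}
print(fuzzy_delphi_screen(ratings))
```

Criteria that survive this screen would then feed the FANP weighting and QFD prioritization stages described in the abstract.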
Procedia PDF Downloads 384
11791 Optimizing Machine Learning Through Python Based Image Processing Techniques
Authors: Srinidhi. A, Naveed Ahmed, Twinkle Hareendran, Vriksha Prakash
Abstract:
This work reviews some advanced image processing techniques for deep learning applications. Object detection by template matching, image denoising, edge detection, and super-resolution modelling are but a few of the tasks covered. The paper looks at these in great detail, given that such tasks are crucial preprocessing steps that increase the quality and usability of image datasets in subsequent deep learning tasks. We review some methods for the assessment of image quality, more specifically sharpness, which is crucial to ensure robust model performance. Further, we discuss the development of deep learning models specific to facial emotion detection, age classification, and gender classification, showing how the preprocessing techniques interrelate with model performance. Conclusions from this study pinpoint best practices in the preparation of image datasets, targeting the best trade-off between computational efficiency and the retention of important image features critical for effective training of deep learning models.
Keywords: image processing, machine learning applications, template matching, emotion detection
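Object detection by template matching, the first task named above, can be sketched as a brute-force sum-of-squared-differences search; the 5x5 "image" and 2x2 template are toy placeholders:

```python
# Slide tmpl over image; return the (row, col) offset minimizing the SSD.
def match_template(image, tmpl):
    H, W, h, w = len(image), len(image[0]), len(tmpl), len(tmpl[0])
    best, best_pos = float("inf"), None
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            ssd = sum((image[r + i][c + j] - tmpl[i][j]) ** 2
                      for i in range(h) for j in range(w))
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

image = [
    [1, 1, 2, 1, 0],
    [0, 2, 1, 1, 1],
    [1, 9, 8, 2, 0],
    [2, 7, 9, 1, 1],
    [0, 1, 2, 0, 1],
]
tmpl = [[9, 8], [7, 9]]
print(match_template(image, tmpl))  # (2, 1)
```

Production code would use a vectorized or FFT-based correlation instead of this O(HWhw) loop, but the matching criterion is the same.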
Procedia PDF Downloads 13
11790 Fundamental Study on Reconstruction of 3D Image Using Camera and Ultrasound
Authors: Takaaki Miyabe, Hideharu Takahashi, Hiroshige Kikura
Abstract:
The Government of Japan and Tokyo Electric Power Company Holdings, Incorporated (TEPCO) are struggling with the decommissioning of the Fukushima Daiichi Nuclear Power Plants, especially fuel debris retrieval. In fuel debris retrieval, information on the amount, location, characteristics, and distribution of the fuel debris is important. Recently, a survey was conducted using a robot with a small camera. Progress reports from remote robot and camera research have suggested that fuel debris is present both at the bottom of the Primary Containment Vessel (PCV) and inside the Reactor Pressure Vessel (RPV). The investigation found a 'tie plate' at the bottom of the containment; this is a handle on the fuel assembly. As a result, it is assumed that a hole large enough to allow the tie plate to fall through has opened at the bottom of the reactor pressure vessel. Therefore, exploring the existence of holes that lead into the RPV is also an issue. Investigations of the lower part of the RPV are currently underway, but no investigations have been made inside or above the PCV. Therefore, such a survey must be conducted for future fuel debris retrieval. The environment inside the RPV cannot be imagined precisely due to the effects of the melted fuel, so a way to accurately check the internal situation is needed. What we propose here is the adaptation of a technology called 'Structure from Motion' that reconstructs a 3D image from multiple photos taken by a single camera. The plan is to mount a monocular camera on the tip of a long-arm robot, extend it to the upper part of the PCV, and take video. We are currently building a long-arm robot for use in high-radiation environments. However, the environment above the pressure vessel is not known exactly. Also, fog may be generated by the cooling water of the fuel debris, and the radiation level in the environment may be high.
Since a camera alone cannot provide sufficient sensing in these environments, we further propose using ultrasonic measurement technology in addition to the camera. Ultrasonic sensors are resistant to environmental changes such as fog and to environments with high radiation doses, and such systems can be used for a long time. The purpose is to develop a system adapted to the inside of the containment vessel by combining a camera and ultrasound. Therefore, in this research, we performed a basic experiment on 3D image reconstruction using a camera and ultrasound. In this report, we identify the conditions under which each sensing modality performs well or poorly, and propose a reconstruction and detection method. The results revealed the strengths and weaknesses of each approach.
Keywords: camera, image processing, reconstruction, ultrasound
Procedia PDF Downloads 104
11789 Reduction of Residual Stress by Variothermal Processing and Validation via Birefringence Measurement Technique on Injection Molded Polycarbonate Samples
Authors: Christoph Lohr, Hanna Wund, Peter Elsner, Kay André Weidenmann
Abstract:
Injection molding is one of the most commonly used techniques in industrial polymer processing. In the conventional injection molding process, the liquid polymer is injected into the cavity of the mold, where the polymer immediately starts hardening at the cooled walls. To compensate for the shrinkage, which is caused predominantly by the immediate cooling, holding pressure is applied. Throughout this process, residual stresses are produced by the temperature difference between the polymer melt and the injection mold, and by the relocation of the polymer chains, which were oriented by the high process pressures and injection speeds. These residual stresses often weaken or change the structural behavior of the parts or lead to deformation of components. One solution to reduce the residual stresses is the use of variothermal processing. Here the mold is heated, i.e., near or above the glass transition temperature of the polymer, the polymer is injected, and before opening the mold and ejecting the part, the mold is cooled. For the next cycle, the mold is heated again and the procedure repeats. The rapid heating and cooling of the mold are realized indirectly by convection of heated and cooled liquid (here: water), which is pumped through fluid channels underneath the mold surface. In this paper, the influences of variothermal processing on the residual stresses are analyzed with samples on a larger scale (500 mm x 250 mm x 4 mm). In addition, the influence of functional elements, such as abrupt changes in wall thickness, bosses, and ribs, on the residual stress is examined. Therefore, the polycarbonate samples are produced by variothermal and isothermal processing. The melt is injected into a heated mold, which in our case has a temperature varying between 70 °C and 160 °C. After the filling of the cavity, the closed mold is cooled down, varying from 70 °C to 100 °C. The pressure and temperature inside the mold are monitored and evaluated with cavity sensors.
The residual stresses of the produced samples are visualized by birefringence, which exploits the effect of stress on the refractive index of the polymer. The colorful spectrum can be uncovered by placing the sample between a polarized light source and a second polarization filter. To show the effect of the processing on the reduction of residual stress, the birefringence images of the isothermally and variothermally produced samples are compared and evaluated. In this comparison, the variothermally produced samples show fewer maxima in each color spectrum than the isothermally produced samples, which indicates that the residual stress of the variothermally produced samples is lower.
Keywords: birefringence, injection molding, polycarbonate, residual stress, variothermal processing
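The link between counted birefringence fringes and stress is the stress-optic law, sigma1 - sigma2 = N * f_sigma / t, with fringe order N, material fringe value f_sigma, and thickness t. A minimal sketch; the fringe order and fringe value below are invented for illustration, only the 4 mm thickness matches the samples described:

```python
# Principal stress difference from the stress-optic law.
# f_sigma in N/mm per fringe, thickness in mm -> result in MPa (N/mm^2).
def principal_stress_difference(fringe_order, fringe_value, thickness_mm):
    return fringe_order * fringe_value / thickness_mm

# 4 mm thick sample (as in the study); hypothetical N = 3, f_sigma = 7 N/mm:
print(principal_stress_difference(3, 7.0, 4.0))  # 5.25 MPa
```

Counting fewer color maxima (lower N) in the variothermal samples therefore translates directly into a lower residual stress estimate at the same thickness.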
Procedia PDF Downloads 283
11788 Understanding the Heart of the Matter: A Pedagogical Framework for Apprehending Successful Second Language Development
Authors: Cinthya Olivares Garita
Abstract:
Untangling language processing in second language development has been either a taken-for-granted, overlooked task for some English language teaching (ELT) instructors or a considerable feat for others. From the most traditional language instruction to the most communicative methodologies, how to assist L2 learners in processing language in the classroom has become a challenging matter in second language teaching. Amidst an ample array of methods, strategies, and techniques to teach a target language, finding a suitable model to lead learners to process, interpret, and negotiate meaning to communicate in a second language has imposed a great responsibility on language teachers; committed teachers are those who are aware of their role in equipping learners with the appropriate tools to communicate in the target language in a 21st-century society. Unfortunately, one might find some English language teachers convinced that their job is only to lecture students; others are advocates of textbook-based instruction that might hinder second language processing, and just a few might courageously struggle to facilitate second language learning effectively. Grounded on the most representative empirical studies on comprehensible input, processing instruction, and focus on form, this analysis aims to facilitate the understanding of how second language learners process and automatize input, and proposes a pedagogical framework for the successful development of a second language. In light of this, this paper is structured to tackle noticing, attention, and structured input as the heart of processing instruction, comprehensible input as the missing link in second language learning, and form-meaning connections as opposed to traditional grammar approaches to language teaching.
The author finishes by suggesting a pedagogical framework involving noticing, attention, comprehensible input, and form (NACIF, based on the acronym) to support ELT instructors, teachers, and scholars in the challenging task of facilitating the understanding of effective second language development.
Keywords: second language development, pedagogical framework, noticing, attention, comprehensible input, form
Procedia PDF Downloads 28
11787 The Security Challenges of Urbanization and Environmental Degradation in the Niger-Delta Area of Nigeria
Authors: Gloria Ogungbade, Ogaba Oche, Moses Duruji, Chris Ehiobuche, Lady Ajayi
Abstract:
Humans' continued sustenance on earth and their quality of living are heavily dependent on the environment. The major components of the environment, air, water, and land, are the supporting pillars of human existence, depended on directly or indirectly for survival and well-being. Unfortunately, due to some human activities, there seems to be a war between humans and the environment, which is evident in the over-exploitation and inadequate management of the basic components of the environment. Since the discovery of crude oil in the Niger Delta, the region has experienced various forms of degradation caused by pollution from oil spillage, gas flaring, and other forms of environmental pollution, as a result of the reckless manner in which oil is exploited by the International Oil Corporations (IOCs) operating within the region. The Nigerian government, on the other hand, not having strong regulations guiding the operations of these IOCs, has done almost nothing to curtail their activities because of the revenue the IOCs generate; as such, the region is deprived of basic social amenities and infrastructure. The degree of environmental pollution suffered within the region affects its major sources of livelihood, fishing and farming, and has also left the region in poverty, which has led to a large number of people migrating to urban areas to escape poverty. This paper investigates how environmental degradation impacts urbanization and security in the region.
Keywords: environmental degradation, environmental pollution, gas flaring, oil spillage, urbanization
Procedia PDF Downloads 289
11786 Teaching Environment and Instructional Materials on Students’ Performance in English Language: Implications for Counselling
Authors: Rosemary Saidu, Taiyelolu Martins Ogunjirin
Abstract:
The study examines the effects of the teaching environment and instructional materials on the performance of students in the English language in selected secondary schools in Ogun State, and their implications for counselling. Two research questions were developed to guide the study. The study adopted a descriptive survey design, and a multi-stage sampling technique was employed. A sample of 100 students of Senior Secondary School Two (SSS 2) was drawn, and a purposive sampling technique was used to select the five schools. Additionally, the instruments known as the Teaching Environment and Instructional Materials on Students' Performance in English Inventory (TEIMEI) and Student Achievement Scores (SAS) were used to elicit information. Thereafter, inferential statistics and the non-parametric chi-square statistic at a 0.05 alpha level and 3 degrees of freedom were adopted as analytical tools. From the study, it was discovered, among other findings, that the teaching environment and instructional materials significantly contributed to the performance of students in the English language. From the findings, it was recommended, among others, that functional language laboratories be provided in the schools and that counsellors regularly give guidance talks on the importance of the subject.
Keywords: performance, English language, teaching environment, instructional materials
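The chi-square decision at a 0.05 alpha level with 3 degrees of freedom can be sketched as below; the observed response counts are hypothetical, and 7.815 is the standard critical value for df = 3:

```python
# Pearson chi-square statistic over matched observed/expected counts.
def chi_square(observed, expected):
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

observed = [40, 25, 20, 15]   # hypothetical responses over 4 categories
expected = [25, 25, 25, 25]   # equal split under the null hypothesis
stat = chi_square(observed, expected)
critical = 7.815              # chi-square critical value, alpha = 0.05, df = 3
print(stat, stat > critical)  # 14.0 True -> reject the null hypothesis
```

With 4 response categories, df = 4 - 1 = 3, matching the degrees of freedom stated in the abstract.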
Procedia PDF Downloads 157
11785 Evaluating the Implementation of Machine Learning Techniques in the South African Built Environment
Authors: Peter Adekunle, Clinton Aigbavboa, Matthew Ikuabe, Opeoluwa Akinradewo
Abstract:
The future of machine learning (ML) in building may seem like a distant idea that will take decades to materialize, but it is actually far closer than previously believed. In reality, the built environment has shown progressively increasing interest in machine learning. Although it may appear to be a very technical, impersonal approach, it can actually make things more personable: instead of eliminating humans from the equation, machine learning allows people to do their real work more efficiently. It is therefore vital to evaluate the factors influencing the implementation, and the challenges of implementing, machine learning techniques in the South African built environment. The study adopted a survey design. In South Africa, construction workers and professionals were given a total of one hundred and fifty (150) questionnaires, of which one hundred and twenty-four (124) were returned and deemed eligible for the study. The collected data were analyzed using percentages, mean item scores, standard deviation, and the Kruskal-Wallis test. The results demonstrate that the top factors influencing the adoption of machine learning are knowledge level and a lack of understanding of its potential benefits, while lack of collaboration among stakeholders and lack of tools and services are the key hurdles to the deployment of machine learning within the South African built environment. The study concluded that ML adoption should be promoted in order to increase safety, productivity, and service quality within the built environment.
Keywords: machine learning, implementation, built environment, construction stakeholders
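The mean-item-score ranking used in the analysis can be sketched as below; the factor names and 5-point Likert frequency counts are invented placeholders (each summing to the 124 eligible returns), not the survey's data:

```python
# Mean item score (MIS) on a 5-point Likert scale:
# freqs[k] = number of respondents choosing rating k + 1.
def mean_item_score(freqs):
    return sum((k + 1) * f for k, f in enumerate(freqs)) / sum(freqs)

responses = {                          # hypothetical counts over 124 returns
    "knowledge level":        [2, 6, 16, 40, 60],
    "cost of implementation": [10, 20, 44, 30, 20],
}
ranked = sorted(responses, key=lambda k: mean_item_score(responses[k]),
                reverse=True)
print(ranked[0])  # knowledge level -> the highest-ranked factor
```

Factors are then reported in descending MIS order, with Kruskal-Wallis used separately to test for differences between respondent groups.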
Procedia PDF Downloads 132
11784 Image Processing of Scanning Electron Microscope Micrograph of Ferrite and Pearlite Steel for Recognition of Micro-Constituents
Authors: Subir Gupta, Subhas Ganguly
Abstract:
In this paper, we demonstrate a new area of application of image processing to metallurgical images, creating more opportunity for structure-property-correlation-based approaches to alloy design. The present exercise focuses on the development of image processing tools suitable for phase segmentation, grain boundary detection, and recognition of micro-constituents in SEM micrographs of ferrite and pearlite steels. A comprehensive set of micrographs has been experimentally developed, encompassing variation of the ferrite and pearlite volume fractions and taking images at different magnifications (500X, 1000X, 1500X, 2000X, 3000X, and 5000X) under a scanning electron microscope. The variation in the volume fractions has been achieved using four different plain carbon steels containing 0.1, 0.22, 0.35, and 0.48 wt% C, heat treated under annealing and normalizing treatments. The obtained pool of micrographs was arbitrarily divided into two parts to develop training and testing sets of micrographs. The statistical recognition features for the ferrite and pearlite constituents were developed by learning from the training set of micrographs. The obtained features for microstructure pattern recognition were then applied to the test set of micrographs. The analysis of the results shows that the developed strategy can successfully detect the micro-constituents across the wide range of magnifications and volume fractions of the constituents in the structure, with an accuracy of about ±5%.
Keywords: SEM micrograph, metallurgical image processing, ferrite pearlite steel, microstructure
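Phase segmentation of a two-phase (ferrite/pearlite) micrograph is often bootstrapped with an automatic intensity threshold; a from-scratch Otsu sketch on a synthetic bimodal pixel list (the method choice and the toy data are our illustration, not necessarily the authors' pipeline):

```python
# Otsu's method: pick the threshold maximizing between-class variance.
def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var, w0, sum0 = 0, -1.0, 0, 0.0
    for t in range(levels):
        w0 += hist[t]                 # class-0 weight up to threshold t
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Synthetic bimodal "micrograph": dark phase ~30, bright phase ~200.
pixels = [30] * 120 + [200] * 80
th = otsu_threshold(pixels)
bright_fraction = sum(p > th for p in pixels) / len(pixels)
print(th, bright_fraction)  # threshold between the modes; 0.4 bright fraction
```

The fraction of pixels on each side of the threshold gives a first estimate of the phase volume fractions, which can then be refined by grain-boundary detection and the statistical recognition features described above.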
Procedia PDF Downloads 199
11783 Development of Fake News Model Using Machine Learning through Natural Language Processing
Authors: Sajjad Ahmed, Knut Hinkelmann, Flavio Corradini
Abstract:
Fake news detection research is still at an early stage, as this is a relatively new phenomenon in terms of the interest raised by society. Machine learning helps to solve complex problems and to build AI systems nowadays, especially in those cases involving tacit knowledge or knowledge that is not explicitly known. For the identification of fake news, we applied three machine learning classifiers: Passive Aggressive, Naïve Bayes, and Support Vector Machine. Simple classification alone is not sufficient for fake news detection because general-purpose classification methods are not specialized for fake news. With the integration of machine learning and text-based processing, we can detect fake news and build classifiers that can classify the news data. Text classification mainly focuses on extracting various features of text and then incorporating those features into classification. The big challenge in this area is the lack of an efficient way to differentiate between fake and non-fake news due to the unavailability of suitable corpora. We applied the three machine learning classifiers on two publicly available datasets. Experimental analysis based on the existing datasets indicates very encouraging and improved performance.
Keywords: fake news detection, natural language processing, machine learning, classification techniques
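A from-scratch multinomial Naïve Bayes, one of the three classifiers named, can be sketched on a toy corpus; the miniature "fake"/"real" documents below are invented placeholders, not the study's datasets:

```python
import math
from collections import Counter

class NaiveBayes:
    """Multinomial Naive Bayes with Laplace (add-one) smoothing."""
    def fit(self, docs, labels):
        self.classes = sorted(set(labels))
        self.prior = Counter(labels)                 # class frequencies
        self.words = {c: Counter() for c in self.classes}
        for d, y in zip(docs, labels):
            self.words[y].update(d.split())          # per-class word counts
        self.vocab = {w for c in self.classes for w in self.words[c]}
        return self

    def predict(self, doc):
        n, V = sum(self.prior.values()), len(self.vocab)
        scores = {}
        for c in self.classes:
            total = sum(self.words[c].values())
            lp = math.log(self.prior[c] / n)         # log prior
            for w in doc.split():                    # log likelihood, smoothed
                lp += math.log((self.words[c][w] + 1) / (total + V))
            scores[c] = lp
        return max(scores, key=scores.get)

docs = ["shocking miracle cure revealed",
        "celebrity secret shocking trick",
        "government report confirms data",
        "official study confirms findings"]
labels = ["fake", "fake", "real", "real"]
clf = NaiveBayes().fit(docs, labels)
print(clf.predict("shocking miracle trick"))  # fake
```

In practice one would replace the raw word counts with TF-IDF or n-gram features and train on a real labeled corpus, but the classification rule is the same.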
Procedia PDF Downloads 167
11782 Induction Machine Bearing Failure Detection Using Advanced Signal Processing Methods
Authors: Abdelghani Chahmi
Abstract:
This article examines the detection and localization of faults in electrical systems, particularly those using asynchronous machines. First, the failure process is characterized and relevant symptoms are defined; based on those processes and symptoms, a model of the malfunctions is obtained. Second, the development of the diagnosis of the machine is shown. As studies of malfunctions in electrical systems can only rely on a small amount of experimental data, it has been essential to equip ourselves with simulation tools that allow us to characterize the faulty behavior. Fault detection uses signal processing techniques in known operating phases.
Keywords: induction motor, modeling, bearing damage, airgap eccentricity, torque variation
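A standard signal processing route for bearing diagnosis is spectral analysis of a vibration or stator-current signal, looking for energy at characteristic defect frequencies. A hedged sketch on a synthetic signal, where the 162 Hz "defect tone" is an invented placeholder for a bearing fault frequency:

```python
import numpy as np

fs = 2048                       # sampling rate, Hz; 1 s window -> 1 Hz bins
t = np.arange(fs) / fs
# Supply-related component at 50 Hz plus a weaker hypothetical defect tone:
x = np.sin(2 * np.pi * 50 * t) + 0.4 * np.sin(2 * np.pi * 162 * t)

amp = np.abs(np.fft.rfft(x)) * 2 / fs      # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(fs, 1 / fs)
peaks = freqs[amp > 0.1]                   # simple amplitude gate
print(peaks)  # [ 50. 162.]
```

In a real diagnosis, the gate would compare the measured spectrum against the bearing's computed defect frequencies (BPFO, BPFI, etc.) derived from its geometry and shaft speed.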
Procedia PDF Downloads 139
11781 Normalized P-Laplacian: From Stochastic Game to Image Processing
Authors: Abderrahim Elmoataz
Abstract:
More and more contemporary applications involve data in the form of functions defined on irregular and topologically complicated domains (images, meshes, point clouds, networks, etc.). Such data are not organized as familiar digital signals and images sampled on regular lattices. However, they can be conveniently represented as graphs, where each vertex represents measured data and each edge represents a relationship (connectivity, or certain affinities or interactions) between two vertices. Processing and analyzing these types of data is a major challenge for both the image and machine learning communities. Hence, it is very important to transfer to graphs and networks many of the mathematical tools which were initially developed on usual Euclidean spaces and proven to be efficient for many inverse problems and applications dealing with usual image and signal domains. Historically, the main tools for the study of graphs or networks come from combinatorics and graph theory. In recent years, there has been an increasing interest in the investigation of one of the major mathematical tools for signal and image analysis: Partial Differential Equation (PDE) and variational methods on graphs. The normalized p-Laplacian operator has recently been introduced to model a stochastic game called the tug-of-war game with noise. Part of the interest in this class of operators arises from the fact that it includes, as particular cases, the infinity Laplacian, the mean curvature operator, and the traditional Laplacian operator, which was extensively used to model and to solve problems in image processing. The purpose of this paper is to introduce and study a new class of normalized p-Laplacians on graphs. The introduction is based on the extension of p-harmonious functions, introduced as a discrete approximation for both the infinity Laplacian and p-Laplacian equations.
Finally, we propose to use these operators as a framework for solving many inverse problems in image processing.
Keywords: normalized p-laplacian, image processing, stochastic game, inverse problems
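The p-harmonious averaging behind the tug-of-war-with-noise interpretation can be sketched on a graph as below: each interior value is replaced by a convex combination of the max/min of its neighbors (the game term) and their mean (the noise term). The weight alpha (with beta = 1 - alpha) stands in for the p-dependent coefficients, and the 4x4 grid with linear boundary data is a toy example; linear functions remain fixed points for every alpha:

```python
# Iterate the p-harmonious update u = (a/2)(max_N u + min_N u) + (1-a) mean_N u
# with Dirichlet boundary data; alpha = 0 recovers ordinary harmonic averaging.
def p_harmonious(neighbors, boundary, alpha, iters=2000):
    u = {v: boundary.get(v, 0.0) for v in neighbors}
    interior = [v for v in neighbors if v not in boundary]
    for _ in range(iters):
        for v in interior:
            vals = [u[w] for w in neighbors[v]]
            u[v] = (alpha / 2) * (max(vals) + min(vals)) \
                   + (1 - alpha) * sum(vals) / len(vals)
    return u

n = 4  # 4x4 grid graph; boundary value = column index / 3 (linear in x)
nbrs = {(i, j): [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                 if 0 <= i + di < n and 0 <= j + dj < n]
        for i in range(n) for j in range(n)}
bnd = {(i, j): j / 3 for i in range(n) for j in range(n)
       if i in (0, n - 1) or j in (0, n - 1)}
u = p_harmonious(nbrs, bnd, alpha=0.5)
print(round(u[(1, 1)], 6), round(u[(2, 2)], 6))
```

For image inpainting or interpolation, the "boundary" nodes are the known pixels and the interior nodes the missing ones, with alpha tuning the operator between diffusive (p = 2) and infinity-Laplacian-like behavior.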
Procedia PDF Downloads 512