Search results for: automation tool
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5376

4446 Impact of Newspaper Coverage of 2015 General Elections in Nigeria

Authors: Shola H. Adeosun, Lekan M. Togunwa, Kolawole Z. Amos

Abstract:

This paper appraises newspaper coverage of the 2015 general election through a study of The Punch and The Guardian newspapers. The objectives of the study were to examine how credible newspaper reports of the 2015 election were and to examine the significant role Nigerian newspapers played in the 2015 general elections. The study also examined the extent to which the print media contributed to the success of the 2015 general election and ascertained the extent to which print media reports served as a tool for sensitizing the masses. The research questions that guided this research include: How credible were newspaper reports of the 2015 general election? To what extent did the print media contribute to the success of the 2015 general elections? To what extent did print media reports serve as a tool for sensitizing the masses? The research work was given a solid theoretical foundation with a review of agenda-setting theory, media system dependency theory, and normative theories. The study was conducted using the content analysis method of research, and 30 publications of both The Guardian and The Punch between January 1st and March 30th, 2015 formed the population for this research work. The dates and editions of the newspapers under study were selected using the composite week sampling technique. All days of the week were used because both newspapers (The Punch and The Guardian) are published every day of the week. A coding sheet was the data collection tool for the content analysis of this study. Findings of the study revealed that The Punch and The Guardian have played a significant role in curbing election malpractices in Nigeria. The study therefore concludes that the media aptly merit the metaphors of the watchdog of the nation and the mirror through which the nation sees and recognizes itself.
The study also recommends that the Nigerian media should strike a balance among entertainment, crisis, economic, law, education, terrorism, health, sport, and metropolitan stories instead of portraying the country as crime-oriented.

Keywords: newspaper, coverage, general elections, impact

Procedia PDF Downloads 335
4445 Health Literacy Levels of South African Primary Health Care Patients

Authors: Boitumelo Ditshwane, Zelda Janse van Rensburg, Wanda Jacobs

Abstract:

Health literacy is defined as the competencies and skills that individuals need to find, comprehend, evaluate, and use information to make knowledgeable choices that improve their health and well-being. Low health literacy has been found to affect people’s ability to take care of their own health. Incomprehension of health education and health care instructions often results from information being given at a level above the patient’s level of understanding. The study aimed to test the health literacy levels of South African primary health care (PHC) patients using a previously developed health literacy assessment tool. Determining health literacy levels may assist PHC nurses in providing health education and health care instructions at the patient’s level of understanding and, therefore, in ensuring positive health outcomes for the patient. A health literacy assessment tool, translated into ten official South African languages, was used to quantitatively determine the health literacy levels of 400 PHC patients in five clinics in Gauteng, South Africa. Patients’ health literacy levels were tested in English and in nine other official South African languages, and the results were compared. The results revealed that patients understand information better when it is given in their preferred language. Giving health education in a language, and at a level, that the patient understands well may lead to better health outcomes and prevent adverse events. Patients may better understand the instructions provided, be more likely to follow the correct route of medication, honor appointments, comply with medication, and thus have better treatment outcomes.

Keywords: health literacy, primary health care, South Africa, patients

Procedia PDF Downloads 79
4444 An in silico Approach for Exploring the Intercellular Communication in Cancer Cells

Authors: M. Cardenas-Garcia, P. P. Gonzalez-Perez

Abstract:

Intercellular communication is a necessary condition for cellular functions, and it allows a group of cells to survive as a population. Throughout this interaction, the cells work in a coordinated and collaborative way, which facilitates their survival. Cancerous cells take advantage of intercellular communication to preserve their malignancy, since through these physical unions they can send signals of malignancy. The Wnt/β-catenin signaling pathway plays an important role in the formation of intercellular communications and is also involved in a large number of cellular processes such as proliferation, differentiation, adhesion, cell survival, and cell death. The modeling and simulation of cellular signaling systems have found valuable support in a wide range of modeling approaches, covering a wide spectrum from mathematical models (e.g., ordinary differential equations, statistical methods, and numerical methods) to computational models (e.g., process algebras for modeling behavior and variation in molecular systems). Based on these models, different simulation tools have been developed, from mathematical to computational ones. The study of cellular and molecular processes in cancer has likewise found valuable support in different simulation tools that, covering a similar spectrum, have allowed the in silico experimentation of this phenomenon at the cellular and molecular level. In this work, we simulate and explore the complex interaction patterns of intercellular communication in cancer cells using the Cellulat bioinformatics tool, a computational simulation tool developed by us and motivated by two key elements: 1) a biochemically inspired model of self-organizing coordination in tuple spaces, and 2) Gillespie’s algorithm, a stochastic simulation algorithm typically used to mimic systems of chemical/biochemical reactions in an efficient and accurate way.
The main idea behind the Cellulat simulation tool is to provide an in silico experimentation environment that complements and guides in vitro experimentation on intra- and intercellular signaling networks. Unlike most cell signaling simulation tools, such as E-Cell, BetaWB, and Cell Illustrator, which provide abstractions to model only intracellular behavior, Cellulat is appropriate for modeling both intracellular signaling and intercellular communication, providing the abstractions required to model, and as a result simulate, the interaction mechanisms that involve two or more cells, which is essential in the scenario discussed in this work. During the development of this work, we demonstrated the application of our computational simulation tool (Cellulat) to the modeling and simulation of intercellular communication between normal and cancerous cells, and in this way propose key molecules that may prevent the arrival of malignant signals to the cells that surround the tumor cells. In this manner, we could identify the significant role that the Wnt/β-catenin signaling pathway plays in cellular communication and, therefore, in the dissemination of cancer cells. We verified, using in silico experiments, how the inhibition of this signaling pathway prevents the transformation of the cells that surround a cancerous cell.
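
Gillespie’s algorithm, named in the abstract as one of Cellulat’s two key elements, draws an exponentially distributed waiting time from the total reaction propensity and then fires one reaction. A minimal sketch for a single hypothetical irreversible reaction A → B; the rate constant and molecule count are illustrative, not taken from the Cellulat models:

```python
import math
import random

def gillespie_decay(n_a, k, seed=1):
    """Minimal Gillespie SSA for the irreversible reaction A -> B.

    n_a: initial number of A molecules; k: stochastic rate constant.
    Returns the trajectory as a list of (time, remaining A) points.
    """
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, n_a)]
    while n_a > 0:
        a0 = k * n_a                              # total propensity
        t += -math.log(1.0 - rng.random()) / a0   # exponential waiting time
        n_a -= 1                                  # fire the only reaction
        traj.append((t, n_a))
    return traj
```

With several reactions, a0 would sum all propensities and a second random number would select which reaction fires, weighted by each propensity’s share of a0.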

Keywords: cancer cells, in silico approach, intercellular communication, key molecules, modeling and simulation

Procedia PDF Downloads 249
4443 Student Project on Using a Spreadsheet for Solving Differential Equations by Euler's Method

Authors: Andriy Didenko, Zanin Kavazovic

Abstract:

Engineering students often have certain difficulties in mastering major theoretical concepts in mathematical courses such as differential equations. Student projects have been proposed to motivate students’ learning and can be used as a tool to promote students’ interest in the material. The authors propose a student project that includes the use of Microsoft Excel. This instructional tool is often overlooked by both educators and students. An integral component of the experimental part of such a project is the exploration of an interactive spreadsheet. The aim is to assist engineering students in better understanding Euler’s method, which is employed to numerically solve first-order differential equations. First, students are invited to select classic equations from a list presented in the form of a drop-down menu. For each of these equations, students can select and modify certain key parameters and observe the influence of the initial condition on the solution. This gives students an insight into the behavior of the method in different configurations, as solutions to equations are given in numerical and graphical forms. Further, students can also create their own equations by providing functions of their own choice and a variety of initial conditions. Moreover, they can visualize and explore the impact of the length of the time step on the convergence of a sequence of numerical solutions to the exact solution of the equation. As a final stage of the project, students are encouraged to develop their own spreadsheets for other numerical methods and other types of equations. Such projects promote students’ interest in mathematical applications and further improve their mathematical and programming skills.
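
The update rule the spreadsheet tabulates row by row can be sketched outside Excel as well. A minimal version, using the classic test equation y' = y as a hypothetical example of the kind of equation the drop-down menu might offer:

```python
def euler(f, t0, y0, h, n):
    """Euler's method for y' = f(t, y): n steps of size h from (t0, y0).

    Mirrors the spreadsheet columns: each new row adds h * f(t, y) to y.
    """
    ts, ys = [t0], [y0]
    for _ in range(n):
        y0 = y0 + h * f(t0, y0)   # y_{k+1} = y_k + h * f(t_k, y_k)
        t0 = t0 + h
        ts.append(t0)
        ys.append(y0)
    return ts, ys

# Test equation y' = y with y(0) = 1, whose exact solution is e^t.
ts, ys = euler(lambda t, y: y, 0.0, 1.0, 0.1, 10)
```

Shrinking h (while increasing n to keep the interval fixed) brings ys[-1] closer to e, which is exactly the convergence behavior the project asks students to explore.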

Keywords: student project, Euler's method, spreadsheet, engineering education

Procedia PDF Downloads 135
4442 A Case Study on the Value of Corporate Social Responsibility Systems

Authors: José M. Brotons, Manuel E. Sansalvador

Abstract:

The relationship between corporate social responsibility (CSR) and financial performance (FP) is a subject of great interest that has not yet been resolved. In this work, we have developed a new and original tool to measure this relationship. The tool quantifies the value contributed to companies that are committed to CSR. The theoretical model used is the fuzzy discounted cash flow method. Two scenarios have been considered: in the first, the company has implemented the IQNet SR10 certification, and in the second, it has not. For the first, the growth rate used for the time horizon is the rate maintained by the company after obtaining the IQNet SR10 certificate. For the second, both the company’s growth rates prior to the implementation of the certification and the evolution of the sector are taken into account. By using triangular fuzzy numbers, it is possible to deal adequately with each company’s forecasts as well as with the information corresponding to the sector. Once the annual growth rate of sales is obtained, the profit and loss accounts are generated from the estimated annual sales. The remaining elements of these accounts are obtained from their regression on net sales. The difference between these two valuations, made in a fuzzy environment, gives the value of the IQNet SR10 certification. Although this study presents an innovative methodology to quantify the relationship between CSR and FP, the authors are aware that only one company has been analyzed. This is precisely the main limitation of this study, which in turn opens up an interesting line for future research: to broaden the sample of companies.
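
The fuzzy discounted cash flow idea can be sketched with triangular fuzzy numbers written as (pessimistic, most likely, optimistic) triples. The crisp discount rate, the cash flow figures, and the two-period horizon below are simplifying assumptions for illustration, not the paper’s full model:

```python
def tfn_add(x, y):
    """Add two triangular fuzzy numbers given as (low, mode, high)."""
    return tuple(a + b for a, b in zip(x, y))

def tfn_scale(x, c):
    """Scale a triangular fuzzy number by a positive crisp factor."""
    return tuple(a * c for a in x)

def fuzzy_npv(cashflows, r):
    """Fuzzy discounted cash flow value of a stream of triangular fuzzy
    cash flows (pessimistic, most likely, optimistic) at crisp rate r."""
    value = (0.0, 0.0, 0.0)
    for t, cf in enumerate(cashflows, start=1):
        value = tfn_add(value, tfn_scale(cf, (1 + r) ** -t))
    return value

# Hypothetical two-period forecasts with and without the certification.
with_cert = fuzzy_npv([(95, 100, 108), (100, 106, 116)], 0.08)
without_cert = fuzzy_npv([(90, 95, 102), (92, 98, 106)], 0.08)
```

Fuzzy subtraction of the two valuations (low minus high, mode minus mode, high minus low) would then yield the certification’s value as a triangular fuzzy number.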

Keywords: corporate social responsibility, case study, financial performance, company valuation

Procedia PDF Downloads 187
4441 Applying Artificial Neural Networks to Predict Speed Skater Impact Concussion Risk

Authors: Yilin Liao, Hewen Li, Paula McConvey

Abstract:

Speed skaters often face a risk of concussion when they fall on the ice and impact crash mats during practices and competitive races. Several variables, including those related to the skater, the crash mat, and the impact position (body side/head/feet impact), are believed to influence the severity of the skater's concussion. While computer simulation modeling can be employed to analyze these accidents, the simulation process is time-consuming and does not provide rapid information for coaches and teams to assess the skater's injury risk in competitive events. This research paper explores the feasibility of using AI techniques to evaluate a skater’s potential concussion severity and develops a fast concussion prediction tool using artificial neural networks to reduce the risk of treatment delays for injured skaters. The primary data are collected through virtual tests and physical experiments designed to simulate skater-mat impact. The data are then analyzed to identify patterns and correlations and, finally, used to train and fine-tune the artificial neural networks for accurate prediction. The development of the prediction tool by employing machine learning strategies contributes to the application of AI methods in sports science and has theoretical implications for using AI techniques in predicting and preventing sports-related injuries.

Keywords: artificial neural networks, concussion, machine learning, impact, speed skater

Procedia PDF Downloads 109
4440 Welding Process Selection for Storage Tank by Integrated Data Envelopment Analysis and Fuzzy Credibility Constrained Programming Approach

Authors: Rahmad Wisnu Wardana, Eakachai Warinsiriruk, Sutep Joy-A-Ka

Abstract:

Selecting the most suitable welding process usually depends on experience or on common practice in similar companies. However, this approach generally ignores many criteria that can affect the selection of a suitable welding process. Therefore, knowledge automation through knowledge-based systems will significantly improve the decision-making process. This research proposes an integrated data envelopment analysis (DEA) and fuzzy credibility constrained programming approach for identifying the best welding process for stainless steel storage tanks in the food and beverage industry. The proposed approach uses the fuzzy concept and a credibility measure to deal with uncertain data from experts' judgment. Furthermore, 12 parameters are used to determine the most appropriate welding process among six competing welding processes.

Keywords: welding process selection, data envelopment analysis, fuzzy credibility constrained programming, storage tank

Procedia PDF Downloads 167
4439 Automated Buffer Box Assembly Cell Concept for the Canadian Used Fuel Packing Plant

Authors: Dimitrie Marinceu, Alan Murchison

Abstract:

The Canadian Used Fuel Container (UFC) is a mid-size, hemispherical-headed, copper-coated steel container, 2.5 meters in length and 0.5 meters in diameter, holding 48 used fuel bundles. The contained used fuel produces significant gamma radiation, requiring automated processes to complete the assembly. The design throughput of 2,500 UFCs per year places constraints on equipment and hot cell design for repeatability, speed of processing, robustness, and recovery from upset conditions. After assembly, the UFC is inserted into a Buffer Box (BB). The BB is made from suitably pre-shaped lower and upper blocks of Highly Compacted Bentonite (HCB) material, which effectively sandwich the UFC between them after assembly. This paper identifies one possible approach for the BB automatic assembly cell and its processes. Automation of the BB assembly will have a significant positive impact on nuclear safety, quality, productivity, and reliability.

Keywords: used fuel packing plant, automatic assembly cell, used fuel container, buffer box, deep geological repository

Procedia PDF Downloads 275
4438 Correction of Frequent English Writing Errors by Using Coded Indirect Corrective Feedback and Error Treatment: The Case of Reading and Writing English for Academic Purposes II

Authors: Chaiwat Tantarangsee

Abstract:

The purposes of this study are 1) to study the frequent English writing errors of students registered in the course Reading and Writing English for Academic Purposes II, and 2) to find out the results of writing error correction using coded indirect corrective feedback and writing error treatments. The sample comprised 28 second-year English major students of the Faculty of Education, Suan Sunandha Rajabhat University. The tool for the experimental study was the lesson plan of the course Reading and Writing English for Academic Purposes II, and the tool for data collection was a set of 4 writing tests of short texts. The research findings disclose that the frequent English writing errors found in this course comprise 7 types of grammatical errors, namely sentence fragments, subject-verb agreement, wrong verb tense forms, singular or plural noun endings, run-on sentences, wrong verb pattern forms, and lack of parallel structure. Moreover, the results of writing error correction using coded indirect corrective feedback and error treatment reveal an overall reduction in the frequent English writing errors and an increase in students’ achievement in the writing of short texts, significant at the .05 level.

Keywords: coded indirect corrective feedback, error correction, error treatment, English writing

Procedia PDF Downloads 306
4437 Mineral Deposits in Spatial Planning Systems – Review of European Practices

Authors: Alicja Kot-Niewiadomska

Abstract:

Securing sustainable access to raw materials is vital for the growth of the European economy and for the goals laid down in the Europe 2020 strategy. One of the most important sources of mineral raw materials is primary deposits. Their efficient management, including extraction, will ensure the competitiveness of the European economy. A critical element of this approach is the safeguarding of mineral deposits, and its most important tool is spatial planning. The safeguarding of deposits should be understood as safeguarding land access and safeguarding the area against development that may potentially prevent the use of the deposit and the necessary mining activities. Many European Union countries have successfully integrated their mineral policy and spatial policy, which has ensured the proper place of mineral deposits in their spatial planning systems. These systems, in turn, are widely recognized as the most important mineral deposit safeguarding tool, the essence of which is to ensure long-term access to resources. The examples of Austria, Portugal, Slovakia, the Czech Republic, Sweden, and the United Kingdom, discussed in the paper, are often mentioned as examples of good practice in this area. Although none of these countries has managed to avoid cases of social and environmental conflict related to mining activities, the solutions they implement certainly deserve special attention. For many countries, including Poland, they can be a potential source of solutions aimed at improving the protection of mineral deposits.

Keywords: mineral deposits, land use planning, mineral deposit safeguarding, European practices

Procedia PDF Downloads 171
4436 Comet Assay: A Promising Tool for the Risk Assessment and Clinical Management of Head and Neck Tumors

Authors: Sarim Ahmad

Abstract:

The single cell gel electrophoresis assay (SCGE, known as the comet assay) is a powerful, uncomplicated, sensitive, state-of-the-art technique for quantifying DNA damage and repair at the individual cell level in in vivo and in vitro samples of eukaryotic cells and some prokaryotic cells. It is widely used in various areas, including human biomonitoring, genotoxicology, and ecological monitoring, and as a tool for research into DNA damage or repair in different cell types in response to a range of DNA-damaging agents, cancer risk, and therapy. The method involves the encapsulation of cells in a low-melting-point agarose suspension, lysis of the cells under neutral or alkaline (pH > 13) conditions, and electrophoresis of the suspended lysed cells, resulting in structures resembling comets when observed by fluorescence microscopy. The intensity of the comet tail relative to the head reflects the number of DNA breaks; the likely basis for this is that loops containing a break lose their supercoiling and become free to extend towards the anode. This is followed by analysis after staining of the DNA, calculating fluorescence to determine the extent of DNA damage either by manual scoring or automatically with imaging software. The assay can, therefore, predict an individual’s tumor sensitivity to radiation and various chemotherapeutic drugs, assess the oxidative stress within tumors, and detect the extent of DNA damage in various cancerous and precancerous lesions of the oral cavity.
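
The scoring step described above, relating tail fluorescence to total comet fluorescence, reduces to a simple ratio. A minimal sketch with hypothetical intensity values; the function name and raw intensity units are illustrative, not taken from any specific imaging package:

```python
def percent_tail_dna(head_intensity, tail_intensity):
    """Percent tail DNA: tail fluorescence as a share of total comet
    fluorescence (head + tail), a standard quantitative damage metric."""
    total = head_intensity + tail_intensity
    if total == 0:
        raise ValueError("comet has no measurable fluorescence")
    return 100.0 * tail_intensity / total

# A heavily damaged cell: most fluorescence has migrated into the tail.
damage = percent_tail_dna(head_intensity=2500, tail_intensity=7500)
```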

Keywords: comet assay, single cell gel electrophoresis, DNA damage, early detection test

Procedia PDF Downloads 292
4435 Generation of Automated Alarms for Plantwide Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

Early detection of incipient abnormal operations is essential for plant-wide process management in order to improve product quality and process safety. Generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Unfortunately, such big data include measurement noise and unwanted variations unrelated to true process behavior. Thus, the elimination of such unnecessary patterns from the data is executed in a data-processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance improved significantly, irrespective of the size and the location of abnormal events.
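
The pipeline the abstract describes, suppress noise first and then flag deviations, can be illustrated with a simplified sketch. The abstract does not specify its nonlinear empirical method, so a trailing moving average and a k-sigma threshold serve here as hypothetical stand-ins:

```python
def moving_average(signal, w):
    """Smooth a 1-D signal with a trailing window of width w to suppress
    measurement noise before the alarm logic is applied."""
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - w + 1):i + 1]
        out.append(sum(window) / len(window))
    return out

def alarm_indices(signal, baseline_mean, baseline_std, k=3.0, w=5):
    """Flag samples whose smoothed value deviates more than k standard
    deviations from the normal-operation baseline."""
    smoothed = moving_average(signal, w)
    return [i for i, v in enumerate(smoothed)
            if abs(v - baseline_mean) > k * baseline_std]
```

Smoothing delays the alarm by a few samples but prevents isolated noise spikes from triggering it, which is the trade-off behind the data-processing step in the abstract.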

Keywords: detection, monitoring, process data, noise

Procedia PDF Downloads 252
4434 TACTICAL: Ram Image Retrieval in Linux Using Protected Mode Architecture’s Paging Technique

Authors: Sedat Aktas, Egemen Ulusoy, Remzi Yildirim

Abstract:

This article explains how to obtain a RAM image from a computer running a Linux operating system and what steps should be followed while obtaining it. Taking a RAM image means dumping the physical memory instantly and writing it to a file; the process can be likened to taking a picture of everything in the computer’s memory at that moment. This process is very important for tools that analyze RAM images, of which Volatility can be given as an example, because before these tools can analyze RAM, images must first be taken. Such tools are used extensively in the forensics world. Forensics, in turn, is a set of processes for digitally examining the information on any computer or server on behalf of official authorities. In this article, the protected mode architecture of the Linux operating system is examined, and the procedure for saving an image of kernel driver and system memory to disk is followed. The tables and access methods to be used are examined based on the basic architecture of the operating system, and the most appropriate methods and their application are presented. Since there is no article in the literature directly related to this study on Linux, this study aims to contribute to the literature on obtaining RAM images. LIME can be mentioned as a similar tool, but there is no published explanation of that tool’s memory dumping method. Considering the frequency of use of these tools, the contribution of the study to the forensics field has been its main motivation, given the intense work on RAM images in forensics.

Keywords: Linux, paging, addressing, RAM image, memory dumping, kernel modules, forensics

Procedia PDF Downloads 117
4433 River Offtake Management Using Mathematical Modelling Tool: A Case Study of the Gorai River, Bangladesh

Authors: Sarwat Jahan, Asker Rajin Rahman

Abstract:

Management of the offtake of any fluvial river is very sensitive in terms of long-term sustainability, as water flow and sediment transport vary widely throughout a hydrological year. The Gorai River is a major distributary of the Ganges River in Bangladesh and is termed a primary source of fresh water for the south-west part of the country. Every year, significant siltation at the Gorai offtake disconnects it from the Ganges during the dry season. As a result, the socio-economic and environmental condition of the downstream areas has been deteriorating for a few decades. To improve the overall situation of the Gorai offtake and the areas that depend on it, a study was conducted by the Institute of Water Modelling, Bangladesh, in 2022. Simulations with the mathematical morphological modeling tool MIKE 21C of DHI Water & Environment, Denmark, revealed the need for dredging and river training structures at the Gorai offtake to ensure significant dry season flow towards the downstream reaches. The dry season flow is found to increase significantly with the proposed river interventions, which also improves the environmental conditions, in terms of salinity, of the south-west zone of the country. This paper summarizes the primary findings of the analyzed results of the developed mathematical model for improving the existing condition of the Gorai River.

Keywords: Gorai river, mathematical modelling, offtake, siltation, salinity

Procedia PDF Downloads 97
4432 Improving the Detection of Depression in Sri Lanka: Cross-Sectional Study Evaluating the Efficacy of a 2-Question Screen for Depression

Authors: Prasad Urvashi, Wynn Yezarni, Williams Shehan, Ravindran Arun

Abstract:

Introduction: Primary health services are often the first point of contact that patients with mental illness have with the healthcare system. A number of tools have been developed to increase the detection of depression in the context of primary care. However, one challenge among many is utilizing these tools within the limited primary care consultation timeframe. Therefore, short screening questionnaires for depression that are just as effective as more comprehensive diagnostic tools may be beneficial in improving detection rates among patients visiting a primary care setting. Objective: To develop and determine the sensitivity and specificity of a 2-Question Questionnaire (2-QQ) to screen for depression in a suburban primary care clinic in Ragama, Sri Lanka. The purpose is to develop a short screening tool for depression that is culturally adapted in order to increase the detection of depression in the Sri Lankan patient population. Methods: This was a cross-sectional study involving two steps. Step one: verbal administration of the 2-QQ to patients by their primary care physician. Step two: completion of the Peradeniya Depression Scale (PDS), a validated diagnostic tool for depression, by the patient after their consultation with the primary care physician. The results from the PDS were then correlated with the results from the 2-QQ for each patient to determine the sensitivity and specificity of the 2-QQ. Results: A score of 1 or above on the 2-QQ was most sensitive but least specific. Setting the threshold at this level is effective for correctly identifying depressed patients, but it also inaccurately captures patients who are not depressed. A score of 6 on the 2-QQ was most specific but least sensitive. Setting the threshold at this level is effective for correctly identifying patients without depression, but not very effective at capturing patients with depression.
Discussion: In the context of primary care, it may be worthwhile to set the 2-QQ screen at a lower threshold for positivity (such as a score of 1 or above). This would generate a high test sensitivity and thus capture the majority of patients who have depression. On the other hand, with a low threshold for positivity, patients who do not have depression but score 1 or above on the 2-QQ will also be falsely identified as testing positive for depression. However, the benefits of identifying patients who present with depression may outweigh the harms of falsely identifying a non-depressed patient. It is our hope that the 2-QQ will serve as a quick primary screen for depression in the primary care setting and as a catalyst to identify and treat individuals with depression.
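
The threshold trade-off discussed above follows directly from how sensitivity and specificity are computed against a reference diagnosis. A minimal sketch; the scores and reference labels below are hypothetical toy data, not the study’s results:

```python
def sensitivity_specificity(screen_scores, reference_positive, threshold):
    """Sensitivity and specificity of a screen at a given cutoff.

    screen_scores: total screen scores (e.g., 2-QQ totals);
    reference_positive: True where the reference diagnostic (here, the
    PDS) classified the patient as depressed.
    """
    tp = fp = tn = fn = 0
    for score, depressed in zip(screen_scores, reference_positive):
        positive = score >= threshold
        if depressed and positive:
            tp += 1          # depressed, screen caught it
        elif depressed:
            fn += 1          # depressed, screen missed it
        elif positive:
            fp += 1          # not depressed, screen flagged anyway
        else:
            tn += 1          # not depressed, screen agreed
    return tp / (tp + fn), tn / (tn + fp)

# Toy data: lowering the cutoff raises sensitivity, lowers specificity.
scores = [0, 1, 2, 2, 4, 5, 6, 0]
truth = [False, False, False, True, True, True, True, False]
sens_low, spec_low = sensitivity_specificity(scores, truth, threshold=1)
sens_high, spec_high = sensitivity_specificity(scores, truth, threshold=6)
```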

Keywords: depression, primary care, screening tool, Sri Lanka

Procedia PDF Downloads 257
4431 Automated Marker Filling System

Authors: Pinisetti Swami Sairam, Meera C. S.

Abstract:

Marker pens are widely used all over the world, mainly in educational institutions, due to their neat, accurate, and easily erasable nature. But refilling the ink in these pens is a tedious and time-consuming job, and it requires careful handling of the pens and the ink bottle. A fully automated marker filling system is a solution developed to overcome this problem. The system comprises pneumatic and electronic modules as well as PLC control. The system is designed so that empty markers are dumped into a marker container and then sent through the different modules of the system to be refilled automatically; the filled markers are then collected in another marker container. Refilling of ink takes place in different stages inside the system: an ink detecting system detects the colour of the marker to be filled, and then refilling is done. Processes like capping and uncapping of the cap, as well as screwing and unscrewing of the tip, are done with the help of a robotic arm and gripper. We make use of pneumatics in this system in order to achieve precision while performing the capping, screwing, and refilling operations. Thus, with the help of this system, we can achieve cleanliness, accuracy, effectiveness, and time savings in the process of filling a marker.

Keywords: automated system, marker filling, information technology, control and automation

Procedia PDF Downloads 497
4430 Reasonableness to Strengthen Citizen Participation in Mexican Anti-Corruption Policies

Authors: Amós García Montaño

Abstract:

In a democracy, a public policy must be developed within the regulatory framework and must consider citizen participation in its planning, design, execution, and evaluation stages, necessary factors for having both legal support and sufficient legitimacy for its operation. However, the complexity and magnitude of certain public problems result in difficulties for the generation of consensus among members of society, leading to unstable and unsuccessful scenarios for the exercise of the right to citizen participation and the generation of effective and efficient public policies. This is the case for public policies against corruption, an issue that in Mexico is difficult to define and that generates conflicting opinions. To provide a possible solution to this delicate reality, this paper analyzes the principle of reasonableness as a tool for identifying the basic elements that guarantee a fundamental level of the exercise of the right to citizen participation in the fight against corruption, adopting elements of human rights indicator methodologies. In this sense, the relevance of having a legal framework that establishes obligations to incorporate proactive and transversal citizen participation in the matter is observed. The need is also noted to monitor the operation of the various citizen participation mechanisms in the decision-making processes of the institutions involved in the fight against and prevention of corruption, which would improve the perception of citizens as relevant actors in this field. It is concluded that the principle of reasonableness is a very useful tool for identifying basic elements that facilitate the fulfillment of human rights commitments in the field of public policies.

Keywords: anticorruption, public participation, public policies, reasonableness

Procedia PDF Downloads 82
4429 Measuring Government’s Performance (Services): The Oman Service Maturity Model (OSMM)

Authors: Angie Al Habib, Khalid Al Siyabi

Abstract:

To measure or assess any government’s efficiency, we need to measure its performance with regard to the quality of the services it provides. Using a technological platform for service provision has become both a trend and a public demand. It is also a public need to ensure these services are aligned with the government’s values and with its overall strategy, vision, and goals. Providing services through technological tools and channels can enhance internal business processes and help establish essential values in government services, such as transparency and excellence, since establishing e-services requires many standards and policies to be put in place, handing decision making over to a mature, system-oriented mechanism. There is no doubt that the Sultanate of Oman wants to enhance its services, move them towards automation, establish a smart government, and link its services to life events. Measuring government efficiency is essential for achieving social security and economic growth, since it can provide a clear dashboard of all projects and improvements. Based on these data, strategies can be improved and aligned with the country’s goals.

Keywords: government, maturity, Oman, performance, service

Procedia PDF Downloads 366
4428 Neural Network Models for Actual Cost and Actual Duration Estimation in Construction Projects: Findings from Greece

Authors: Panagiotis Karadimos, Leonidas Anthopoulos

Abstract:

Predicting the actual cost and duration of construction projects is a continuous, unresolved problem for the construction sector. This paper addresses this problem with modern methods and data available from past public construction projects. 39 bridge projects constructed in Greece, with a similar type of available data, were examined. Considering each project’s attributes together with the actual cost and actual duration, correlation analysis was performed and the most appropriate predictive project variables were defined. Additionally, the most efficient subgroup of variables was selected with the WEKA application, through its attribute selection function. The selected variables were used as input neurons for neural network models. For constructing the neural network models, the FANN Tool application was used. The optimum neural network model for predicting the actual cost produced a mean squared error of 3.84886e-05 and was based on the budgeted cost and the quantity of deck concrete. The optimum neural network model for predicting the actual duration produced a mean squared error of 5.89463e-05 and was also based on the budgeted cost and the quantity of deck concrete.
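As a rough illustration of the modelling step, the sketch below trains a minimal two-input neural network by plain stochastic gradient descent and scores it with the mean squared error criterion reported above. It is an assumption-laden stand-in, not the FANN Tool configuration from the study: the network size, learning rate, and normalisation are all illustrative choices.

```python
import math
import random

def train_tiny_net(samples, hidden=3, epochs=2000, lr=0.05, seed=7):
    """Minimal 2-input, single-hidden-layer network trained by stochastic
    gradient descent. A sketch standing in for the paper's FANN models;
    inputs would be the normalised (budgeted cost, deck concrete quantity)
    pair and the target the normalised actual cost."""
    rng = random.Random(seed)
    w1 = [[rng.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(hidden)]  # incl. bias
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden + 1)]                  # incl. bias

    for _ in range(epochs):
        for (x1, x2), t in samples:
            h = [math.tanh(w[0] * x1 + w[1] * x2 + w[2]) for w in w1]
            y = sum(w2[i] * h[i] for i in range(hidden)) + w2[-1]
            err = y - t
            # hidden-layer gradients first, using the pre-update output weights
            grads = [err * w2[i] * (1.0 - h[i] ** 2) for i in range(hidden)]
            for i in range(hidden):
                w2[i] -= lr * err * h[i]
                w1[i][0] -= lr * grads[i] * x1
                w1[i][1] -= lr * grads[i] * x2
                w1[i][2] -= lr * grads[i]
            w2[-1] -= lr * err

    def predict(x1, x2):
        h = [math.tanh(w[0] * x1 + w[1] * x2 + w[2]) for w in w1]
        return sum(w2[i] * h[i] for i in range(hidden)) + w2[-1]
    return predict

def mse(predict, samples):
    """Mean squared error, the model-selection criterion the paper reports."""
    return sum((predict(x1, x2) - t) ** 2 for (x1, x2), t in samples) / len(samples)
```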

Keywords: actual cost and duration, attribute selection, bridge construction, neural networks, predicting models, FANN TOOL, WEKA

Procedia PDF Downloads 134
4427 Technical, Environmental and Financial Assessment for Optimal Sizing of Run-of-River Small Hydropower Project: Case Study in Colombia

Authors: David Calderon Villegas, Thomas Kaltizky

Abstract:

Run-of-river (RoR) hydropower projects represent a viable, clean, and cost-effective alternative to dam-based plants and provide decentralized power production. However, the cost-effectiveness of RoR schemes depends on the proper selection of site and design flow, which is a challenging task because it requires multivariate analysis. In this respect, this study presents the development of an investment decision support tool for assessing the optimal size of an RoR scheme considering technical, environmental, and cost constraints. The net present value (NPV) from a project perspective is used as the objective function for supporting the investment decision. The tool has been tested by applying it to an actual RoR project recently proposed in Colombia. The obtained results show that the optimum point in financial terms does not match the flow that maximizes energy generation from the river's available flow. For the case study, the flow that maximizes energy corresponds to a value of 5.1 m3/s, whereas a flow of 2.1 m3/s maximizes the investor's NPV. Finally, a sensitivity analysis is performed to determine the NPV as a function of changes in the debt rate, electricity prices, and CapEx. Even in the worst-case scenario, the optimal size represents a positive business case, with an NPV of USD 2.2 million and an IRR 1.5 times higher than the discount rate.
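The shape of this result, an NPV-optimal flow well below the energy-maximizing flow, can be reproduced with a toy sizing model. In the sketch below, every number (head, efficiency, energy price, CapEx per m3/s, and the capacity-factor proxy for the flow-duration curve) is an illustrative assumption, not data from the Colombian case study.

```python
def npv_of_design_flow(q, price=60.0, capex_per_m3s=2.0e6, opex_frac=0.02,
                       rate=0.10, years=25):
    """NPV (USD) of a hypothetical RoR scheme sized at design flow q (m3/s).
    Larger design flows yield more power but are exceeded less often in the
    river, so energy gains flatten out while CapEx grows linearly."""
    power_kw = 9.81 * 100.0 * 0.9 * q            # P = rho*g*H*eta*Q, in kW
    capacity_factor = 1.0 / (1.0 + 0.25 * q)     # hypothetical duration-curve proxy
    energy_mwh = power_kw * 8760.0 * capacity_factor / 1000.0
    capex = capex_per_m3s * q
    annual_net = energy_mwh * price - opex_frac * capex
    return -capex + sum(annual_net / (1.0 + rate) ** t for t in range(1, years + 1))

def optimal_design_flow(candidates):
    """Pick the candidate design flow with the highest NPV."""
    return max(candidates, key=npv_of_design_flow)
```

Evaluating the function over a grid of candidate flows shows an interior financial optimum, even though energy output alone keeps rising with the design flow.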

Keywords: small hydropower, renewable energy, RoR schemes, optimal sizing, objective function

Procedia PDF Downloads 132
4426 Autonomy not Automation: Using Metacognitive Skills in ESL/EFL Classes

Authors: Marina Paula Carreira Rolim

Abstract:

In order to have ELLs take responsibility for their own learning, it is important that they develop the skills to approach their studies strategically. The less they rely on the instructor as the content provider, the more they become active learners and gain a higher sense of self-regulation and confidence in the learning process. This e-poster proposes a new teacher-student relationship that encourages learners to reflect, think critically, and act upon their realities. It also suggests the implementation of different autonomy-supportive teaching tools, such as portfolios, written journals, problem-solving activities, and strategy-based discussions in class. These teaching tools enable ELLs to develop awareness of learning strategies, learning styles, study plans, and available learning resources as means to foster their creative power of learning outside the classroom. In the role of a learning advisor, the teacher is no longer the content provider but a facilitator who introduces skills such as ‘elaborating’, ‘planning’, ‘monitoring’, and ‘evaluating’. The teacher acts as an educator and promotes the use of lifelong metacognitive skills to develop learner autonomy in the ESL/EFL context.

Keywords: autonomy, metacognitive skills, self-regulation, learning strategies, reflection

Procedia PDF Downloads 367
4425 Key Parameters Analysis of the Stirring Systems in the Optimization Procedures

Authors: T. Gomes, J. Manzi

Abstract:

The inclusion of stirring systems in calculation and optimization procedures has received little attention, which can affect the results, because such systems provide additional energy to the process and promote a better distribution of mass and energy. This is meaningful for reactive systems, particularly for the Continuous Stirred Tank Reactor (CSTR), for which the key variables and parameters, as well as the operating conditions of the stirring system, can play a pivotal role; the literature has shown that neglecting these factors can lead to sub-optimal results. It is also well known that the sole use of the First Law of Thermodynamics as an optimization tool cannot yield satisfactory results, whereas the joint use of the First and Second Laws, condensed into a procedure called entropy generation minimization (EGM), has shown itself able to drive the system towards better results. Therefore, the main objective of this paper is to determine the effects of the key parameters of the stirring system on the optimization procedures by means of EGM applied to reactive systems. Such considerations have been made possible by dimensional analysis according to Rayleigh and Buckingham's method, which takes into account the physical and geometric parameters and the variables of the reactive system. For simulation purposes, based on the production of propylene glycol, the results have shown a significant increase in the conversion rate, from 36% (non-optimized system) to 95% (optimized system), with a consequent reduction of by-products. In addition, it has been possible to establish the influence of the work of the stirrer in the optimization procedure, which can be described as a function of the fluid viscosity and consequently of the temperature.
The conclusions also indicate that the use of entropic analysis as an optimization tool has proved to be simple, easy to apply, and computationally cheap.

Keywords: stirring systems, entropy, reactive system, optimization

Procedia PDF Downloads 246
4424 Surface Roughness in the Incremental Forming of Drawing Quality Cold Rolled CR2 Steel Sheet

Authors: Zeradam Yeshiwas, A. Krishnaia

Abstract:

The aim of this study is to verify the resulting surface roughness of parts formed by the Single-Point Incremental Forming (SPIF) process for an ISO 3574 drawing quality cold rolled CR2 steel. The chemical composition of drawing quality cold rolled CR2 steel comprises 0.12 percent carbon, 0.5 percent manganese, 0.035 percent sulfur, and 0.04 percent phosphorus, the remainder being iron with negligible impurities. The experiments were performed on a 3-axis vertical CNC milling machining center equipped with a tool setup comprising a fixture and forming tools specifically designed and fabricated for the process. The CNC milling machine was used to transfer the tool path code, generated in the Mastercam 2017 environment, into three-dimensional motions by the linear incremental progress of the spindle. Blanks of drawing quality cold rolled CR2 steel sheet, 1 mm thick, were fixed along their periphery by a fixture, and hardened high-speed steel (HSS) tools with hemispherical tips of 8, 10, and 12 mm diameter were employed to fabricate sample parts. To investigate the surface roughness, hyperbolic-cone specimens were fabricated according to the chosen experimental design. The effect of process parameters on surface roughness was studied using three important process parameters, i.e., tool diameter, feed rate, and step depth. The Taylor-Hobson Surtronic 3+ profilometer was used to determine the surface roughness of the fabricated parts in terms of the arithmetic mean deviation (Rₐ); in this instrument, a small tip is dragged across a surface while its deflection is recorded. Finally, the optimum process parameters and the main factor affecting surface roughness were found using the Taguchi design of experiments and ANOVA.
A Taguchi experimental design with three factors and three levels per factor was used; the standard orthogonal array L9 (3³) was selected from the array selection table. The finishing roughness parameter Rₐ was measured for each combination of the control factors in the experimental design. Four roughness measurements were taken per component and averaged. Since the lowest surface roughness value is the goal, the ‘‘smaller-the-better’’ equation was used to calculate the S/N ratio. The effect of each control factor on surface roughness was analyzed with an ‘‘S/N response table’’. Optimum surface roughness was obtained at a feed rate of 1500 mm/min, a tool radius of 12 mm, and a step depth of 0.5 mm. The ANOVA result shows that step depth is the dominant factor affecting surface roughness (91.1%).
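The smaller-the-better S/N ratio used above has a standard closed form, S/N = -10 log10((1/n) Σ yᵢ²), which the following sketch applies to hypothetical sets of four Rₐ readings per trial:

```python
import math

def sn_smaller_the_better(readings):
    """Taguchi smaller-the-better signal-to-noise ratio:
    S/N = -10 * log10( (1/n) * sum(y_i ** 2) ).
    A larger S/N value indicates lower (better) roughness."""
    n = len(readings)
    return -10.0 * math.log10(sum(y * y for y in readings) / n)

# hypothetical Ra readings (um), four measurements per trial as in the study
smooth_trial = [1.2, 1.3, 1.1, 1.25]
rough_trial = [2.0, 2.1, 1.9, 2.05]
```

Ranking the nine L9 trials by this ratio, factor level by factor level, is what the S/N response table summarizes.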

Keywords: incremental forming, SPIF, drawing quality steel, surface roughness, roughness behavior

Procedia PDF Downloads 62
4423 Multi-Objectives Genetic Algorithm for Optimizing Machining Process Parameters

Authors: Dylan Santos De Pinho, Nabil Ouerhani

Abstract:

The energy consumption of machine-tools is becoming critical for machine-tool builders and end-users for economic, ecological, and legislation-related reasons. Many machine-tool builders are seeking solutions that reduce the energy consumption of machine-tools while preserving the same productivity rate and the same quality of machined parts. In this paper, we present the first results of a project conducted jointly by academic and industrial partners to reduce the energy consumption of a Swiss-type lathe. We employ genetic algorithms to find optimal machining parameters: the set of parameters that leads to the best trade-off between energy consumption, part quality, and tool lifetime. Three main machining process parameters are considered in our optimization technique, namely depth of cut, spindle rotation speed, and material feed rate. These machining process parameters have been identified as the most influential ones in the configuration of the Swiss-type machining process. A state-of-the-art multi-objective genetic algorithm has been used. The algorithm combines three fitness functions, i.e., objective functions that permit the evaluation of a set of parameters against the three objectives: energy consumption, quality of the machined parts, and tool lifetime. In this paper, we focus on the investigation of the fitness function related to energy consumption. Four different energy-consumption-related fitness functions have been investigated and compared. The first fitness function refers to the Kienzle cutting force model. The second fitness function uses the Material Removal Rate (MRR) as an indicator of energy consumption. The two other fitness functions are non-deterministic, learning-based functions: one uses a simple neural network to learn the relation between the process parameters and the energy consumption from experimental data, and the other uses Lasso regression to determine the same relation.
The goal is, then, to find out which fitness function best predicts the energy consumption of a Swiss-type machining process for a given set of machining process parameters. Once determined, these functions may be used for optimization purposes, i.e., to determine the optimal machining process parameters leading to minimum energy consumption. The performance of the four fitness functions has been evaluated. The Tornos DT13 Swiss-type lathe was used to carry out the experiments. A mechanical part including various Swiss-type machining operations was selected for the experiments. The evaluation process starts with generating a set of CNC (Computer Numerical Control) programs for machining the part at hand. Each CNC program considers a different set of machining process parameters. During the machining process, the power consumption of the spindle is measured. All collected data are assigned to the appropriate CNC program and thus to the set of machining process parameters. The evaluation approach consists of calculating the correlation between the normalized measured power consumption and the normalized power consumption prediction for each of the four fitness functions. The evaluation shows that the Lasso and neural network fitness functions have the highest correlation coefficients, at 97%. The fitness function based on the Material Removal Rate (MRR) has a correlation coefficient of 90%, whereas the Kienzle-based fitness function has a correlation coefficient of 80%.
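A minimal version of this evaluation step, pairing the deterministic MRR fitness function with a Pearson correlation against measured spindle power, might look as follows; the parameter sets and power readings are invented for illustration and do not come from the Tornos DT13 experiments:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def mrr_fitness(depth_of_cut, feed_rate, cutting_speed):
    """Material Removal Rate proxy: volume removed per unit time,
    used here as a deterministic stand-in for energy consumption."""
    return depth_of_cut * feed_rate * cutting_speed

# invented parameter sets (depth of cut, feed, speed) and spindle power readings
param_sets = [(0.5, 0.1, 100), (1.0, 0.2, 150), (1.5, 0.2, 200), (2.0, 0.3, 250)]
measured_power = [120.0, 410.0, 780.0, 1900.0]
predicted = [mrr_fitness(*p) for p in param_sets]
```

The fitness function with the correlation coefficient closest to 1 would be kept for the optimization stage.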

Keywords: adaptive machining, genetic algorithms, smart manufacturing, parameters optimization

Procedia PDF Downloads 147
4422 Current Status of Industry 4.0 in Material Handling Automation and In-house Logistics

Authors: Orestis Κ. Efthymiou, Stavros T. Ponis

Abstract:

In the last decade, a new industrial revolution seems to be emerging, supported, once again, by the rapid advancement of Information Technology in the area of Machine-to-Machine (M2M) communication, which permits large numbers of intelligent devices, e.g. sensors, to communicate with each other and take decisions without any, or with minimal, indirect human intervention. The advent of these technologies has triggered the emergence of a new category of hybrid (cyber-physical) manufacturing systems, combining advanced manufacturing techniques with innovative M2M applications based on the Internet of Things (IoT), under the umbrella term Industry 4.0. Even though the topic of Industry 4.0 has attracted much attention during the last few years, attempts to provide a systematic literature review of the subject are scarce. In this paper, we present the authors’ initial study of the field, with a special focus on the use and applications of Industry 4.0 principles in material handling automation and in-house logistics. Research shows that despite the vivid discussion and attractiveness of the subject, there are still many challenges and issues that have to be addressed before Industry 4.0 becomes standardized and widely applicable.

Keywords: Industry 4.0, internet of things, manufacturing systems, material handling, logistics

Procedia PDF Downloads 127
4421 Case Study: Optimization of Contractor’s Financing through Allocation of Subcontractors

Authors: Helen S. Ghali, Engy Serag, A. Samer Ezeldin

Abstract:

In many countries, the construction industry relies heavily on outsourcing models to execute projects and expand businesses into a diverse market. Such extensive integration of subcontractors is becoming an influential factor in the contractor’s cash flow management. Accordingly, subcontractors’ financial terms are important phenomena and pivotal components for the well-being of the contractor’s cash flow. The aim of this research is to study the contractor’s cash flow with respect to the owner’s and subcontractors’ payment management plans, considering variable advance payments, payment frequency, and lag and retention policies. The model is developed to provide contractors with a decision support tool that can assist in selecting the optimum subcontracting plan to minimize the contractor’s financing limits and optimize the profit values. The model is built using Microsoft Excel VBA coding, and a genetic algorithm is utilized as the optimization tool. Three objective functions are investigated: minimizing the highest negative overdraft value, minimizing the net present worth of the overdraft, and maximizing the project net profit. The model is validated on a full-scale project which includes both self-performed and subcontracted work packages. The results show the model’s potential in optimizing the contractor’s negative cash flow values while assisting contractors in selecting suitable subcontractors to achieve the objective function.
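The first objective function, the highest negative overdraft, reduces to scanning the cumulative cash balance for its most negative point. The sketch below illustrates this with invented receipt and payment streams; the one-period shift in the second plan mimics a subcontractor payment-lag policy:

```python
def cumulative_balance(receipts, payments):
    """Cumulative cash balance of the contractor, period by period."""
    balance, series = 0.0, []
    for cash_in, cash_out in zip(receipts, payments):
        balance += cash_in - cash_out
        series.append(balance)
    return series

def worst_overdraft(receipts, payments):
    """Highest negative overdraft: the most negative cumulative balance
    reached over the project (0.0 if the balance never goes negative)."""
    return min(0.0, min(cumulative_balance(receipts, payments)))

# invented streams: owner payments vs. outgoing subcontractor/self-performed costs
receipts = [0.0, 100.0, 100.0, 150.0, 0.0]
plan_a = [80.0, 90.0, 90.0, 90.0, 0.0]    # pay subcontractors immediately
plan_b = [0.0, 80.0, 90.0, 90.0, 90.0]    # same totals, one-period lag
```

A genetic algorithm such as the one in the study would search over many such payment plans, using this value (or its net present worth) as the fitness to minimize.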

Keywords: cash flow optimization, payment plan, procurement management, subcontracting plan

Procedia PDF Downloads 131
4420 The Holistic Nursing WebQuest: An Interactive Teaching/Learning Strategy

Authors: Laura M. Schwarz

Abstract:

WebQuests are an internet-based interactive teaching/learning tool that utilizes a scaffolded methodology. WebQuests employ critical thinking, afford inquiry-based constructivist learning, and readily employ Bloom’s Taxonomy. WebQuests have generally been used as instructional technology tools in primary and secondary education and have more recently grown in popularity in higher education. Studies of the efficacy of WebQuests as an instructional approach have, however, been limited, particularly in the nursing education arena. The purpose of this mixed-methods study was to determine nursing students’ perceptions of the effectiveness of the Nursing WebQuest as a teaching/learning strategy for holistic nursing-related content. Quantitative findings (N=42) suggested that learners were active participants, used reflection, thought of new ideas, used analysis skills, discovered something new, and assessed the worth of something while taking part in the WebQuests. Qualitative findings indicated that participants found the WebQuests easy to understand and navigate; clear and organized; interactive; a good alternative learning format; and drawing on a variety of quality resources. Participants saw the drawbacks as the additional time and work required, and the occasional failed link or link that caused them to lose their place in the WebQuest. Recommendations include using a larger sample size and more diverse populations from various programs and universities. In conclusion, WebQuests were found to be an effective teaching/learning tool, as positively assessed by study participants.

Keywords: holistic nursing, nursing education, teaching/learning strategy, WebQuests

Procedia PDF Downloads 126
4419 Performance Evaluation of Sand Casting Manufacturing Plant with WITNESS

Authors: Aniruddha Joshi

Abstract:

This paper discusses a simulation study of an automated sand casting production system. The first aim of this study is the development of an automated sand casting process model and its analysis with the simulation software WITNESS. The production methodology aims to improve overall productivity through the elimination of waste, which in turn improves quality. Integrating automation with simulation helps identify obstacles to implementation and select appropriate options for implementing it successfully. Several simulation software packages exist for this integration; in this study, the model was created with the WITNESS simulation software, based on a literature review. The input parameters are setup time, number of machines, and cycle time; the output parameters are the number of castings, the average time, and the percentage usage of the machines. The obtained results were used for statistical analysis, which identifies the optimal solution for maximum output.
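As a rough, deterministic counterpart to the simulation outputs named above (number of castings, percentage usage of machines), one can sketch the relationship between the input parameters and throughput; the formulation and figures below are illustrative assumptions, not the WITNESS model itself:

```python
def line_performance(shift_minutes, setup_minutes, cycle_minutes, machines):
    """Back-of-envelope throughput: castings produced per shift and the
    percentage usage of each machine (time spent casting / shift length).
    A hypothetical deterministic formulation; the paper's WITNESS model is a
    full discrete-event simulation with stochastic elements."""
    productive = shift_minutes - setup_minutes          # minutes left after setup
    castings_per_machine = productive // cycle_minutes  # whole castings only
    castings = int(castings_per_machine * machines)
    usage_pct = castings_per_machine * cycle_minutes / shift_minutes * 100.0
    return castings, usage_pct
```

A discrete-event simulator refines exactly these figures by adding queueing, breakdowns, and variability that such a closed-form estimate cannot capture.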

Keywords: automated sand casting production system, simulation, WITNESS software, performance evaluation

Procedia PDF Downloads 789
4418 Cheiloscopy: A Study on Predominant Lip Print Patterns among the Gujarati Population

Authors: Pooja Ahuja, Tejal Bhutani, M. S. Dahiya

Abstract:

Cheiloscopy, the study of lip prints, is a forensic investigation technique that deals with the identification of individuals based on lip patterns. The objectives of this study were to determine the predominant lip print pattern among the Gujarati population, to evaluate whether any sex difference exists, and to study the permanence of the pattern over a six-month period. The study comprised 100 healthy individuals (50 males and 50 females) aged 18 to 25 years from the Gujarati population of the Gandhinagar region of Gujarat state, India. Using the Suzuki and Tsuchihashi classification, lip prints were divided into four quadrants and also classified on the basis of the peripheral shape of the lips. The materials used to record the lip prints were dark brown lipstick, cellophane tape, and white bond paper. Lipstick was applied uniformly, and lip prints were taken on the glued portion of the cellophane tape and then stuck onto white bond paper. These lip prints were analyzed with a magnifying lens and with a stereo microscope. The results showed the branched pattern Type II (29.57 percent) to be the most predominant in the Gujarati population. The branched pattern Type II (35.60 percent) and the long vertical Type I (28.28 percent) were most prevalent in males and females, respectively, and large full lips were predominant in both sexes. The study concludes that lip prints in any form can be an effective tool for the identification of an individual in closed or open groups.

Keywords: cheiloscopy, lip pattern, predominant pattern, Gujarati population

Procedia PDF Downloads 298
4417 The End Justifies the Means: Using Programmed Mastery Drill to Teach Spoken English to Spanish Youngsters, without Relying on Homework

Authors: Robert Pocklington

Abstract:

Most current language courses expect students to be ‘vocational’, sacrificing their free time in order to learn. However, pupils with a full-time job, or those bringing up children, hardly have a spare moment. Others just need the language as a tool or a qualification, as if it were book-keeping or a driving license. Then there are children in unstructured families whose stressful lives make private study almost impossible. And there are the countless parents whose evenings and weekends have become a nightmare of trying to get the children to do their homework. There are many arguments against homework being a necessity (rather than an optional extra for more ambitious or dedicated students), making a clear case for teaching methods that facilitate full learning of the key content within the classroom. A methodology that could be described as Programmed Mastery Learning has been used at Fluency Language Academy (Spain) since 1992 to teach English to over 4,000 pupils yearly, with a staff of around 100 teachers, barely requiring homework. The course is structured according to the tenets of Programmed Learning: small manageable teaching steps, immediate feedback, and constant successful activity. For the Mastery component (not stopping until everyone has learned), memorisation and practice are entrusted to flashcard-based drilling in the classroom, leading all students to progress together and develop a permanently growing knowledge base. Vocabulary and expressions are memorised using flashcards as stimuli, obliging the brain to constantly recover words from long-term memory and convert them into reflex knowledge before they are deployed in sentence building. The use of grammar rules is practised with ‘cue’ flashcards: the brain refers consciously to the grammar rule each time it produces a phrase, until it comes easily. This automation of lexicon and correct grammar use greatly facilitates all other language and conversational activities.
The full B2 course consists of 48 units, each of which takes a class an average of 17.5 hours to complete, allowing the vast majority of students to reach B2 level in 840 class hours, as corroborated by an 85% pass rate in the Cambridge University B2 exam (First Certificate). In the past, studying for qualifications was just one of many options open to young people. Nowadays, youngsters need to stay at school and obtain qualifications in order to get any kind of job. There are many students in our classes who have little intrinsic interest in what they are studying; they just need the certificate. In these circumstances, and with increasing government pressure to minimise failure, teachers can no longer think, ‘If they don’t study and fail, it’s their problem’. It is now becoming the teacher’s problem. Teachers are ever more in need of methods which make their pupils successful learners; this means assuring learning in the classroom. Furthermore, homework is arguably the main divider between successful middle-class schoolchildren and failing working-class children who drop out: if everything important is learned at school, the latter will have a much better chance, favouring inclusiveness in the language classroom.

Keywords: flashcard drilling, fluency method, mastery learning, programmed learning, teaching English as a foreign language

Procedia PDF Downloads 110