Search results for: LCA tools and data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27281

27161 Disrupted or Discounted Cash Flow: Impact of Digitisation on Business Valuation

Authors: Matthias Haerri, Tobias Huettche, Clemens Kustner

Abstract:

This article discusses the impact of digitisation on business valuation. In order to become and remain ‘digital’, investments are necessary whose return on investment (ROI) often remains vague. This uncertainty sits uneasily with a valuation that relies on predictable cash flows, fixed capital structures, and a steady state. Digitisation does not make company valuation impossible, but traditional approaches must be reconsidered. The authors identify four areas that are changing: (1) Tools instead of intuition - in the future, company valuation will be neither art nor science, but craft. This requires not intuition but experience and good tools; digital valuation tools beyond Excel will therefore gain in importance. (2) Real-time instead of deadline - at present, company valuations are carried out case by case on a specific key date. This will change with digitalisation and the introduction of web-based valuation tools. Company valuations can thus not only be carried out faster and more efficiently, but can also be offered more frequently: instead of calculating the value for a past key date, current, real-time valuations can be carried out. (3) Predictive planning instead of analysis of the past - past data will still be needed, but its use will no longer be limited to monovalent time series or key-figure analyses. The images of ‘black swans’ and the ‘turkey illusion’ have made clear that we build forecasts on too few data points from the past and underestimate the power of chance; predictive planning can help here. (4) Convergence instead of residual value - digital transformation shortens the lifespan of viable business models. If companies want to live forever, they have to change forever. For company valuation, this means that the business model valid on the valuation date has only a limited service life.
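
As a toy illustration of point (4), the following Python sketch contrasts a classic DCF with a Gordon-growth residual value against a finite-horizon valuation in which the business model simply expires. All cash flows, rates, and the five-year lifespan are invented figures, not the authors' model:

```python
# Hedged sketch: standard DCF with a perpetual residual (terminal) value versus a
# finite-horizon DCF in which the current business model is assumed to expire.
# All figures (cash flows, rates, lifespan) are illustrative, not from the article.

def dcf_with_residual(cash_flows, r, g):
    """Classic DCF: explicit forecast plus Gordon-growth terminal value."""
    pv = sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows, start=1))
    terminal = cash_flows[-1] * (1 + g) / (r - g)       # perpetuity from year n+1
    return pv + terminal / (1 + r) ** len(cash_flows)

def dcf_finite_life(cash_flows, r, lifespan):
    """'Convergence instead of residual value': cash flows stop after `lifespan` years."""
    return sum(cf / (1 + r) ** t
               for t, cf in enumerate(cash_flows[:lifespan], start=1))

flows = [100, 105, 110, 115, 120]                       # illustrative forecast
print(dcf_with_residual(flows, r=0.08, g=0.01))         # value with a perpetuity
print(dcf_finite_life(flows, r=0.08, lifespan=5))       # value if the model dies in year 5
```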

Keywords: business valuation, corporate finance, digitisation, disruption

Procedia PDF Downloads 130
27160 An Observation of the Information Technology Research and Development Based on Article Data Mining: A Survey Study on Science Direct

Authors: Muhammet Dursun Kaya, Hasan Asil

Abstract:

One of the most important factors in research and development is deep insight into the evolution of scientific development. State-of-the-art tools and instruments can considerably assist researchers, and many organizations worldwide have become aware of the advantages of data mining for acquiring knowledge from unstructured data. This paper reviews the articles on information technology published in the past five years with the aid of data mining. A clustering approach was used to study these articles, and the results revealed that three topics, namely health, innovation, and information systems, have captured the special attention of researchers.
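
The abstract does not name its clustering algorithm; a minimal sketch of one plausible pipeline, assuming TF-IDF features, k-means, and a placeholder corpus (none of which are confirmed by the paper), might look like this:

```python
# Hedged sketch of the kind of clustering pipeline the abstract describes:
# TF-IDF vectors over article abstracts, grouped with k-means. The algorithm
# choice and the sample texts are assumptions, not the paper's method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

abstracts = [
    "mobile health monitoring with wearable sensors",
    "open innovation strategies in software firms",
    "enterprise information systems integration",
]  # placeholder corpus

X = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)  # cluster id per abstract, e.g. health / innovation / information systems
```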

Keywords: information technology, data mining, scientific development, clustering

Procedia PDF Downloads 274
27159 The Introduction of Modern Diagnostic Techniques and Its Impact on Local Garages

Authors: Mustapha Majid

Abstract:

Gone are the days when technicians and mechanics had to spend too much time trying to identify a mechanical fault and rectify the problem. The emphasis is now on automobile diagnostic equipment, using computers and special software. An investigation was conducted in the Tamale Metropolis and Accra, in the Northern and Greater Accra regions of Ghana, respectively. Data were gathered through questionnaires, physical observation, interviews, and newspapers. The study revealed that the majority of mechanics lack the computer skills that would enable them to use diagnostic tools such as exhaust gas analyzers, scan tools, electronic wheel balancing machines, etc.

Keywords: diagnosing, local garages and modern garages, lack of knowledge of diagnosing posing an existential threat, training of local mechanics

Procedia PDF Downloads 156
27158 A Study of the Performance Parameter for Recommendation Algorithm Evaluation

Authors: C. Rana, S. K. Jain

Abstract:

The enormous amount of Web data has made its efficient use a challenge in the past few years. A range of techniques is applied to tackle this problem; prominent among them are personalization and recommender systems. These are the tools that assist users in finding relevant information on the web, and most e-commerce websites apply them in one way or another. In the past decade, a large number of recommendation algorithms have been proposed, yet there has not been much research into evaluation criteria for these algorithms. The traditional accuracy and classification metrics are still used for evaluation, which provides only a static view. This paper studies how the evolution of user preference over a period of time can be mapped in a recommender system using a new evaluation methodology that explicitly uses the time dimension. We also present the different types of experimental setups generally used for recommender system evaluation. Furthermore, an overview of the major accuracy metrics, and of metrics researched in the past few years that go beyond the scope of accuracy, is discussed in detail.
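
A minimal sketch of what an evaluation that explicitly uses the time dimension can mean in practice, assuming a chronological holdout split and RMSE; the paper's own protocol may differ:

```python
# Hedged sketch of a time-aware evaluation setup: instead of a random train/test
# split, ratings are ordered by timestamp and the most recent ones are held out,
# so the metric reflects how well the recommender tracks evolving preferences.
# Field layout and the RMSE choice are assumptions, not the paper's methodology.

def temporal_split(ratings, holdout=0.2):
    """ratings: list of (user, item, rating, timestamp) tuples."""
    ratings = sorted(ratings, key=lambda r: r[3])   # order by time
    cut = int(len(ratings) * (1 - holdout))
    return ratings[:cut], ratings[cut:]             # train on past, test on future

def rmse(predict, test):
    errs = [(predict(u, i) - r) ** 2 for u, i, r, _ in test]
    return (sum(errs) / len(errs)) ** 0.5

# usage: train, test = temporal_split(data); score = rmse(model.predict, test)
```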

Keywords: collaborative filtering, data mining, evolutionary, clustering, algorithm, recommender systems

Procedia PDF Downloads 407
27157 Methodology for Various Sand Cone Testing

Authors: Abel S. Huaynacho, Yoni D. Huaynacho

Abstract:

Improving the test procedure of ASTM D1556 plays an important role in developing field testing to obtain higher-quality QA/QC data. The traditional process takes a considerable amount of time for a single test, and running several tests means repeating the same tasks, which takes a long time to obtain good results. Moreover, if the tools that support these tests are not properly managed, the improvement achievable through multiple tests can be stalled. This paper presents an optimized process for running multiple ASTM D1556 tests, moving from the initial standard procedure to a simpler one supported by improved management tools.
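
For context, the core sand cone computation is simple enough to batch over many field tests, which is part of what makes repeated testing attractive to automate. The following Python sketch uses illustrative variable names and values, not the paper's data:

```python
# Hedged sketch of the core ASTM D1556 computation, wrapped so repeated field
# tests can be batch-processed rather than worked out one at a time.
# All sample values are invented placeholders.

def sand_cone_density(m_soil_wet, w, m_sand_used, m_sand_cone, rho_sand):
    """In-place dry density from one sand cone test.
    m_soil_wet  : wet mass of soil excavated from the hole (g)
    w           : moisture content (decimal, e.g. 0.08)
    m_sand_used : sand released from the apparatus (g)
    m_sand_cone : calibrated mass of sand filling the cone (g)
    rho_sand    : calibrated bulk density of the test sand (g/cm^3)
    """
    hole_volume = (m_sand_used - m_sand_cone) / rho_sand   # cm^3
    dry_mass = m_soil_wet / (1 + w)                        # g
    return dry_mass / hole_volume                          # g/cm^3

tests = [(4250.0, 0.08, 5880.0, 1620.0, 1.45)]             # illustrative batch
for t in tests:
    print(round(sand_cone_density(*t), 3), "g/cm^3")
```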

Keywords: sand cone test, bulk density, ASTM D1556, QA/QC

Procedia PDF Downloads 132
27156 Application of Social Media for Promoting Library and Information Services: A Case Study of Library Science Professionals of India

Authors: Payel Saha

Abstract:

Social media plays an important role in the dissemination of information in society. In the 21st century, most people have a smartphone and use different social media tools like Facebook, Twitter, Instagram, WhatsApp, and Skype in day-to-day life. These are rapidly growing web-based tools for sharing thoughts, ideas, and knowledge globally over the internet. The study highlights the current use of social media tools for promoting library and information services by library and information professionals working in libraries across India. The study was conducted in November 2017. A structured questionnaire was prepared using Google Docs and distributed through mailing lists, individual email IDs, and social media tools. In total, 90 responses were received from different states of India and analyzed in MS Excel. Responses came from 17 states and 3 union territories, with most respondents from Odisha (23), Himachal Pradesh (14), and Assam (10). Of the 90 respondents, 37 were female and 53 male, and the majority (71) came from academic libraries, followed by special libraries (15), public libraries (3), and a corporate library (1). The study indicates that 53 respondents said their library has a social media account, while 39 said it does not. Facebook, YouTube, Google+, LinkedIn, Twitter, and Instagram are used by the LIS professionals of India, with Facebook (86) the most popular among them. Respondents reported using social media tools for sharing photos of library events and programs (72), followed by tips for using different services (64), posting new arrivals (56), database tutorials (35), sending brief updates to patrons (32), and announcing library holidays (22). They also share information about scholarships and training programs and market library events. The study further identifies lack of time as the major problem in using social media (53 respondents), followed by low internet speed (35) and too many social media tools to learn (17); 3 respondents reported no problems. The majority of respondents use social media tools on a daily basis (71), followed by weekly (16), monthly (1), and other (2). In summary, this study is expected to be useful in further promoting social media for the dissemination of library and information services to the general public.

Keywords: application of social media, India, promoting library services, library professionals

Procedia PDF Downloads 156
27155 Analysis of Factors Used by Farmers to Manage Risk: A Case Study on Italian Farms

Authors: A. Pontrandolfi, G. Enjolras, F. Capitanio

Abstract:

The study analyses the strategies Italian farmers use to cope with the risks that face their production. We specifically explore the potential and the limitations of the economic tools for climatic risk management in agriculture of the Common Agricultural Policy 2014-2020, which foresees contributions to economic tools for risk management, in relation to farms’ needs and the exposure and vulnerability of agricultural areas to climatic risk. We consider farm-level approaches to hedging risks, both technical tools (agricultural practices, pesticides, fertilizers, irrigation) and economic/financial instruments (insurance, etc.). We develop cross-sectional and longitudinal analyses, as well as correlation analyses, that underline the main differences in the way farms adapt their structure and management towards risk. The results show a preference for technical tools, despite the presence of important public aid for economic tools such as insurance. There is therefore a strong need for a more effective and integrated risk management policy scheme. Synergies between economic tools and risk reduction actions of a more technical, structural, and managerial nature (production diversification, irrigation infrastructure, technological and management innovation, and training, information, and consultancy, etc.) are emphasized.

Keywords: agriculture and climate change, climatic risk management, insurance schemes, farmers' approaches to risk management

Procedia PDF Downloads 339
27154 Studies on the Characterization and Machinability of Duplex Stainless Steel 2205 during Dry Turning

Authors: Gaurav D. Sonawane, Vikas G. Sargade

Abstract:

The present investigation studies the effect of advanced Physical Vapor Deposition (PVD) coatings on cutting temperature, residual stresses, and surface roughness during turning of Duplex Stainless Steel (DSS) 2205. Austenite stabilizers like nickel, manganese, and molybdenum reduce the cost of DSS. Surface Integrity (SI) plays an important role in determining corrosion resistance and fatigue life. Resistance to various types of corrosion makes DSS suitable for applications in critical environments such as heat exchangers, desalination plants, seawater pipes, and marine components. However, lower thermal conductivity, poor chip control, and non-uniform tool wear make DSS very difficult to machine. Cemented carbide tools (M grade) were used to turn DSS in a dry environment. AlTiN and AlTiCrN coatings were deposited using the advanced PVD High-Power Impulse Magnetron Sputtering (HiPIMS) technique. Experiments were conducted at cutting speeds of 100 m/min, 140 m/min, and 180 m/min, with a constant feed of 0.18 mm/rev and depth of cut of 0.8 mm. AlTiCrN-coated tools, followed by AlTiN-coated tools, outperformed uncoated tools owing to properties such as lower thermal conductivity, higher adhesion strength, and hardness. Residual stresses were found to be compressive for all the tools used in dry turning, increasing the fatigue life of the machined component. Higher cutting temperatures were observed for coated tools due to their lower thermal conductivity, which results in much less tool wear than for uncoated tools. Surface roughness with uncoated tools was found to be three times higher than with coated tools, due to the lower coefficient of friction of the coatings.

Keywords: cutting temperature, DSS2205, dry turning, HiPIMS, surface integrity

Procedia PDF Downloads 128
27153 Anomaly Detection Based on System Log Data

Authors: M. Kamel, A. Hoayek, M. Batton-Hubert

Abstract:

With the increase of network virtualization and the disparity of vendors, the continuous monitoring and detection of anomalies cannot rely on static rules. An advanced analytical methodology is needed to discriminate between ordinary events and unusual anomalies. In this paper, we focus on log data (textual data), which is a crucial source of information about network performance. We introduce an algorithm used as a pipeline to help with the pretreatment of such data, group it into patterns, and dynamically label each pattern as anomalous or not. Such tools provide users and experts with continuous, real-time log monitoring capability to detect anomalies and failures in the underlying system that can affect performance. An application to real-world data illustrates the algorithm.
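
A minimal sketch of the pattern-grouping and labelling idea, assuming simple regex masking and a rarity threshold; the paper's actual pretreatment and scoring model are richer, so this is illustrative only:

```python
# Hedged sketch of the pipeline idea: mask variable fields so raw log lines
# collapse into patterns, then score each pattern by rarity. The masking rules
# and the "frequency == 1" threshold are assumptions, not the paper's model.
import re
from collections import Counter

def to_pattern(line):
    line = re.sub(r"\d+", "<NUM>", line)               # numbers -> placeholder
    line = re.sub(r"\b[0-9a-f]{8,}\b", "<HEX>", line)  # ids/hashes -> placeholder
    return line.strip()

logs = [
    "conn 42 opened from host 10.0.0.7",
    "conn 43 opened from host 10.0.0.9",
    "kernel panic at deadbeefcafe",
]
patterns = [to_pattern(l) for l in logs]
freq = Counter(patterns)
for line, pat in zip(logs, patterns):
    label = "anomaly" if freq[pat] == 1 else "normal"  # rare pattern -> anomaly
    print(label, "|", line)
```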

Keywords: logs, anomaly detection, ML, scoring, NLP

Procedia PDF Downloads 91
27152 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from open data bases belonging to different governments, which requires integrating data from various sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on open data is increasing. However, each government has its own procedures for publishing its data, which leads to a variety of data set formats, because there are no international standards specifying them. Due to this variety, we must build a data integration process able to put together all kinds of formats. Some software tools have been developed to support the integration process, e.g. Data Tamer and Data Wrangler. The problem with these tools is that they require a data scientist to take part in the integration process as a final step. In our case we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its open data bases of environmental indicators in real time, and other governments (such as Andalucia or Bilbao) have published open data sets relative to the environment as well. All of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to run and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format and the analysis process can start in a computationally better way. The tool presented in this work therefore has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). In order to open our software tool, as a second approach, we also developed an implementation in the R language as a mature open-source technology. R is a powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance, and there are R libraries for building graphic interfaces, such as shiny. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an official real-time integrated data set of environmental data in Spain to any developer, so that they can build their own applications.
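
A minimal sketch of the per-source adapter idea: each portal gets a small mapping onto one shared schema, after which all sources can be analyzed uniformly. The column names are invented examples, and the actual solution builds Hadoop procedures rather than plain Python:

```python
# Hedged sketch: per-portal adapters map local column names onto a common
# schema. COMMON, ADAPTERS, and all field names are illustrative assumptions.
import csv

COMMON = ["city", "timestamp", "no2_ugm3"]         # shared target schema

ADAPTERS = {                                       # per-portal column maps
    "madrid": {"estacion": "city", "fecha": "timestamp", "no2": "no2_ugm3"},
    "bilbao": {"station": "city", "date": "timestamp", "NO2_value": "no2_ugm3"},
}

def integrate(path, source):
    mapping = ADAPTERS[source]
    assert set(mapping.values()) <= set(COMMON)    # adapter must target the schema
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            yield {mapping[k]: v for k, v in row.items() if k in mapping}

# usage: rows = list(integrate("madrid_air.csv", "madrid"))
```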

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 311
27151 Potential of Detailed Environmental Data, Produced by Information and Communication Technology Tools, for Better Consideration of Microclimatology Issues in Urban Planning to Promote Active Mobility

Authors: Živa Ravnikar, Alfonso Bahillo Martinez, Barbara Goličnik Marušić

Abstract:

Climate change mitigation has been formally adopted and announced by countries across the globe, and cities are targeting carbon neutrality through various more or less successful, systematic, or fragmentary actions. The article is based on the fact that environmental conditions affect human comfort and the usage of space. Urban planning can, with its sustainable solutions, not only support climate mitigation in terms of reducing global warming but also enable natural processes that, in the immediate vicinity, produce environmental conditions that encourage people to walk or cycle. The article draws attention to the importance of integrating climate considerations into urban planning, where detailed environmental data play a key role by enabling urban planners to improve or monitor environmental conditions on cycle paths. In its practical part, this paper tests a particular ICT tool, a prototype used for gathering environmental data. Data gathering was performed along the cycling lanes of Ljubljana (Slovenia), with the main objective of assessing the applicability of the tool's data to the planning of comfortable cycling lanes. The results suggest that such transportable devices for in-situ measurements can help a researcher interpret detailed environmental information characterized by fine granularity and precise spatial and temporal resolution. Data can be interpreted within human comfort zones, with graphical representation in the form of a map linking the environmental conditions to their spatial context. The paper also provides preliminary results on the potential of such tools for identifying correlations between environmental conditions and different spatial settings, which can help urban planners prioritize interventions. The paper contributes to multidisciplinary approaches by demonstrating the usefulness of such fine-grained data for better consideration of microclimatology in urban planning, a prerequisite for creating climate-comfortable cycling lanes that promote active mobility.

Keywords: information and communication technology tools, urban planning, human comfort, microclimate, cycling lanes

Procedia PDF Downloads 131
27150 Experiential Learning in an Earthquake Engineering Course Using Online Tools and Shake Table Exercises

Authors: Andres Winston Oreta

Abstract:

Experiential Learning (ELE) is a strategy for enhancing the teaching and learning of courses, especially in civil engineering. This paper presents the adaptation of the ELE framework in the delivery of various course requirements in an earthquake engineering course. Examples of how ELE is integrated using online tools and hands-on laboratory technology to address the course learning outcomes are presented. Student feedback shows that ELE using online tools and technology strengthens students’ understanding and intuition of seismic design and earthquake engineering concepts.

Keywords: earthquake engineering, experiential learning, shake table, online, internet, civil engineering

Procedia PDF Downloads 17
27149 Urban Gamification: Analyzing the Effects of UFLab’s Tangible Gamified Tools in Four Hungarian Urban Public Participation Processes

Authors: Olivia Kurucz

Abstract:

Gamification is one of the outstanding new methodological possibilities for urban public participation processes, helping to make the most informed decisions possible about the future steps of urban development. This paper examines four Hungarian experimental projects in which gamified tools were applied during public participation processes by the Urban Future Laboratory (UFLab) research workshop of the Budapest University of Technology and Economics (BUTE). The recently implemented future-planning projects (in the cities of Pécel, Kistarcsa, Budapest, and Salgótarján) were initiated by various motives, but in all cases the multi-stakeholder dialogues were facilitated through physical gamified tools. In line with the urban gamification hypothesis, the use of gamified tools supported certain steps of the participatory processes in several respects: it helped to increase the attractiveness of public events, to create a more informal atmosphere, to ensure equal conditions for actors, to evoke a design mindset, to bridge contrasting social or cultural differences, to capture opinions, and to assist dialogue between city actors, designers, and residents. This statement is confirmed by assessing the applied tools, analyzing the case studies, and comparing them to perceive their effects and interrelations.

Keywords: experimental projects, future planning, gamification, gamified tools, Hungary, public participation, UFLab, urban gamification

Procedia PDF Downloads 135
27148 Methods to Measure the Quality of 2D Image Compression Techniques

Authors: Mohammed H. Rasheed, Hussein Nadhem Fadhel, Mohammed M. Siddeq

Abstract:

In this paper we suggest image quality metrics that can provide an accurate measure, close to the perceived quality of the tested images. Such tools yield metrics that can be used to compare the performance of image compression algorithms. Two new metrics for measuring the quality of decompressed images are proposed, based on combined data (CD) between the original and decompressed images. Compared with existing metrics, e.g. PSNR and RMSE, the proposed metrics give values that most closely reflect image quality as perceived by the human eye.
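
For reference, the two baselines cited, RMSE and PSNR, can be computed as below; the proposed combined-data (CD) metric itself is not specified in the abstract, so it is not reproduced. 8-bit grayscale images as equal-shape NumPy arrays are assumed:

```python
# Standard reference metrics the paper compares against. Assumes 8-bit
# grayscale images (peak value 255) of identical shape.
import numpy as np

def rmse(original, decompressed):
    diff = original.astype(float) - decompressed.astype(float)
    return float(np.sqrt(np.mean(diff ** 2)))

def psnr(original, decompressed, peak=255.0):
    mse = np.mean((original.astype(float) - decompressed.astype(float)) ** 2)
    return float("inf") if mse == 0 else float(10 * np.log10(peak ** 2 / mse))
```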

Keywords: RMSE, PSNR, image quality metrics, image compression

Procedia PDF Downloads 31
27147 Faculty Use of Geospatial Tools for Deep Learning in Science and Engineering Courses

Authors: Laura Rodriguez Amaya

Abstract:

Advances in science, technology, engineering, and mathematics (STEM) are viewed as important to countries’ national economies and their capacity to compete in the global economy. However, many countries see low numbers of students entering these disciplines. To strengthen the professional STEM pipeline, it is important that students are retained in these disciplines at universities. Scholars agree that to retain students in universities’ STEM degrees, STEM course content must show the relevance of these academic fields to students' daily lives; by increasing students’ understanding of the importance of these degrees and careers, their motivation to remain in these academic programs can also increase. An effective way to make STEM content relevant to students’ lives is the use of geospatial technologies and geovisualization in the classroom. The Geospatial Revolution, and the science and technology associated with it, has provided scientists and engineers with an incredible amount of data about Earth and Earth systems. This data can be used in the classroom to support instruction and make content relevant to all students. The purpose of this study was to find out the prevalence of geospatial technologies and geovisualization as teaching practices in a USA university. The Teaching Practices Inventory survey, a modified version of the Carl Wieman Science Education Initiative Teaching Practices Inventory, was selected for the study. Faculty in the STEM disciplines who participated in a summer learning institute at a four-year university in the USA constituted the population selected for the study; one of the institute's main purposes was to have an impact on the teaching of STEM courses, particularly the gateway courses taken by many STEM majors. The sample comprised 97.5% of the total number of summer learning institute participants. Basic descriptive statistics computed with the Statistical Package for the Social Sciences (SPSS) were used to find out: 1) the percentage of faculty using geospatial technologies and geovisualization; 2) whether the faculty's department impacted their use of geospatial tools; and 3) whether the number of years in a teaching capacity impacted their use of geospatial tools. Findings indicate that only 10 percent of respondents had used geospatial technologies, and 18 percent had used geovisualization. The use of geovisualization was spread more broadly across disciplines than the use of geospatial technologies, which was concentrated in the engineering departments. The data seem to indicate a lack of incorporation of geospatial tools in STEM education. The use of geospatial tools is an effective way to engage students in deep STEM learning, and future research should look at the effect on student learning and retention in science and engineering programs when geospatial tools are used.

Keywords: engineering education, geospatial technology, geovisualization, STEM

Procedia PDF Downloads 249
27146 Normalized P-Laplacian: From Stochastic Game to Image Processing

Authors: Abderrahim Elmoataz

Abstract:

More and more contemporary applications involve data in the form of functions defined on irregular and topologically complicated domains (images, meshes, point clouds, networks, etc.). Such data are not organized as familiar digital signals and images sampled on regular lattices. However, they can be conveniently represented as graphs where each vertex represents measured data and each edge represents a relationship (connectivity, affinity, or interaction) between two vertices. Processing and analyzing these types of data is a major challenge for both the image and machine learning communities. Hence, it is very important to transfer to graphs and networks many of the mathematical tools initially developed on usual Euclidean spaces, which have proven efficient for many inverse problems and applications on usual image and signal domains. Historically, the main tools for the study of graphs or networks come from combinatorics and graph theory. In recent years there has been increasing interest in one of the major mathematical tools for signal and image analysis, namely variational methods and Partial Differential Equations (PDEs) on graphs. The normalized p-Laplacian operator was recently introduced to model a stochastic game called the tug-of-war game with noise. Part of the interest in this class of operators arises from the fact that it includes, as particular cases, the infinity Laplacian, the mean curvature operator, and the traditional Laplacian operators that have been extensively used to model and solve problems in image processing. The purpose of this paper is to introduce and study a new class of normalized p-Laplacians on graphs. The introduction is based on the extension of p-harmonious functions, introduced as discrete approximations of both the infinity-Laplacian and p-Laplacian equations. Finally, we propose to use these operators as a framework for solving many inverse problems in image processing.
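
A minimal sketch of a p-harmonious-style iteration on a graph, mixing the (max+min)/2 of neighbour values (the infinity-Laplacian part) with their mean (the 2-Laplacian part). The mixing weight alpha is one convention among several in the literature, and the toy path graph and boundary values are invented:

```python
# Hedged sketch: each interior vertex is repeatedly replaced by a convex mix of
# the (max+min)/2 of its neighbours and their mean. The weight alpha and the
# toy graph are illustrative assumptions, not the paper's construction.

def p_harmonious(neighbors, boundary, alpha, iters=500):
    u = {v: boundary.get(v, 0.0) for v in neighbors}
    for _ in range(iters):
        for v in neighbors:
            if v in boundary:                 # Dirichlet data stays fixed
                continue
            vals = [u[w] for w in neighbors[v]]
            u[v] = (alpha * (max(vals) + min(vals)) / 2
                    + (1 - alpha) * sum(vals) / len(vals))
    return u

# toy path graph 0-1-2-3 with u(0)=0, u(3)=1; converges to linear interpolation
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(p_harmonious(neighbors, {0: 0.0, 3: 1.0}, alpha=0.5))
```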

Keywords: normalized p-laplacian, image processing, stochastic game, inverse problems

Procedia PDF Downloads 509
27145 Systematic Review of Misconceptions: Tools for Diagnostics and Remediation Models for Misconceptions in Physics

Authors: Muhammad Iqbal, Edi Istiyono

Abstract:

Misconceptions are one of the problems in physics learning, where students' understanding is not in line with scientific theory. The aim of this research is to identify diagnostic tools for detecting misconceptions and ways to remediate physics misconceptions. The articles reviewed come from the Scopus database and relate to physics misconceptions from 2013-2023. The articles obtained were selected according to the PRISMA model, yielding 29 articles focused on physics misconceptions, especially diagnostic tools and remediation methods. Currently, the most widely used diagnostic tool is the four-tier test, which is able to measure students' misconceptions in depth by establishing whether students are guessing; there is also a trend toward five-tier diagnostic tests, which additionally capture the source of the student's information, so that the origin of misconceptions is known. The review identified 11 ways to remediate student misconceptions; one of them is digital practicums, through which abstract concepts can be visualized as real ones. This research is limited to identifying the tools used to diagnose and remediate misconceptions, so the size of the effect of remediation methods on misconceptions is not yet known. The researchers recommend that further research be carried out to find the most appropriate remediation method for student misconceptions.

Keywords: misconception, remediation, systematic review, tools

Procedia PDF Downloads 31
27144 Assessment of Procurement-Demand of Milk Plant Using Quality Control Tools: A Case Study

Authors: Jagdeep Singh, Prem Singh

Abstract:

Milk is considered an essential and complete food. The present study was conducted at the Milk Plant, Mohali, with particular reference to the procurement section, where the cash flow was greatest, with the objective of achieving higher productivity and reducing wastage of milk. It was observed that from January 2014 to March 2014 the average procurement of milk was 4,19,361 litres per month, at a procurement cost of Rs. 35 per litre; the total cost of procurement was thus about Rs. 1 crore 46 lakh per month. However, there was a mismatch between procurement and production of milk, which led to an average loss of Rs. 12,94,405 per month. To solve the procurement-production problem, quality control tools such as brainstorming, flow charts, cause-and-effect diagrams, and Pareto analysis were applied wherever applicable. With the successful implementation of quality control tools, an average saving of Rs. 4,59,445 per month was achieved.
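
A minimal sketch of the Pareto-analysis step: rank loss causes by contribution and find the "vital few" covering roughly 80% of the loss. The breakdown of causes below is invented for illustration (chosen to sum to the reported Rs. 12,94,405 monthly loss); the plant's actual categories are not given in the abstract:

```python
# Hedged sketch of a Pareto analysis over loss causes. Cause names and amounts
# are placeholders summing to the reported monthly loss, not the plant's data.
losses = {                                # monthly loss by cause (Rs., illustrative)
    "procurement-production mismatch": 700000,
    "spoilage in transit": 300000,
    "chilling centre downtime": 180000,
    "measurement errors": 80000,
    "other": 34405,
}
total = sum(losses.values())
cumulative = 0.0
for cause, amount in sorted(losses.items(), key=lambda kv: -kv[1]):
    cumulative += amount
    print(f"{cause:35s} {amount:8d}  cum {100 * cumulative / total:5.1f}%")
    if cumulative / total >= 0.8:          # stop once the vital few are covered
        break
```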

Keywords: milk, procurement-demand, quality control tools

Procedia PDF Downloads 526
27143 Automating Test Activities: Test Cases Creation, Test Execution, and Test Reporting with Multiple Test Automation Tools

Authors: Loke Mun Sei

Abstract:

Software testing has become a mandatory process for assuring software product quality. Hence, test management is needed in order to manage the test activities conducted in the software test life cycle. This paper discusses the challenges faced in the software test life cycle, and how the test processes and test activities, mainly test case creation, test execution, and test reporting, are managed and automated using several test automation tools, i.e. Jira, Robot Framework, and Jenkins.
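
A minimal sketch of wiring execution and reporting together from one script, using Robot Framework's Python entry point; the suite path, output directory, and the Jenkins usage pattern are illustrative assumptions, not the paper's setup:

```python
# Hedged sketch: run a Robot Framework suite programmatically and propagate the
# result to CI. Suite file and output directory are hypothetical placeholders.
from robot import run

rc = run(
    "tests/login.robot",     # hypothetical suite file
    outputdir="results",     # log.html / report.html / output.xml land here
    name="Smoke",
)
raise SystemExit(rc)         # non-zero return code fails the Jenkins build stage
```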

Keywords: test automation tools, test case, test execution, test reporting

Procedia PDF Downloads 577
27142 Uncertainty Quantification of Corrosion Anomaly Length of Oil and Gas Steel Pipelines Based on Inline Inspection and Field Data

Authors: Tammeen Siraj, Wenxing Zhou, Terry Huang, Mohammad Al-Amin

Abstract:

The high-resolution inline inspection (ILI) tool is used extensively in the pipeline industry to identify, locate, and measure metal-loss corrosion anomalies on buried oil and gas steel pipelines. Corrosion anomalies may occur singly (i.e. individual anomalies) or as clusters (i.e. a colony of corrosion anomalies). Although ILI technology has advanced immensely, there are measurement errors associated with the sizes of corrosion anomalies reported by ILI tools, due to limitations of the tools and their associated sizing algorithms and to the detection threshold of the tools (i.e. the minimum detectable feature dimension). Quantifying the measurement error in ILI data is crucial for corrosion management and for developing maintenance strategies that satisfy safety and economic constraints. Studies on the measurement error associated with the length of corrosion anomalies (in the longitudinal direction of the pipeline) have scarcely been reported in the literature and are investigated in the present study. Limitations in the ILI tool and in the clustering process can sometimes cause clustering error, defined as the error introduced during the clustering process by including or excluding a single anomaly or group of anomalies in or from a cluster. Clustering error has been found to be one of the biggest contributors to the relatively high uncertainties associated with ILI-reported anomaly length. As such, this study focuses on developing a consistent and comprehensive framework to quantify the measurement errors in the ILI-reported anomaly length by comparing the ILI data and corresponding field measurements for individual and clustered corrosion anomalies. The analysis carried out in this study is based on ILI and field measurement data for a set of anomalies collected from two segments of a buried natural gas pipeline currently in service in Alberta, Canada. The data analyses showed that the measurement error associated with the ILI-reported length of anomalies without clustering error, denoted as Type I anomalies, is markedly less than that for anomalies with clustering error, denoted as Type II anomalies. A methodology employing data mining techniques is further proposed to classify Type I and Type II anomalies based on the ILI-reported corrosion anomaly information.
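
A minimal sketch of the comparison underlying the framework: paired ILI and field lengths yield an error sample whose bias and scatter can be contrasted between the two anomaly types. All numbers are invented for illustration, not the Alberta data:

```python
# Hedged sketch: bias and scatter of ILI-reported anomaly length versus field
# measurement, computed separately for anomalies without clustering error
# (Type I) and with clustering error (Type II). Values are placeholders.
import statistics as st

pairs_type1 = [(45, 43), (60, 63), (30, 29), (75, 78)]  # (ILI mm, field mm)
pairs_type2 = [(120, 85), (95, 140), (200, 260)]        # clustering error suspected

def error_stats(pairs):
    errors = [ili - field for ili, field in pairs]
    return st.mean(errors), st.stdev(errors)

for name, pairs in [("Type I", pairs_type1), ("Type II", pairs_type2)]:
    bias, scatter = error_stats(pairs)
    print(f"{name}: bias {bias:+.1f} mm, std dev {scatter:.1f} mm")
    # Type II shows markedly larger scatter, mirroring the paper's finding
```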

Keywords: clustered corrosion anomaly, corrosion anomaly assessment, corrosion anomaly length, individual corrosion anomaly, metal-loss corrosion, oil and gas steel pipeline

Procedia PDF Downloads 306
27141 The Impact of Using Technology Tools on Preparing English Language Learners for the 21st Century

Authors: Ozlem Kaya

Abstract:

21st-century learners are energetic and tech-savvy, and the skills and the knowledge required in this century are complex and challenging. Therefore, teachers need to find new ways to appeal to the needs and interests of their students and meet the demands of the 21st century at the same time. One way to do so in English language learning has been to incorporate various technology tools into classroom practices. Although teachers think these practices are effective and their students enjoy them, students may have different perceptions. To find out what students think about the use of technology tools in terms of developing 21st-century skills and knowledge, this study was conducted at Anadolu University School of Foreign Languages. A questionnaire was administered to 40 students at elementary level. Afterward, semi-structured interviews were held with 8 students to provide deeper insight into their perceptions. The details of the findings of the study will be presented and discussed during the presentation.

Keywords: 21st century skills, technology tools, perception, English Language Learning

Procedia PDF Downloads 293
27140 Drawing, Design and Building Information Modelling (BIM): Embedding Advanced Digital Tools in the Academy Programs for Building Engineers and Architects

Authors: Vittorio Caffi, Maria Pignataro, Antonio Cosimo Devito, Marco Pesenti

Abstract:

This paper deals with the integration of advanced digital design and modelling tools and methodologies, known as Building Information Modelling (BIM), into traditional academic educational programs for building engineers and architects. Nowadays, the challenge the academy has to face is to present the new tools and their features to the pupils while making sure they acquire the proper skills to leverage the potential these tools offer in the other courses embedded in the curriculum. The syllabus presented here refers to the “Drawing for building engineering”, “2D and 3D laboratory”, and “3D modelling” curricula of the MSc in Building Engineering of the Politecnico di Milano. Such topics, included since the first year of the MSc program, are fundamental to giving students the instruments to master the complexity of an architectural or building engineering project with digital tools, so as to represent it in its various forms.

Keywords: BIM, BIM curricula, computational design, digital modelling

Procedia PDF Downloads 665
27139 Generic Data Warehousing for Consumer Electronics Retail Industry

Authors: S. Habte, K. Ouazzane, P. Patel, S. Patel

Abstract:

The dynamic and highly competitive nature of the consumer electronics retail industry means that businesses in this industry face various decision-making challenges in relation to pricing, inventory control, consumer satisfaction, and product offerings. To overcome the challenges facing retailers and create opportunities, we propose a generic data warehousing solution that can be applied to a wide range of consumer electronics retailers with minimal configuration. The solution includes a dimensional data model, a template SQL script, a high-level architectural description, an ETL tool developed using C#, a set of APIs, and data access tools. It has been successfully applied by ASK Outlets Ltd (UK), resulting in improved productivity and enhanced sales growth.
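
A minimal sketch of what a dimensional (star-schema) model for this domain could look like, created in SQLite for illustration; the table and column names are assumptions, not the template shipped with the solution:

```python
# Hedged sketch of a star schema: a sales fact table keyed to product, store,
# and date dimensions. All names are illustrative; the paper's template differs.
import sqlite3

ddl = """
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, brand TEXT, category TEXT);
CREATE TABLE dim_store   (store_key   INTEGER PRIMARY KEY, city TEXT);
CREATE TABLE dim_date    (date_key    INTEGER PRIMARY KEY, day TEXT, month TEXT);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    store_key   INTEGER REFERENCES dim_store(store_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    units_sold  INTEGER,
    revenue     REAL
);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(ddl)   # an ETL tool (C# in the paper) would load these tables
```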

Keywords: consumer electronics, data warehousing, dimensional data model, generic, retail industry

Procedia PDF Downloads 407
27138 Analysis of Tools for Revitalization and Rehabilitation of Brownfields

Authors: Jiří Kugl

Abstract:

The typology of brownfields and the specific opportunities for their revitalization have already been largely described; the challenges and opportunities brownfields represent have been adequately studied and presented, as have the specific ways in which these areas can be used, or are used, abroad. In other words, the questions of why (revitalize brownfields) and what (we should do with them) are satisfactorily answered, but the question of how (we can work with them) is not. This work focuses on answering that question and deals with the tools that enable revitalization and rehabilitation projects in an area. Tools can be divided, for example, in terms of spatial planning and urban design, from an environmental perspective, from the perspective of cultural heritage protection, and from the perspective of investment opportunities. The result is that the issue of brownfields is handled by numerous institutions and instruments. The aim of this paper is to identify, classify, and analyze these instruments. The paper studies instruments from countries with long-term experience of this issue (e.g. France, Great Britain, the USA, Germany, Denmark, and the Czech Republic) and analyses their contribution and the feasibility of their implementation in other countries.

Keywords: brownfields, revitalization, rehabilitation, tools, urban planning

Procedia PDF Downloads 237
27137 Optimization of Manufacturing Process Parameters: An Empirical Study from Taiwan's Tech Companies

Authors: Chao-Ton Su, Li-Fei Chen

Abstract:

Parameter design is crucial to improving the uniformity of a product or process. In the product design stage, parameter design aims to determine the optimal settings for the parameters of each element in the system, thereby minimizing the functional deviations of the product. In the process design stage, parameter design aims to determine the operating settings of the manufacturing processes so that non-uniformity in manufacturing can be minimized. Parameter design, which tries to minimize the influence of noise on the manufacturing system, plays an important role in high-tech companies. Taiwan has many well-known high-tech companies, which play key roles in the global economy, and quality remains the most important factor enabling these companies to sustain their competitive advantage. In Taiwan, however, many high-tech companies face various quality problems. A common challenge relates to root causes and defect patterns: in the R&D stage, root causes are often unknown and defect patterns are difficult to classify. Additionally, data collection is not easy, and even when high-volume data can be collected, data interpretation is difficult. To overcome these challenges, high-tech companies in Taiwan are using more advanced quality improvement tools. In addition to traditional statistical methods and quality tools, the new trend is the application of powerful tools such as neural networks, fuzzy theory, data mining, industrial engineering, operations research, and innovation skills. In this study, several examples of optimizing the parameter settings for manufacturing processes in Taiwan’s tech companies are presented to illustrate the proposed approach’s effectiveness. Finally, the use of traditional experimental design versus the proposed approach for process optimization is discussed.
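
For context, the classic parameter-design calculation is the Taguchi signal-to-noise ratio, used to pick the factor settings most robust to noise. The three formulas below are textbook, while the run data and factor labels are invented placeholders:

```python
# Textbook Taguchi S/N ratios; higher S/N means a more robust setting.
# The example runs and labels are illustrative, not the paper's data.
import math

def sn_larger_better(y):   # e.g. yield, strength
    return -10 * math.log10(sum(1 / v ** 2 for v in y) / len(y))

def sn_smaller_better(y):  # e.g. defect count, wear
    return -10 * math.log10(sum(v ** 2 for v in y) / len(y))

def sn_nominal_best(y):    # target-the-value responses
    mean = sum(y) / len(y)
    var = sum((v - mean) ** 2 for v in y) / (len(y) - 1)
    return 10 * math.log10(mean ** 2 / var)

runs = {"A1B1": [98.2, 97.9, 98.5], "A1B2": [99.1, 99.0, 99.2]}
best = max(runs, key=lambda k: sn_larger_better(runs[k]))
print(best)   # factor setting with the highest (most robust) S/N
```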

Keywords: quality engineering, parameter design, neural network, genetic algorithm, experimental design

Procedia PDF Downloads 143
27136 Spatial Information and Urbanizing Futures

Authors: Mohammad Talei, Neda Ranjbar Nosheri, Reza Kazemi Gorzadini

Abstract:

Today, municipalities are searching for new tools to increase public participation at different levels of urban planning. This approach involves the community in the planning process through participatory methods instead of the long-standing top-down planning methods. Such tools can be used to capture the particular problems of urban furniture from the residents’ point of view. One of the tools designed with this goal is public participation GIS (PPGIS), which enables citizens to record and follow up on their perceptions and spatial knowledge of the city's main problems, specifically urban furniture, in the form of maps. However, despite the good intentions of PPGIS, its practical implementation in developing countries faces many problems, including the lack of basic supporting infrastructure and services and the unavailability of sophisticated public participatory models. In this research we develop a PPGIS using Web 2.0 to collect volunteered geodata and to perform spatial analysis based on Spatial OnLine Analytical Processing (SOLAP) and Spatial Data Mining (SDM). These tools provide urban planners with proper information regarding the type, spatial distribution, and clusters of reported problems. The system is implemented in a case study area in Tehran, Iran, and the challenges of making it applicable and its potential for real urban planning have been evaluated. It helps decision makers to better understand, plan, and allocate scarce resources for providing the most requested urban furniture.
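
A minimal sketch of the cluster-detection step over citizen reports, assuming DBSCAN on projected coordinates; the parameters and points are illustrative, not the Tehran case-study data:

```python
# Hedged sketch: density-based clustering of citizen-reported problem locations
# to find hot spots of urban-furniture complaints. Coordinates and DBSCAN
# parameters are invented placeholders.
from sklearn.cluster import DBSCAN
import numpy as np

reports = np.array([            # (x, y) of reported problems, projected metres
    [100, 200], [102, 198], [99, 203],   # dense pocket -> one hot spot
    [540, 610], [543, 608],              # second pocket
    [900, 50],                           # isolated report -> noise
])
labels = DBSCAN(eps=10, min_samples=2).fit_predict(reports)
print(labels)                   # e.g. [0 0 0 1 1 -1]; -1 marks noise
```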

Keywords: PPGIS, spatial information, urbanizing futures, urban planning

Procedia PDF Downloads 723
27135 Overview of the CRM Market in Tunisia

Authors: Mohamed Amine Bouraoui

Abstract:

The aim of this paper is to establish the importance of a CRM approach, to detect the degree to which Tunisian managers are aware of this importance, and to analyse the degree of integration of CRM in Tunisian companies. Initially, we focus on the definition and components of CRM; then we examine the level of integration of CRM within Tunisian enterprises.

Keywords: CRM, operational tools, analytical tools, Tunisian company

Procedia PDF Downloads 416
27134 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data

Authors: Adarsh Shroff

Abstract:

Big data is a collection of data sets so large and complex that it becomes difficult to process them using database management tools. Operations like search, analysis, and visualization on big data are performed using data mining, the process of extracting patterns or knowledge from large data sets. Over time, the results of data mining applications become stale and obsolete. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also more sophisticated iterative computation, which is widely used in data mining applications, and incorporates a set of novel techniques to reduce the I/O overhead of accessing preserved fine-grain computation states. To assess the mining results, we evaluate i2MapReduce using a one-step algorithm and three iterative algorithms with diverse computation characteristics.
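
A minimal sketch of key-value level incremental processing, with a word count standing in for the mining algorithm; i2MapReduce applies this idea inside MapReduce jobs with preserved on-disk state, whereas the sketch below is plain in-memory Python:

```python
# Hedged sketch: only keys touched by the delta are recomputed, while preserved
# state for untouched keys is reused. Function names here are illustrative.
from collections import Counter

state = Counter()                       # preserved fine-grain computation state

def initial_run(docs):
    for doc in docs:
        state.update(doc.split())

def incremental_run(new_docs, removed_docs=()):
    for doc in new_docs:                # apply only the delta
        state.update(doc.split())
    for doc in removed_docs:
        state.subtract(doc.split())

initial_run(["big data mining", "data streams"])
incremental_run(["incremental data mining"])   # no re-count of the old corpus
print(state.most_common(3))
```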

Keywords: big data, map reduce, incremental processing, iterative computation

Procedia PDF Downloads 345
27133 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods

Authors: A. Senthil Kumar, V. Murali Bhaskaran

Abstract:

In the information technology domain, people use various tools and software for official and personal purposes. Nowadays, buyers and sellers worry about choosing data access and extraction tools when trading their products, as well as about quality factors such as price, durability, colour, size, and availability of the product. The main purpose of this research study is to find solutions to these unsolved existing problems. The proposed algorithm is a Multidirectional Rank Prediction (MDRP) decision-making algorithm designed to support effective strategic decisions at all levels of data extraction; it uses a real-time textile dataset and analyzes the results. Finally, the results are obtained and compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of the existing rank prediction methods.
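
For reference, the PCC baseline the paper compares against can be sketched as user-user Pearson similarity over co-rated items; the tiny rating matrix below is an invented placeholder:

```python
# Hedged sketch of Pearson correlation (PCC) similarity between two users over
# their co-rated items, the classic collaborative-filtering baseline.
import math

ratings = {                          # user -> {item: rating}; invented data
    "u1": {"shirt": 5, "scarf": 3, "jeans": 4},
    "u2": {"shirt": 4, "scarf": 2, "jeans": 5},
}

def pearson(a, b):
    common = list(set(ratings[a]) & set(ratings[b]))
    xs = [ratings[a][i] for i in common]
    ys = [ratings[b][i] for i in common]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
    return num / den if den else 0.0

print(pearson("u1", "u2"))           # similarity in [-1, 1]
```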

Keywords: Knowledge Discovery in Database (KDD), Multidirectional Rank Prediction (MDRP), Pearson’s Correlation Coefficient (PCC), VSS (Vector Space Similarity)

Procedia PDF Downloads 283
27132 Survey on Arabic Sentiment Analysis in Twitter

Authors: Sarah O. Alhumoud, Mawaheb I. Altuwaijri, Tarfa M. Albuhairi, Wejdan M. Alohaideb

Abstract:

Large-scale data stream analysis has lately become one of the important business and research priorities. Social networks like Twitter and other micro-blogging platforms hold an enormous amount of data that is large in volume, velocity, and variety. Extracting valuable information and trends out of these data would aid in better understanding and decision-making. Multiple analysis techniques are deployed for English content. However, one of the languages that produces a large amount of data over social networks, and yet is least analyzed, is Arabic. This paper is a survey of the research efforts to analyze Arabic content on Twitter, focusing on the tools and methods used to extract the sentiments of Arabic content.

Keywords: big data, social networks, sentiment analysis, twitter

Procedia PDF Downloads 573