Search results for: hydrology tools
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3909

3669 Next-Viz: A Literature Review and Web-Based Visualization Tool Proposal

Authors: Railly Hugo, Igor Aguilar-Alonso

Abstract:

Software visualization is a powerful tool for understanding complex software systems. However, current visualization tools often lack features or are difficult to use, limiting their effectiveness. In this paper, we present next-viz, a proposed web-based visualization tool that addresses these challenges. We provide a literature review of existing software visualization techniques and tools and describe the architecture of next-viz in detail. Our proposed tool incorporates state-of-the-art visualization techniques and is designed to be user-friendly and intuitive. We believe next-viz has the potential to advance the field of software visualization significantly.

Keywords: software visualization, literature review, tool proposal, next-viz, web-based, architecture, visualization techniques, user-friendly, intuitive

Procedia PDF Downloads 60
3668 Disrupted or Discounted Cash Flow: Impact of Digitisation on Business Valuation

Authors: Matthias Haerri, Tobias Huettche, Clemens Kustner

Abstract:

This article discusses the impact of digitisation on business valuation. In order to become and remain 'digital', investments are necessary whose return on investment (ROI) often remains vague. This uncertainty sits uneasily with a valuation that relies on predictable cash flows, fixed capital structures, and a steady state. However, digitisation does not make company valuation impossible; rather, traditional approaches must be reconsidered. The authors identify four areas that are changing: (1) Tools instead of intuition: in the future, company valuation will be neither art nor science, but craft. This requires not intuition but experience and good tools; digital valuation tools beyond Excel will therefore gain in importance. (2) Real-time instead of deadline: at present, company valuations are carried out case by case for a specific key date. This will change with digitalisation and the introduction of web-based valuation tools. Company valuations can then be carried out not only faster and more efficiently, but also more frequently: instead of calculating the value for a past key date, current, real-time valuations become possible. (3) Predictive planning instead of analysis of the past: past data will still be needed in the future, but its use will not be limited to monovalent time series or key-figure analyses. The images of 'black swans' and the 'turkey illusion' have made clear that we build forecasts on too few data points from the past and underestimate the power of chance; predictive planning can help here. (4) Convergence instead of residual value: digital transformation shortens the lifespan of viable business models. If companies want to live forever, they have to change forever. For company valuation, this means that the business model valid on the valuation date has only a limited service life.

Keywords: business valuation, corporate finance, digitisation, disruption

Procedia PDF Downloads 105
3667 The Use of Network Tool for Brain Signal Data Analysis: A Case Study with Blind and Sighted Individuals

Authors: Cleiton Pons Ferreira, Diana Francisca Adamatti

Abstract:

Advances in computer technology have made it possible to obtain information for research in biology and neuroscience. Networks have long been used to represent important biological processes, and their use has shifted from purely illustrative and didactic to more analytic, including interaction analysis and hypothesis formulation. Many studies have involved this application, but not directly for the interpretation of data obtained from brain functions, which calls for new development perspectives in neuroinformatics that build on tools already disseminated by bioinformatics. This study analyses neurological data from electroencephalogram (EEG) signals using Cytoscape, an open-source software tool for visualizing complex networks in biological databases. The data were obtained from a comparative case study developed at the University of Rio Grande (FURG), using EEG signals from a Brain-Computer Interface (BCI) with 32 electrodes recorded from a blind and a sighted individual during the execution of an activity that stimulated spatial ability. This study intends to present results that lead to better ways of using and adapting techniques that support the treatment of brain-signal data, in order to improve understanding and learning in neuroscience.

Keywords: neuroinformatics, bioinformatics, network tools, brain mapping

Procedia PDF Downloads 136
3666 A Tool to Measure the Usability Guidelines for Arab E-Government Websites

Authors: Omyma Alosaimi, Asma Alsumait

Abstract:

Website developers and designers should follow usability guidelines to provide a user-friendly interface. Using tools to measure usability, an evaluator can automatically evaluate hundreds of links within a few minutes, with the additional advantage of detecting violations that only machines can detect. Using a usability evaluation tool is therefore important for finding as many violations as possible. Many website usability testing tools exist, but none has been developed to measure the usability of e-government websites in general or Arabic e-government websites in particular. To measure the usability of Arabic e-government websites, a tool is developed and tested in this paper. A comparison between a tool specifically developed for e-government websites and a general usability testing tool is presented.

Keywords: e-government, human computer interaction, usability evaluation, usability guidelines

Procedia PDF Downloads 396
3665 Applications of Analytical Probabilistic Approach in Urban Stormwater Modeling in New Zealand

Authors: Asaad Y. Shamseldin

Abstract:

The analytical probabilistic approach is an innovative approach to urban stormwater modeling. It can provide information about the long-term performance of a stormwater management facility without being computationally demanding. This paper explores the application of the analytical probabilistic approach in New Zealand. It presents the results of a case study aimed at developing an objective way of identifying what constitutes a rainfall storm event, and at estimating the corresponding statistical properties of storms, using two selected automatic rainfall stations located in the Auckland region of New Zealand. The storm identification and the estimation of the storm statistical properties are regarded as the first step in the development of analytical probabilistic models. The paper provides a recommendation about the definition of the storm inter-event time to be used in conjunction with the analytical probabilistic approach.
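The storm-identification idea discussed above can be illustrated with a short sketch: a dry spell at least as long as a chosen minimum inter-event time (MIT) splits a rainfall record into separate storms, after which per-storm volume and duration statistics follow directly. The hourly series and the 6-hour threshold below are hypothetical illustrations, not values from the study.

```python
def identify_storms(rain_mm, mit_hr=6):
    """Partition an hourly rainfall series into storm events.

    rain_mm: list of hourly rainfall depths (mm); zeros are dry hours.
    mit_hr:  minimum inter-event time; a dry spell at least this long
             separates two storms (a hypothetical threshold for illustration).
    Returns a list of (volume_mm, duration_hr) tuples, one per storm.
    """
    storms = []
    start = None      # index where the current storm began
    last_wet = None   # index of the most recent wet hour
    for i, depth in enumerate(rain_mm):
        if depth > 0:
            if start is None:
                start = i
            elif i - last_wet - 1 >= mit_hr:
                # dry spell long enough: close the previous storm
                storms.append((sum(rain_mm[start:last_wet + 1]),
                               last_wet - start + 1))
                start = i
            last_wet = i
    if start is not None:
        storms.append((sum(rain_mm[start:last_wet + 1]), last_wet - start + 1))
    return storms
```

Note that shorter dry gaps are absorbed into a single storm's duration, which is exactly why the choice of MIT matters for the resulting storm statistics.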

Keywords: hydrology, rainfall storm, storm inter-event time, New Zealand, stormwater management

Procedia PDF Downloads 309
3664 Assessment of Time-variant Work Stress for Human Error Prevention

Authors: Hyeon-Kyo Lim, Tong-Il Jang, Yong-Hee Lee

Abstract:

For an operator in a nuclear power plant, human error is one of the most dreaded factors that may result in unexpected accidents. The probability of human error may be low, but its risk can be enormous. Thus, for accident prevention, it is indispensable to analyze the influence of any factor that may raise the possibility of human error. Over the past decades, many research results have shown that the performance of human operators varies over time due to numerous factors. Among them, stress is known to be an indirect factor that may cause human errors and result in mental illness. Many assessment tools have been developed to assess the stress level of human workers. However, it is questionable whether they can be used to anticipate human performance, which relates to human error probability, because they were developed mainly from the viewpoint of mental health rather than industrial safety. A person's stress level may go up or down with work time; if these tools are to be applicable in the safety context, they should at least be able to assess the variation caused by work time. Therefore, this study compared their applicability for safety purposes. More than ten work stress tools were analyzed with reference to assessment items, assessment and analysis methods, and follow-up measures, which are known to be factors closely related to work stress. The results showed that most tools concentrate their weights on a few common organizational factors, such as demands, support, and relationships, in that sequence, and their weightings were broadly similar. However, they failed to recommend practical solutions; instead, they merely advised setting up overall counterplans within a PDCA cycle or risk management activities, which is far from practical human error prevention.
It was therefore concluded that stress assessment tools developed mainly for mental health are impractical for anticipating human performance for safety purposes, and that development of a new assessment tool is inevitable if one wants to assess stress level in terms of human performance variation and accident prevention. As a practical counterplan, this study proposes a new scheme for assessing the time-variant work stress level of a human operator, which is closely related to the probability of human error.

Keywords: human error, human performance, work stress, assessment tool, time-variant, accident prevention

Procedia PDF Downloads 647
3663 Regionalization of IDF Curves by Interpolating Intensity and Adjustment Parameters: Application to Boyacá, Colombia

Authors: Pedro Mauricio Acosta, Carlos Andrés Caro

Abstract:

This research presents the regionalization of IDF (intensity-duration-frequency) curves for the department of Boyacá, Colombia, which comprises 16 towns, including the provincial capital, Tunja. For regionalization by parameters, the adjustment parameters (u and alpha) of the IDF curves at the stations in the study area were used; a similar regionalization was carried out by interpolating intensities. For the parameter-based regionalization, the intensity-duration-frequency curves were constructed using the estimation methods of ordinary moments and maximum likelihood. Regionalization and interpolation of the data were performed with ArcGIS. Within the project, the aim was to determine which of the regionalization options provides the best level of reliability. For the intensity regionalization, isoline maps were produced, each associated with a different return period and duration, in order to build IDF curves across the study area. For the parameter regionalization, a map associated with each parameter was produced.
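The adjustment parameters (u and alpha) mentioned above correspond to the location and scale of a Gumbel-type distribution, which can be estimated from an annual-maximum intensity series by the method of ordinary moments, one of the two estimation methods the paper uses. A minimal sketch follows; the sample series is hypothetical.

```python
import math
from statistics import mean, stdev

def gumbel_moments(annual_max):
    """Method-of-moments estimates of the Gumbel parameters (u, alpha)."""
    alpha = math.sqrt(6) * stdev(annual_max) / math.pi   # scale parameter
    u = mean(annual_max) - 0.5772 * alpha                # location (Euler-Mascheroni constant)
    return u, alpha

def idf_intensity(u, alpha, T):
    """Gumbel quantile: design intensity for return period T (years)."""
    return u - alpha * math.log(-math.log(1 - 1 / T))
```

Fitting u and alpha per station and then interpolating them spatially is what produces the parameter maps described in the abstract; evaluating the quantile for several return periods and durations produces the intensity isolines.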

Keywords: intensity duration, frequency curves, regionalization, hydrology

Procedia PDF Downloads 306
3662 An Agile, Intelligent and Scalable Framework for Global Software Development

Authors: Raja Asad Zaheer, Aisha Tanveer, Hafza Mehreen Fatima

Abstract:

Global Software Development (GSD) is becoming a common norm in the software industry, despite the fact that global distribution of teams presents special issues for their effective communication and coordination. Trends are now changing, and project management for distributed teams is no longer in limbo. GSD can be effectively established using agile methods, and project managers can use different agile techniques and tools to solve the problems associated with distributed teams. Agile methodologies like Scrum and XP have been used successfully with distributed teams. We employed an exploratory research method to analyze recent studies on the challenges of GSD and their proposed solutions. In our study, we looked in depth at six commonly faced challenges: communication and coordination, temporal differences, cultural differences, knowledge sharing/group awareness, speed, and communication tools. We established that none of these challenges can be neglected for distributed teams of any kind; they are interlinked and, as an aggregated whole, can cause the failure of projects. In this paper, we focus on creating a scalable framework for detecting and overcoming these commonly faced challenges. Our objective is to suggest agile techniques and tools relevant to the particular problems organizations face in managing distributed teams. We focus mainly on Scrum and XP techniques and tools because they are widely accepted and used in industry. Our solution identifies the problem and suggests an appropriate technique or tool to help solve it, based on a globally shared knowledge base. A cause-and-effect relationship can be established using a fishbone diagram built from the inputs provided for commonly faced issues; based on the identified cause, the framework suggests a suitable tool.
Hence, the proposed scalable, extensible, self-learning, intelligent framework will help organizations implement and assess GSD and get the maximum out of it. The globally shared knowledge base will help new organizations easily adopt the best practices set forth by practicing organizations.

Keywords: agile project management, agile tools/techniques, distributed teams, global software development

Procedia PDF Downloads 272
3661 Analyses of Reference Evapotranspiration in West of Iran under Climate Change

Authors: Saeed Jahanbakhsh Asl, Yaghob Dinpazhoh, Masoumeh Foroughi

Abstract:

Reference evapotranspiration (ET₀) is an important element of the water cycle that integrates atmospheric demands and surface conditions, and analysis of changes in ET₀ is of great significance for understanding climate change and its impacts on hydrology. As ET₀ is an integrated effect of climate variables, increases in air temperature should lead to increases in ET₀. ET₀ was estimated using the globally accepted Food and Agriculture Organization (FAO) Penman-Monteith (FAO-56 PM) method at 18 meteorological stations located in the west of Iran. The trends of ET₀ were detected using the Mann-Kendall (MK) test, and the slopes of the trend lines were computed using Sen's slope estimator. The results showed significant increasing as well as decreasing trends in annual and monthly ET₀; however, the majority of trends were increasing. At the monthly scale, increasing trends outnumbered decreasing trends in most of the warm months of the year.
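The trend machinery the paper applies (the Mann-Kendall S statistic with its normal approximation, and Sen's slope as the median of pairwise slopes) can be sketched in a few lines. This simplified version assumes no tied values, so it omits the tie correction in the variance.

```python
import math
from statistics import median

def mann_kendall(series):
    """Mann-Kendall trend test: S statistic and normal approximation Z
    (no-ties variance; a simplified sketch of the test used in the paper)."""
    n = len(series)
    # S sums the signs of all forward pairwise differences
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

def sens_slope(series):
    """Sen's slope estimator: median of all pairwise slopes."""
    return median((series[j] - series[i]) / (j - i)
                  for i in range(len(series) - 1)
                  for j in range(i + 1, len(series)))
```

At the usual 5% significance level, |Z| > 1.96 marks a trend as significant, which is how a station's annual or monthly ET₀ series would be classified as significantly increasing or decreasing.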

Keywords: climate change, Mann–Kendall, Penman-Monteith method (FAO-56 PM), reference crop evapotranspiration

Procedia PDF Downloads 260
3660 Application of Facilities Management Practice in High Rise Commercial Properties: Jos in Perspective

Authors: Aliyu Ahmad Aliyu, Abubakar Ahmad, Muhammad Umar Bello, Rozilah Kasim, David Martin

Abstract:

This article studied the application of facilities management practice in high-rise commercial properties. A convenience sampling technique was used in administering questionnaires to the 60 respondents who responded to the survey. It was found that the extent of application of facilities management in the subject properties is best described as below average. The most frequently used facilities management approaches in the properties were outsourcing and in-house sourcing, which was influenced by the respondents' level of familiarity with these tools. Planned and preventive maintenance should be undertaken regularly in order to enhance the effectiveness of facilities management and to satisfy both the owner and the customers of the organization.

Keywords: commercial properties, facilities management, high-rise buildings, Jos metropolis, outsourcing

Procedia PDF Downloads 505
3659 Optimal Resource Configuration and Allocation Planning Problem for Bottleneck Machines and Auxiliary Tools

Authors: Yin-Yann Chen, Tzu-Ling Chen

Abstract:

This study presents the case of an actual Taiwanese semiconductor assembly and testing manufacturer. Three major bottleneck manufacturing processes, namely, die bond, wire bond, and molding, are analyzed to determine how to use finite resources to achieve the optimal capacity allocation. A medium-term capacity allocation planning model is developed by considering the optimal total profit to satisfy the promised volume demanded by customers and to obtain the best migration decision among production lines for machines and tools. Finally, sensitivity analysis based on the actual case is provided to explore the effect of various parameter levels.

Keywords: capacity planning, capacity allocation, machine migration, resource configuration

Procedia PDF Downloads 433
3658 Use of Nutritional Screening Tools in Cancer-Associated Malnutrition

Authors: Meryem Saban Guler, Saniye Bilici

Abstract:

Malnutrition is a problem that significantly affects patients with cancer throughout the course of their illness; it may be present from the moment of diagnosis until the end of treatment. We searched electronic databases using key terms such as 'malnutrition in cancer patients', 'nutritional status in cancer', and 'nutritional screening tools'. Decline in nutritional status and continuing weight loss are associated with an increase in the number and severity of complications, impaired quality of life, and decreased survival. Nutrition is an important factor in the treatment and progression of cancer. Cancer patients are particularly susceptible to nutritional depletion due to the combined effects of the malignant disease and its treatment. With the increasing incidence of cancer, identification and management of nutritional deficiencies are needed. Early identification of malnutrition is essential to minimize or prevent undesirable outcomes throughout the clinical course. Nutritional status is determined using food consumption data, anthropometric methods, laboratory tests, clinical symptoms, and psychosocial data. First-line strategies must include routine screening and identification of inpatients or outpatients at nutritional risk using a simple, standardized screening tool. There is agreement among international nutrition organizations and accredited health care organizations that routine nutritional screening should be a standard procedure for every patient admitted to a hospital. Routine nutritional screening with validated tools can therefore identify cancer patients at risk.

Keywords: cancer, malnutrition, nutrition, nutritional screening

Procedia PDF Downloads 171
3657 The Effect of Artificial Intelligence on Marketing Distribution

Authors: Yousef Wageh Nagy Fahmy

Abstract:

Mobile phones are one of the direct marketing tools used to reach today's hard-to-reach consumers. Mobile phones are very personal devices that consumers carry with them anytime, anywhere, which offers marketers the opportunity to create personalized marketing messages and send them at the right time and place. The study examined consumer attitudes towards mobile marketing, particularly SMS marketing. Unlike similar studies, this study does not focus on young people but includes consumers between the ages of 18 and 70 in the field study. The results showed that the majority of participants found SMS marketing disruptive. The biggest problems with SMS marketing are subscription to message lists without the recipient's consent, the large number of messages sent, and the irrelevance of message content.

Keywords: direct marketing, mobile phones, mobile marketing, SMS advertising, marketing sponsorship, marketing communication theories, marketing communication tools

Procedia PDF Downloads 40
3656 Didactical and Semiotic Affordance of GeoGebra in a Productive Mathematical Discourse

Authors: Isaac Benning

Abstract:

Using technology to expand the learning space is critical for a productive mathematical discourse. This is a case study of two teachers who developed and enacted GeoGebra-based mathematics lessons following their engagement in a two-year professional development programme. The didactical and semiotic affordances of GeoGebra in widening the learning space for a productive mathematical discourse were explored. Thematic analysis was applied to lesson artefacts, lesson observations, and interview data. The results indicated that constructing tools in GeoGebra provided a didactical milieu in which students explored mathematical concepts with little or no support from their teacher. The prompt feedback from GeoGebra motivated students to practice mathematical concepts repeatedly, privately rethinking their solutions before comparing their answers with those of their colleagues. The constructed tools enhanced self-discovery, team spirit, and dialogue among students. With regard to the semiotic construct, the tools widened the physical and psychological atmosphere of the classroom by providing animations that served as virtual concrete materials, enhancing the recording, manipulation, and testing of mathematical ideas and the construction and interpretation of geometric objects. These findings advance the discussion of widening the classroom for a productive mathematical discourse within the context of the mathematics curriculum of Ghana and similar Sub-Saharan African countries.

Keywords: GeoGebra, theory of didactical situation, semiotic mediation, mathematics laboratory, mathematical discussion

Procedia PDF Downloads 98
3655 Detecting Critical Thinking Skills in Written Text Analysis: The Use of Artificial Intelligence in Text Analysis vs. ChatGPT

Authors: Lucilla Crosta, Anthony Edwards

Abstract:

Companies and the marketplace nowadays struggle to find employees with skills adequate for the anticipated growth of their businesses. At least half of workers will need to undertake some form of up-skilling in the next five years in order to remain aligned with the demands of the market. To meet these challenges, there is a clear need to explore the potential uses of AI (artificial intelligence)-based tools in assessing the transversal skills (critical thinking, communication, and soft skills of various types) of workers and adult students, while empowering them to develop those same skills in a reliable, trustworthy way. Companies seek workers with key transversal skills that can make a difference between workers now and in the future. Critical thinking, however, seems to be one of the most important of these skills, bringing unexplored ideas and company growth in business contexts. What employers have been reporting for years is that this skill is lacking in the majority of workers and adult students, and this is particularly visible through their writing. This paper investigates how critical thinking and communication skills are currently developed at the postgraduate level in Higher Education environments through the use of AI tools. It analyses the use of a branch of AI, namely machine learning with big data, and of neural network analysis. It also examines the potential effect of acquiring these skills through AI tools and the consequences for employability. The paper draws on researchers and studies at both the national (Italy and UK) and international levels in Higher Education. The issues associated with the development and use of one specific AI tool, Edulai, are examined in detail. Finally, comparisons are made between these tools and the more recent phenomenon of ChatGPT, and their strengths and drawbacks are analysed.

Keywords: critical thinking, artificial intelligence, higher education, soft skills, chat GPT

Procedia PDF Downloads 76
3654 Property of Diamond Coated Tools for Lapping Single-Crystal Sapphire Wafer

Authors: Feng Wei, Lu Wenzhuang, Cai Wenjun, Yu Yaping, Basnet Rabin, Zuo Dunwen

Abstract:

Diamond coatings were prepared on cemented carbide by the hot filament chemical vapor deposition (HFCVD) method. A lapping experiment on a single-crystal sapphire wafer was carried out using the prepared diamond-coated tools. The diamond coatings and the machined surface of the sapphire wafer were evaluated by SEM, laser confocal microscopy, and Raman spectroscopy. The results indicate that the lapped sapphire chips consist of small irregular debris and long thread-like debris. Graphitization of the diamond crystals occurs during the lapping process. A low surface roughness can be obtained using a diamond-coated tool with spherical grains.

Keywords: lapping, nano-micro crystalline diamond coating, Raman spectrum, sapphire

Procedia PDF Downloads 464
3653 Development of Academic Software for Medial Axis Determination of Porous Media from High-Resolution X-Ray Microtomography Data

Authors: S. Jurado, E. Pazmino

Abstract:

Determination of the medial axis of a porous media sample is a non-trivial problem of interest for several disciplines, e.g., hydrology, fluid dynamics, contaminant transport, filtration, and oil extraction. However, the computational tools available to researchers are limited and restricted. The primary aim of this work was to develop a series of algorithms to extract porosity, medial axis structure, and pore-throat size distributions from porous media domains. A complementary objective was to provide the algorithms as free computational software available to the academic community of researchers and students interested in 3D data processing. The burn algorithm was tested on porous media data obtained from High-Resolution X-Ray Microtomography (HRXMT) and on idealized computer-generated domains. The real data and idealized domains were discretized into voxel domains of 550³ elements and binarized to denote solid and void regions, from which porosity was determined. The algorithm then identifies the layer of void voxels next to the solid boundaries. An iterative process removes, or 'burns', void voxels layer by layer until all the void space is characterized. Multiple strategies were tested to optimize execution time and computer memory use, i.e., segmentation of the overall domain into subdomains, vectorization of operations, and extraction of single burn-layer data during the iterative process. The medial axis was determined by identifying regions where burnt layers collide. The final medial axis structure was refined to avoid concave-grain effects and used to determine the pore-throat size distribution. A graphical user interface was developed to encompass all these algorithms, including the generation of idealized porous media domains. The software accepts HRXMT data as input, calculates porosity, medial axis, and pore-throat size distribution, and provides output in tabular and graphical formats.
Preliminary tests of the software achieved medial axis, pore-throat size distribution, and porosity determination of 100³, 320³, and 550³ voxel porous media domains in 2, 22, and 45 minutes, respectively, on a personal computer (Intel i7 processor, 16 GB RAM). These results indicate that the software is a practical and accessible tool for postprocessing HRXMT data in the academic community.
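The layer-by-layer burn described above can be sketched in 2D (the paper works in 3D voxel domains; this smaller analogue treats the domain edge as solid, purely an assumption for illustration). Cells where the highest burn numbers occur, where burnt layers collide, are the medial-axis candidates.

```python
def burn(grid):
    """Layer-by-layer 'burn' of void cells: a 2D sketch of the burn algorithm.

    grid: list of lists with 1 = solid and 0 = void (pore space).
    Returns a matrix of burn numbers: -1 for solid, 1 for void cells on the
    first layer next to a solid boundary, 2 for the next layer inward, etc.
    """
    rows, cols = len(grid), len(grid[0])
    burnt = [[-1 if grid[r][c] else 0 for c in range(cols)] for r in range(rows)]
    remaining = sum(row.count(0) for row in burnt)
    layer = 0
    while remaining:
        layer += 1
        front = []
        for r in range(rows):
            for c in range(cols):
                if burnt[r][c] != 0:
                    continue  # solid or already burnt
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    nr, nc = r + dr, c + dc
                    # burn when a neighbour is solid, already burnt, or outside
                    if not (0 <= nr < rows and 0 <= nc < cols) or burnt[nr][nc] != 0:
                        front.append((r, c))
                        break
        for r, c in front:
            burnt[r][c] = layer  # assign after the sweep so layers stay synchronous
        remaining -= len(front)
    return burnt
```

The optimizations the paper reports (subdomain segmentation, vectorization, single-layer extraction) address exactly the cost of the full-domain sweep this naive version performs on every iteration.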

Keywords: medial axis, pore-throat distribution, porosity, porous media

Procedia PDF Downloads 92
3652 Application of Systems Engineering Tools and Methods to Improve Healthcare Delivery Inside the Emergency Department of a Mid-Size Hospital

Authors: Mohamed Elshal, Hazim El-Mounayri, Omar El-Mounayri

Abstract:

The emergency department (ED) can be considered a complex system of interacting entities: patients, human resources, software and hardware systems, interfaces, and other systems. This paper presents research on implementing a detailed Systems Engineering (SE) approach in a mid-size hospital in central Indiana. The methodology is applied by 'The Initiative for Product Lifecycle Innovation (IPLI)' institution at Indiana University to study and solve the crowding problem, with the aim of increasing patient throughput and enhancing the treatment experience; the nature of the crowding problem is therefore investigated together with the other problems that lead to it. The SE methods presented are workflow analysis and systems modeling, where SE tools such as Microsoft Visio are used to construct a group of system-level diagrams that capture patient workflow, documentation and communication flow, data systems, human resources workflow and requirements, the leadership involved, and integration between the ED's different systems. The ultimate goal is to manage the process through implementation of an executable model using commercial software tools, which will identify bottlenecks, improve documentation flow, and help make the process faster.

Keywords: systems modeling, ED operation, workflow modeling, systems analysis

Procedia PDF Downloads 154
3651 Application Potential of Selected Tools in Context of Critical Infrastructure Protection and Risk Analysis

Authors: Hromada Martin

Abstract:

Risk analysis is considered a fundamental aspect of ensuring the level of critical infrastructure protection, where critical infrastructure is seen as a system, asset, or part thereof that is important for maintaining vital societal functions. This article discusses and analyzes the potential application of selected information-support tools within the framework of risk analysis and critical infrastructure protection. The use of such information in risk analysis can be viewed as a way of simplifying the analytical process. Since the instruments available for these purposes are countless, representatives were selected that have already been applied in the critical infrastructure domain or can be used there. All the presented facts formed the basis for the development of a critical infrastructure resilience evaluation methodology.

Keywords: critical infrastructure, protection, resilience, risk analysis

Procedia PDF Downloads 605
3650 Evaluation and Assessment of Bioinformatics Methods and Their Applications

Authors: Fatemeh Nokhodchi Bonab

Abstract:

Bioinformatics, in its broad sense, involves the application of computer processes to solve biological problems. A wide range of computational tools is needed to effectively and efficiently process the large amounts of data being generated by recent technological innovations in biology and medicine. A number of computational tools have been developed or adapted to deal with the experimental riches of complex and multivariate data and the transition from data collection to information and knowledge. These bioinformatics tools are being evaluated and applied in various medical areas, including early detection, risk assessment, classification, and prognosis of cancer. The goal of these efforts is to develop and identify bioinformatics methods with optimal sensitivity, specificity, and predictive capability. The recent flood of data from genome sequences and functional genomics has given rise to a new field, bioinformatics, which combines elements of biology and computer science. Bioinformatics conceptualizes biology in terms of macromolecules (in the physical-chemistry sense) and then applies 'informatics' techniques (derived from disciplines such as applied mathematics, computer science, and statistics) to understand and organize the information associated with these molecules on a large scale. Here we propose a definition for this new field and review some of the research being pursued, particularly in relation to transcriptional regulatory systems.

Keywords: methods, applications, transcriptional regulatory systems, techniques

Procedia PDF Downloads 97
3649 Measuring Ecological Footprint: Life Cycle Assessment Approach

Authors: Binita Shah, Seema Unnikrishnan

Abstract:

In recent years, society has shown a growing interest in analyzing and reducing the environmental impacts generated by man-made activities. Industries are expressing this concern by redesigning and amending their operational processes, upgrading technologies, and adjusting financial inputs to achieve better environmental performance. Various tools are available for assessing the impact of processes and the production of goods on the environment; most methods examine a particular impact on the ecosystem. Life Cycle Assessment (LCA) is one of the most widely accepted and scientifically founded methodologies for assessing the overall environmental impacts of products and processes. This paper examines the tools used in India for environmental impact assessment.

Keywords: life cycle assessment, ecological footprint, measuring sustainability, India

Procedia PDF Downloads 625
3648 Nonparametric Copula Approximations

Authors: Serge Provost, Yishan Zang

Abstract:

Copulas are currently utilized in finance, reliability theory, machine learning, signal processing, geodesy, hydrology, and biostatistics, among several other fields of scientific investigation. It follows from Sklar's theorem that the joint distribution function of a multidimensional random vector can be expressed in terms of its associated copula and marginals. Since marginal distributions can easily be determined by making use of a variety of techniques, we address the problem of securing the distribution of the copula. This will be done using several approaches. For example, we will obtain bivariate least-squares approximations of the empirical copulas, modify the kernel density estimation technique and propose a criterion for selecting appropriate bandwidths, differentiate linearized empirical copulas, secure Bernstein polynomial approximations of suitable degrees, and apply a corollary to Sklar's result. Illustrative examples involving actual observations will be presented. The proposed methodologies will also be applied to a sample generated from a known copula distribution in order to validate their effectiveness.
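As an illustration of the rank-based construction underlying the empirical copulas discussed above, the following minimal Python sketch (the function names and the simulated sample are illustrative, not the authors' code) forms pseudo-observations and evaluates the empirical copula at a point:

```python
import numpy as np

def pseudo_observations(x, y):
    """Rank-transform a bivariate sample onto the unit square,
    approximating the (unknown) marginal distributions."""
    n = len(x)
    u = (np.argsort(np.argsort(x)) + 1) / (n + 1)
    v = (np.argsort(np.argsort(y)) + 1) / (n + 1)
    return u, v

def empirical_copula(u0, v0, u, v):
    """Empirical copula C_n(u0, v0) = (1/n) * #{i : U_i <= u0, V_i <= v0}."""
    return float(np.mean((u <= u0) & (v <= v0)))

# Comonotone toy sample: for y = x the true copula is the upper
# Fréchet bound, C(u, v) = min(u, v).
x = np.linspace(0.0, 1.0, 500)
u, v = pseudo_observations(x, x.copy())
c = empirical_copula(0.5, 0.5, u, v)
```

For the comonotone sample, `c` approximates min(0.5, 0.5) = 0.5; on real data, these pseudo-observations would be the input to the least-squares or Bernstein-polynomial approximations described in the abstract.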

Keywords: copulas, Bernstein polynomial approximation, least-squares polynomial approximation, kernel density estimation, density approximation

Procedia PDF Downloads 45
3647 A Pathway of Collaborative Platform to Assess the Sustainable University

Authors: S. K. Ashiquer Rahman

Abstract:

The paper concentrates on the importance of sustainable campus strategies, emphasizing the role of innovative technological tools in building an effective higher education strategy and institutional cooperation for a sustainable campus at the university level, and in preparing university authorities for the difficulties the sustainable campus plan will face. Within this framework, the paper discusses a set of reference points that lead to operational activities for a multi-stakeholder, multi-criteria evaluation of higher education and research institutions against sustainable campus criteria, together with a potential action plan for the university's strategy and institutional cooperation. It notes the emergence of effective higher education strategy and institutional cooperation, as well as the necessity of mobilizing innovative technological methods and tools to support this process. The paper outlines the conceptual framing of a sustainable campus strategy, institutional cooperation, and an action plan for a sustainable campus. Optimistically, these will be a milestone in higher education: a pathway to meet the imminent demands of sustainable campus strategy and institutional cooperation in a competitive world, and a means of managing the criteria required for a sustainable university.

Keywords: higher education strategy, institutional cooperation, sustainable campus, multi-criteria evaluation, innovative method and tools

Procedia PDF Downloads 52
3646 From User's Requirements to UML Class Diagram

Authors: Zeineb Ben Azzouz, Wahiba Ben Abdessalem Karaa

Abstract:

The automated extraction of UML class diagrams from natural language requirements is a highly challenging task. Many approaches, frameworks, and tools have been presented in this field. Nonetheless, experiments with these tools have shown that no single approach works best all the time. In this context, we propose a new, accurate approach to facilitate the automatic mapping from textual requirements to a UML class diagram. Our approach integrates the best properties of statistical Natural Language Processing (NLP) techniques to reduce ambiguity when analysing natural language requirements text. In addition, it follows best practices defined by conceptual modelling experts to determine patterns indispensable for extracting the basic elements and concepts of the class diagram. Once the relevant information for the class diagram is captured, an XMI document is generated and imported into a CASE tool to build the corresponding UML class diagram.
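To make the mapping idea concrete, here is a deliberately naive, hypothetical sketch (the regular-expression pattern and the sample sentences are invented for illustration; they are not the statistical NLP pipeline proposed in the paper) of how subject-verb-object patterns in requirements text can yield candidate classes and associations:

```python
import re

REQUIREMENT = (
    "The library manages books. A member borrows a book. "
    "The librarian registers a member."
)

# Naive pattern: "<Det> <subject> <verb>s [a|an|the] <object>".
# Subject and object become candidate classes, the verb a candidate
# association; a real pipeline would use POS tagging and parsing instead.
PATTERN = re.compile(r"(?:The|A|An)\s+(\w+)\s+(\w+?)s\s+(?:a\s+|an\s+|the\s+)?(\w+)")

classes, associations = set(), []
for subj, verb, obj in PATTERN.findall(REQUIREMENT):
    classes.update({subj.capitalize(), obj.capitalize()})
    associations.append((subj.capitalize(), verb, obj.capitalize()))
```

From the captured classes and associations, the corresponding XMI elements (e.g. `<uml:Class>` entries) could then be emitted for import into a CASE tool.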

Keywords: class diagram, user’s requirements, XMI, software engineering

Procedia PDF Downloads 446
3645 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution

Authors: Najrullah Khan, Athar Ali Khan

Abstract:

The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone generalized exponential (TPGE) distribution; a real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS code has been provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data. Emphasis is placed on the modelling of data and the interpretation of results; crucial to this is an understanding of the nature of the incomplete, or 'censored', data encountered. Analytic approximation and simulation tools are both covered, but most of the emphasis is on Markov chain Monte Carlo methods, including the independence Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among the various optimization algorithms considered, the trust region method is found to be the best. The TPGE model is also used to analyze the lifetime data in the Bayesian paradigm, with results evaluated on the above-mentioned real survival data set. The analytic approximation and simulation methods are implemented using several software packages. It is clear from our findings that the simulation tools provide better results than those obtained by asymptotic approximation.
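As a self-contained illustration of the independence Metropolis algorithm mentioned above (a generic Python sketch with a standard normal target, not the authors' R/JAGS implementation):

```python
import numpy as np

def independence_metropolis(log_target, draw, log_proposal, n_iter, seed=1):
    """Independence Metropolis: proposals are drawn from a fixed
    distribution, independent of the current state, and accepted
    with the usual Metropolis-Hastings ratio."""
    rng = np.random.default_rng(seed)
    x = draw(rng)
    out = np.empty(n_iter)
    for i in range(n_iter):
        y = draw(rng)
        # log acceptance ratio for an independence proposal q(.)
        log_alpha = (log_target(y) - log_target(x)) + (log_proposal(x) - log_proposal(y))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        out[i] = x
    return out

# Toy target: standard normal posterior, with a wider normal proposal (sd = 2).
samples = independence_metropolis(
    log_target=lambda t: -0.5 * t**2,
    draw=lambda rng: 2.0 * rng.standard_normal(),
    log_proposal=lambda t: -0.5 * (t / 2.0) ** 2,
    n_iter=20000,
)
posterior_mean = samples[2000:].mean()  # discard burn-in
```

A heavier-tailed proposal than the target, as here, is the standard safeguard for independence samplers; in the survival-analysis setting the log-target would be the TPGE log-posterior with the censoring contribution included.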

Keywords: Bayesian inference, JAGS, Laplace approximation, LaplacesDemon, posterior, R software, simulation

Procedia PDF Downloads 503
3644 Influence of Thermal Damage on the Mechanical Strength of Trimmed CFRP

Authors: Guillaume Mullier, Jean François Chatelain

Abstract:

Carbon Fiber Reinforced Plastics (CFRPs) are widely used for advanced applications, in particular in the aerospace, automotive, and wind energy industries. Once cured to near net shape, CFRP parts need several finishing operations, such as trimming, milling, or drilling, in order to accommodate fastening hardware and meet final dimensions. The present research studies the effect of the cutting temperature during trimming on the mechanical strength of high-performance CFRP laminates used in aeronautics applications. The cutting temperature is of great importance when trimming CFRP: temperatures higher than the glass-transition temperature (Tg) of the resin matrix are highly undesirable, as they degrade the matrix at the trimmed edges, which can severely affect the mechanical performance of the entire component. In this study, a 9.50 mm diameter CVD diamond-coated carbide tool with six flutes was used to trim 24-ply CFRP laminates, at a cutting speed of 300 m/min and a feed rate of 1140 mm/min. The tool was heated prior to trimming using a blowtorch, for temperatures ranging from 20°C to 300°C, and the temperature at the cutting edge was measured using embedded K-type thermocouples. Samples trimmed at different cutting temperatures, below and above Tg, were mechanically tested using three-point bending and short-beam loading configurations. Both new and worn cutting tools were utilized in the experiments. The experiments with the new tools showed no correlation between the length of cut, the cutting temperature, and the mechanical performance: mechanical strength was constant regardless of the cutting temperature. However, for worn tools, which produced cutting temperatures of up to 450°C, thermal damage of the resin was observed. The mechanical tests showed a reduced mean resistance in the short-beam configuration, while the resistance in three-point bending decreased as the cutting temperature increased.

Keywords: composites, trimming, thermal damage, surface quality

Procedia PDF Downloads 305
3643 Modeling Sediment Yield Using the SWAT Model: A Case Study of Upper Ankara River Basin, Turkey

Authors: Umit Duru

Abstract:

The Soil and Water Assessment Tool (SWAT) was tested for the prediction of water balance and sediment yield in the gauged upper Ankara River basin, Turkey. The overall objective of this study was to evaluate the performance and applicability of SWAT in this region of Turkey. Thirteen years of monthly streamflow and suspended sediment data were used for calibration and validation. This research assessed model performance based on the differences between observed and predicted suspended sediment yields during the calibration (1987-1996) and validation (1982-1984) periods. Statistical comparisons of suspended sediment produced values for NSE (Nash-Sutcliffe efficiency), RE (relative error, %), and R² (coefficient of determination) of 0.81, -1.55, and 0.93, respectively, during the calibration period, and of 0.77, -2.61, and 0.87, respectively, during the validation period. Based on these analyses, SWAT satisfactorily simulated the observed hydrology and sediment yields and can be used as a decision-making tool for water resources planning and management in the basin.
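The goodness-of-fit statistics reported above can be computed as follows (a generic sketch; the sample arrays are invented for illustration and are not the study's data):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of residual variance
    to the variance of the observations (1.0 is a perfect fit)."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_error_pct(obs, sim):
    """Relative error (%) of total simulated load versus observed load."""
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

def r_squared(obs, sim):
    """Coefficient of determination: squared Pearson correlation."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Illustrative monthly suspended sediment yields (t/month), not the study's data.
obs = np.array([120.0, 95.0, 310.0, 480.0, 260.0, 140.0])
sim = np.array([110.0, 100.0, 290.0, 500.0, 240.0, 150.0])
stats = (nse(obs, sim), relative_error_pct(obs, sim), r_squared(obs, sim))
```

Note that NSE penalizes timing and magnitude errors month by month, whereas RE only compares totals, which is why both are reported alongside R².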

Keywords: calibration, GIS, sediment yield, SWAT, validation

Procedia PDF Downloads 253
3642 Metallic-Diamond Tools with Increased Abrasive Wear Resistance for Grinding Industrial Floor Systems

Authors: Elżbieta Cygan-Bączek, Piotr Wyżga

Abstract:

This paper presents the results of research on the physical, mechanical, and tribological properties of materials constituting the matrix in sintered metallic-diamond tools. Ground powders based on the Fe-Mn-Cu-Sn-C system were modified with micro-sized ceramic phase particles (SiC, Al₂O₃) and consolidated using the SPS (spark plasma sintering) method to a relative density of over 98%, at 850-950°C, a pressure of 35 MPa, and a holding time of 10 min. After sintering, the microstructure was analyzed using scanning electron microscopy. The resulting materials were tested for apparent density (determined by Archimedes' method), Rockwell hardness (scale B), Young's modulus, and technological properties. The performance of the obtained diamond composites was compared with the base material (Fe-Mn-Cu-Sn-C) and the commercial Co-20% WC alloy. The hardness of the composites reached its maximum at a sintering temperature of 900°C; it can therefore be considered that the optimal physical and mechanical properties of the studied composites were obtained at this temperature. Tribological tests showed that the composites modified with micro-sized ceramic phase particles exhibit more than twice the wear resistance of the base material and the commercial Co-20% WC alloy, whereas composites containing Al₂O₃ phase particles in the matrix were characterized by the lowest abrasion wear resistance. The manufacturing technology presented in the paper is economically justified and can be successfully used in the production of matrices for sintered diamond-impregnated tools for machining industrial floor systems. Acknowledgment: The study was performed under LIDER IX Research Project No. LIDER/22/0085/L-9/17/NCBR/2018, entitled "Innovative metal-diamond tools without the addition of critical raw materials for applications in the process of grinding industrial floor systems", funded by the National Centre for Research and Development of Poland, Warsaw.

Keywords: abrasive wear resistance, metal matrix composites, sintered diamond tools, Spark Plasma Sintering

Procedia PDF Downloads 48
3641 Faculty Use of Geospatial Tools for Deep Learning in Science and Engineering Courses

Authors: Laura Rodriguez Amaya

Abstract:

Advances in science, technology, engineering, and mathematics (STEM) are viewed as important to countries' national economies and their capacity to compete in the global economy. However, many countries see low numbers of students entering these disciplines. To strengthen professional STEM pipelines, it is important that students be retained in these disciplines at university. Scholars agree that to retain students in university STEM degrees, course content must show the relevance of these academic fields to students' daily lives: by increasing students' understanding of the importance of these degrees and careers, their motivation to remain in these academic programs can also increase. An effective way to make STEM content relevant to students' lives is the use of geospatial technologies and geovisualization in the classroom. The Geospatial Revolution, and the science and technology associated with it, has provided scientists and engineers with an enormous amount of data about Earth and Earth systems; this data can be used in the classroom to support instruction and make content relevant to all students. The purpose of this study was to determine the prevalence of geospatial technologies and geovisualization as teaching practices at a university in the USA. The Teaching Practices Inventory survey, a modified version of the Carl Wieman Science Education Initiative Teaching Practices Inventory, was selected for the study. The population consisted of STEM faculty who participated in a summer learning institute at a 4-year university in the USA; one of the institute's main purposes was to improve the teaching of STEM courses, particularly the gateway courses taken by many STEM majors. The sample comprised 97.5 percent of the total number of summer learning institute participants.
Basic descriptive statistics, computed using the Statistical Package for the Social Sciences (SPSS), were used to answer three questions: 1) What percentage of faculty use geospatial technologies and geovisualization? 2) Does a faculty member's department affect their use of geospatial tools? 3) Does the number of years spent teaching affect their use of geospatial tools? Findings indicate that only 10 percent of respondents had used geospatial technologies, while 18 percent had used geovisualization. In addition, the use of geovisualization was spread across more disciplines than the use of geospatial technologies, which was concentrated in the engineering departments. The data seem to indicate a lack of incorporation of geospatial tools in STEM education. The use of geospatial tools is an effective way to engage students in deep STEM learning; future research should examine the effect on student learning and retention in science and engineering programs when geospatial tools are used.
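The kind of descriptive breakdown described above (overall usage percentages plus a per-department split) can be sketched as follows; the survey responses below are invented for illustration and are not the study's SPSS output:

```python
import pandas as pd

# Hypothetical survey responses: department plus two yes/no (1/0) items.
responses = pd.DataFrame({
    "department": ["Engineering", "Biology", "Engineering", "Math", "Physics",
                   "Engineering", "Biology", "Math", "Physics", "Engineering"],
    "used_geospatial_tech": [1, 0, 1, 0, 0, 0, 0, 0, 0, 1],
    "used_geovisualization": [1, 1, 0, 0, 1, 0, 1, 0, 0, 1],
})

# Overall usage percentages (question 1).
pct_tech = 100 * responses["used_geospatial_tech"].mean()
pct_viz = 100 * responses["used_geovisualization"].mean()

# Usage rate by department (question 2).
by_dept = responses.groupby("department")["used_geospatial_tech"].mean()
```

The same `groupby` pattern extends to binning years of teaching experience for question 3.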

Keywords: engineering education, geospatial technology, geovisualization, STEM

Procedia PDF Downloads 226
3640 Teaching Tools for Web Processing Services

Authors: Rashid Javed, Hardy Lehmkuehler, Franz Josef-Behr

Abstract:

Web Processing Services (WPS) are of growing interest in geoinformation research. However, teaching about them is difficult because of the generally complex circumstances of their use, which limit the possibilities for hands-on exercises on Web Processing Services. To support understanding, a Training Tools Collection was developed at the University of Applied Sciences Stuttgart (HFT). It is limited to the scope of geostatistical interpolation of sample point data, where different algorithms can be used, such as IDW or nearest neighbour. The Tools Collection aims to support understanding of the scope, definition, and deployment of Web Processing Services. For example, it is necessary to characterize the input to interpolation by the data set, the parameters of the algorithm, and the interpolation results (here, a grid of interpolated values is assumed). This paper reports on first experiences with a pilot installation, which was intended to identify suitable software interfaces for later full implementations and to draw conclusions about potential user interface characteristics. Experiences were made with the Deegree software, one of several service suites. Strictly programmed in Java, Deegree offers several OGC-compliant service implementations that also promise to be of benefit to the project. The mentioned parameters for a WPS were formalized following the paradigm that any meaningful component is defined in terms of suitable standards; e.g., the data output can be defined as a GML file. However, the choice of meaningful information pieces and user interactions is not free, but is partially determined by the selected WPS processing suite.
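For reference, the IDW (inverse distance weighting) interpolation named above as one of the supported algorithms can be sketched as follows (a minimal standalone illustration with made-up sample points, not the Deegree WPS implementation):

```python
import numpy as np

def idw_interpolate(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: each query point receives a weighted
    average of the sample values, with weights proportional to 1/d**power."""
    # Pairwise distances: shape (n_query, n_known).
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)  # eps avoids division by zero at sample points
    return (w * values[None, :]).sum(axis=1) / w.sum(axis=1)

# Sample points (x, y) with measured values, and a small query set.
pts = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])
vals = np.array([0.0, 10.0, 20.0])
grid = np.array([[1.0, 0.0], [0.0, 0.0]])
z = idw_interpolate(pts, vals, grid)
```

In the WPS setting, `pts`/`vals` would come from the input point data set, `power` would be an algorithm parameter of the process, and `z` evaluated over a regular grid would form the interpolated output grid.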

Keywords: deegree, interpolation, IDW, web processing service (WPS)

Procedia PDF Downloads 332