Search results for: reference management software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15220

14650 An Argument for Agile, Lean, and Hybrid Project Management in Museum Conservation Practice: A Qualitative Evaluation of the Morris Collection Conservation Project at the Sainsbury Centre for Visual Arts

Authors: Maria Ledinskaya

Abstract:

This paper is part case study and part literature review. It seeks to introduce Agile, Lean, and Hybrid project management concepts from business, software development, and manufacturing fields to museum conservation by looking at their practical application on a recent conservation project at the Sainsbury Centre for Visual Arts. The author outlines the advantages of leaner and more agile conservation practices in today’s faster, less certain, and more budget-conscious museum climate where traditional project structures are no longer as relevant or effective. The Morris Collection Conservation Project was carried out in 2019-2021 in Norwich, UK, and concerned the remedial conservation of around 150 Abstract Constructivist artworks bequeathed to the Sainsbury Centre by private collectors Michael and Joyce Morris. It was a medium-sized conservation project of moderate complexity, planned and delivered in an environment with multiple known unknowns – unresearched collection, unknown conditions and materials, unconfirmed budget. The project was later impacted by the COVID-19 pandemic, introducing indeterminate lockdowns, budget cuts, staff changes, and the need to accommodate social distancing and remote communications. The author, then a staff conservator at the Sainsbury Centre who acted as project manager on the Morris Project, presents an incremental, iterative, and value-based approach to managing a conservation project in an uncertain environment. The paper examines the project from the point of view of Traditional, Agile, Lean, and Hybrid project management. The author argues that most academic writing on project management in conservation has focussed on a Traditional plan-driven approach – also known as Waterfall project management – which has significant drawbacks in today’s museum environment due to its over-reliance on prediction-based planning and its low tolerance to change. In the last 20 years, alternative Agile, Lean and Hybrid approaches to project management have been widely adopted in software development, manufacturing, and other industries, although their recognition in the museum sector has been slow. Using examples from the Morris Project, the author introduces key principles and tools of Agile, Lean, and Hybrid project management and presents a series of arguments on the effectiveness of these alternative methodologies in museum conservation, including the ethical and practical challenges to their implementation. These project management approaches are discussed in the context of consequentialist, relativist, and utilitarian developments in contemporary conservation ethics. Although not intentionally planned as such, the Morris Project had a number of Agile and Lean features which were instrumental to its successful delivery. These key features are identified as distributed decision-making, a co-located cross-disciplinary team, servant leadership, focus on value-added work, flexible planning done in shorter sprint cycles, light documentation, and emphasis on reducing procedural, financial, and logistical waste. Overall, the author’s findings point in favour of a hybrid model, which combines traditional and alternative project processes and tools to suit the specific needs of the project.

Keywords: agile project management, conservation, hybrid project management, lean project management, waterfall project management

Procedia PDF Downloads 57
14649 Finding out the Best Criteria for Locating the Best Place Resettling of Victims after the Earthquake: A Case Study for Tehran, Iran

Authors: Reyhaneh Saeedi

Abstract:

Iran is an earthquake-prone zone, and earthquakes there are followed by loss of life and financial damage. Providing shelter for earthquake victims is a basic requirement, yet it is difficult to select suitable places for temporary resettlement after an earthquake occurs. The best places for resettling victims must therefore be designated before such disasters happen; this is an important issue in disaster management and planning. Geospatial Information Systems (GIS) play a determining role in disaster management, as they can identify the best places for temporary resettlement after such a disaster. In this paper, the best criteria, together with their weights and buffers, have been determined through research and a questionnaire for locating the best places. The Analytic Hierarchy Process (AHP) is used as the decision model, and the best places for temporary resettlement are located based on the selected criteria. Buffer layers are generated for each criterion and converted into raster layers; the raster layers are then multiplied by the desired weights, and the results are added together. Finally, suitable resettlement sites, colour-coded by their suitability rating under the desired criteria, are displayed in ArcGIS software.
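
The weighted-overlay step described above can be illustrated with a short script. This is a minimal sketch, not the authors' actual ArcGIS workflow: the criterion rasters, AHP weights, and reclassification scheme are all assumed for illustration.

```python
import numpy as np

# Hypothetical reclassified criterion rasters (0 = unsuitable ... 9 = ideal),
# e.g. distance to roads, distance to faults, slope. Shapes must match.
dist_to_roads = np.random.randint(0, 10, size=(100, 100))
dist_to_faults = np.random.randint(0, 10, size=(100, 100))
slope = np.random.randint(0, 10, size=(100, 100))

# Assumed AHP-derived weights; they must sum to 1.
weights = {"roads": 0.5, "faults": 0.3, "slope": 0.2}

# Weighted overlay: multiply each raster by its weight and sum the results.
suitability = (weights["roads"] * dist_to_roads
               + weights["faults"] * dist_to_faults
               + weights["slope"] * slope)

# Cells with the highest scores are candidate resettlement sites.
best = np.unravel_index(np.argmax(suitability), suitability.shape)
print("Most suitable cell:", best, "score:", suitability[best])
```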

Keywords: disaster management, temporary resettlement, earthquake, criteria

Procedia PDF Downloads 277
14648 Empirical Investigation of Barriers to Industrial Energy Conservation Measures in the Manufacturing Small and Medium Enterprises (SME's) of Pakistan

Authors: Muhammad Tahir Hassan, Stas Burek, Muhammad Asif, Mohamed Emad

Abstract:

The industrial sector in Pakistan accounts for 25% of total energy consumption in the country. The performance of this sector has been severely affected by the ongoing energy crisis in the country. Realizing the energy conservation potential of Pakistan's industrial sectors through energy management can recover wasted energy, which would ultimately lead to economic and environmental benefits. However, the lack of financial incentives for energy efficiency and the absence of energy benchmarking within the same industrial sectors are among the main challenges to implementing energy management. In Pakistan, this area has not been adequately explored, and there is a lack of focus on the need for industrial energy efficiency and proper management. The main objective of this research is to evaluate the current energy management performance of the Pakistani industrial sector and to investigate empirically the barriers to industrial energy efficiency. Data were collected from respondents at 192 small and medium-sized enterprises (SMEs) in Pakistan, i.e., foundries, textile, plastic, light engineering, auto and spare parts, and ceramic manufacturers, and analysed using Statistical Package for the Social Sciences (SPSS) software. The current energy management performance of manufacturing SMEs in Pakistan was evaluated by employing two significant indicators, the 'Energy Management Matrix' and 'pay-off criteria', with a modified approach. Using the energy management matrix, energy management profiles of the overall industry and of the individual sectors were drawn to assess energy management performance and to identify the weak and strong areas. Results reveal that energy management practices in the surveyed industries are at a very low level. The energy management profiles drawn for each sector suggest that the textile sector performs best among the surveyed manufacturing SMEs. The empirical barriers to industrial energy efficiency have also been ranked according to the overall responses. The results further reveal that a significant relationship exists among industrial size, sector type, and the nature of barriers to industrial energy efficiency for manufacturing SMEs in Pakistan. The findings of this study may help industries and policy makers in Pakistan to formulate a sustainable energy policy that supports industrial energy efficiency, keeping in view the actual energy efficiency scenario in the industrial sector.
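
As an illustration of how a relationship between categorical survey variables like these can be tested, the sketch below runs a chi-square test of independence. The contingency table is invented, and the choice of test is an assumption; the abstract does not state which statistic was computed in SPSS.

```python
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = sector (textile, foundry, plastics),
# columns = most-cited barrier (financial, informational, organisational).
observed = [
    [34, 12, 9],
    [10, 22, 14],
    [11, 15, 25],
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
if p < 0.05:
    print("Barrier type is associated with sector at the 5% level.")
```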

Keywords: barriers, energy conservation, energy management profile, environment, manufacturing SME's of Pakistan

Procedia PDF Downloads 275
14647 Motor Controller Implementation Using Model Based Design

Authors: Cau Tran, Tu Nguyen, Tien Pham

Abstract:

Model-based design (MBD) is a mathematical and visual technique for addressing design issues in the fields of communications, signal processing, and complex control systems. It is utilized in several automotive, aerospace, industrial, and motion control applications. With model-based design, virtual models are at the center of the software development process, and the method is widely used in the creation of embedded software. In this study, a limited angle torque (LAT) motor is modeled in a simulation environment, and the LAT motor control is designed with a cascade structure comprising a speed and a current control loop; the controller used in these loops has a PID structure. Based on classical design principles and motor parameters that match the design goals, the PID controller is created for the model. The MBD approach is then used to build the embedded software for motor control. The paper is divided into three distinct sections. The first section introduces the design process and the benefits and drawbacks of the MBD technique. The design of the control software for LAT motors is the main topic of the second section. The experimental results are the subject of the last section.
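
A discrete PID controller of the kind described can be sketched in a few lines. This is a generic textbook form for illustration only; the gains, sample time, and plant are assumptions, not the authors' tuned values.

```python
class PID:
    """Discrete PID controller (parallel form, backward-difference derivative)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy first-order plant standing in for the speed loop of a LAT motor model.
pid = PID(kp=2.0, ki=1.5, kd=0.05, dt=0.001)
speed = 0.0
for _ in range(5000):
    voltage = pid.update(setpoint=100.0, measurement=speed)
    speed += (voltage - 0.1 * speed) * 0.001  # assumed plant dynamics
print(f"steady-state speed ≈ {speed:.1f}")
```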

Keywords: model based design, limited angle torque, intellectual property core, hardware description language, controller area network, user datagram protocol

Procedia PDF Downloads 85
14646 Investigating the Effective Factors on Product Performance and Prioritizing Them: Case Study of Pars-Khazar Company

Authors: Ebrahim Sabermaash Eshghi, Donna Sandsmark

Abstract:

Nowadays, successful companies try to create a reliable and unique competitive position in the market. It is important to consider that merely choosing and codifying a competitive strategy appropriate to market conditions does not by itself influence the final performance of the company; rather, it is the connection and interaction between upstream-level strategies and functional-level strategies that develops company performance in its operating environment. Given the importance of the subject, this study investigates the factors affecting product performance and prioritizes them. The study used a mixed quantitative-qualitative approach (interviews and a questionnaire). In total, 103 informed managers and experts of the Pars-Khazar Company were surveyed in a census. The validity of the measurement tools was confirmed through expert judgment, and their reliability was established with a Cronbach's alpha coefficient of 0.930, so the validity and reliability of the tools were approved overall. The collected data were analysed with the Spearman correlation test and the Friedman test using SPSS software. The results showed that management of the distribution and demand process (0.675), management of product pre-testing (0.636), and manufacturing and inventory management (0.628) had the highest correlations with product performance. Prioritization of the factors in the structure of launching new products, based on mean ranks, showed that management of the volume of launched products and manufacturing and inventory management were the most important.
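
The two tests named above are available in SciPy; the sketch below shows how they might be applied to survey scores of this kind. The data are invented for illustration only.

```python
import numpy as np
from scipy.stats import spearmanr, friedmanchisquare

rng = np.random.default_rng(0)

# Hypothetical 1-5 Likert scores from 103 respondents.
product_performance = rng.integers(1, 6, size=103)
distribution_mgmt   = rng.integers(1, 6, size=103)
pretest_mgmt        = rng.integers(1, 6, size=103)
inventory_mgmt      = rng.integers(1, 6, size=103)

# Spearman rank correlation between each factor and product performance.
for name, factor in [("distribution", distribution_mgmt),
                     ("pre-test", pretest_mgmt),
                     ("inventory", inventory_mgmt)]:
    rho, p = spearmanr(factor, product_performance)
    print(f"{name}: rho = {rho:.3f}, p = {p:.3f}")

# Friedman test for prioritizing the factors across respondents.
stat, p = friedmanchisquare(distribution_mgmt, pretest_mgmt, inventory_mgmt)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")
```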

Keywords: product performance, home appliances, market, case study

Procedia PDF Downloads 209
14645 A One-Dimensional Modeling Analysis of the Influence of Swirl and Tumble Coefficient in a Single-Cylinder Research Engine

Authors: Mateus Silva Mendonça, Wender Pereira de Oliveira, Gabriel Heleno de Paula Araújo, Hiago Tenório Teixeira Santana Rocha, Augusto César Teixeira Malaquias, José Guilherme Coelho Baeta

Abstract:

Stricter legislation and greater public demand regarding gas emissions and their effects on the environment, as well as on human health, are making the automotive industry reinforce research focused on reducing levels of contamination. This reduction can be achieved through improvements to internal combustion engines that lower both specific fuel consumption and air pollutant emissions. These improvements can be obtained through numerical simulation, a technique that works together with experimental tests. The aim of this paper is to build, with the support of the GT-Suite software, a one-dimensional model of a single-cylinder research engine to analyze the impact of varying the swirl and tumble coefficients on the performance and air pollutant emissions of an engine. Initially, the discharge coefficient is calculated with the software Converge CFD 3D, given that it is an input parameter in GT-Power. Mesh sensitivity tests are performed on a 3D geometry built for this purpose, using the mass flow rate in the valve as a reference. The one-dimensional simulation adopts the non-predictive combustion model called Three Pressure Analysis (TPA), and data such as the mass trapped in the cylinder, the heat release rate, and the accumulated released energy are then calculated, so that validation can be performed by comparing these data with those obtained experimentally. Finally, the swirl and tumble coefficients are introduced in their corresponding objects so that their influence can be observed in comparison with the previous results.
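
By way of illustration, the discharge coefficient is commonly defined as the ratio of the measured mass flow rate to the ideal isentropic one. A minimal sketch under that assumption is shown below; the geometry and flow values are invented, and the exact convention used in Converge CFD may differ.

```python
import math

def ideal_mass_flow(area, p0, t0, p, gamma=1.4, r=287.0):
    """Ideal isentropic mass flow [kg/s] through a reference area (subsonic branch)."""
    pr = max(p / p0, (2 / (gamma + 1)) ** (gamma / (gamma - 1)))  # choke limit
    return (area * p0 / math.sqrt(r * t0)
            * math.sqrt(2 * gamma / (gamma - 1)
                        * (pr ** (2 / gamma) - pr ** ((gamma + 1) / gamma))))

# Assumed valve geometry and CFD-measured flow, for illustration only.
area = 3.0e-4             # reference (valve curtain) area, m^2
p0, t0 = 101325.0, 300.0  # upstream stagnation conditions, Pa and K
p = 91000.0               # downstream static pressure, Pa
m_dot_cfd = 0.032         # mass flow from the 3D simulation, kg/s

cd = m_dot_cfd / ideal_mass_flow(area, p0, t0, p)
print(f"discharge coefficient Cd = {cd:.3f}")
```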

Keywords: 1D simulation, single-cylinder research engine, swirl coefficient, three pressure analysis, tumble coefficient

Procedia PDF Downloads 85
14644 Developing an Information Model of Manufacturing Process for Sustainability

Authors: Jae Hyun Lee

Abstract:

Manufacturing companies use life-cycle inventory (LCI) databases to analyze the sustainability of their manufacturing processes. LCI data provide reference values that may not be accurate for a specific company, while collecting accurate data on the manufacturing processes of a specific company requires enormous time and effort. An information model of typical manufacturing processes can reduce the time and effort needed to obtain appropriate reference data for a specific company. This paper presents an attempt to build an abstract information model that can be used to develop information models for specific manufacturing processes.
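
The abstract does not detail the model's structure, but its OWL keyword suggests an ontology-style encoding. A purely hypothetical sketch using rdflib is shown below; all class and property names are invented.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS
from rdflib.namespace import OWL, XSD

MFG = Namespace("http://example.org/manufacturing#")
g = Graph()
g.bind("mfg", MFG)

# An abstract process class and one concrete specialization (names assumed).
g.add((MFG.ManufacturingProcess, RDF.type, OWL.Class))
g.add((MFG.MillingProcess, RDF.type, OWL.Class))
g.add((MFG.MillingProcess, RDFS.subClassOf, MFG.ManufacturingProcess))

# A datatype property for a sustainability indicator.
g.add((MFG.energyConsumption_kWh, RDF.type, OWL.DatatypeProperty))
g.add((MFG.energyConsumption_kWh, RDFS.domain, MFG.ManufacturingProcess))
g.add((MFG.energyConsumption_kWh, RDFS.range, XSD.decimal))

# A company-specific instance overriding generic reference data.
g.add((MFG.milling_line_3, RDF.type, MFG.MillingProcess))
g.add((MFG.milling_line_3, MFG.energyConsumption_kWh, Literal(4.2)))

print(g.serialize(format="turtle"))
```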

Keywords: process information model, sustainability, OWL, manufacturing

Procedia PDF Downloads 415
14643 An Informative Marketing Platform: Methodology and Architecture

Authors: Martina Marinelli, Samanta Vellante, Francesco Pilotti, Daniele Di Valerio, Gaetanino Paolone

Abstract:

Any development in web marketing technology requires changes in information engineering to identify instruments and techniques suitable for producing software applications for informative marketing. Moreover, for large web solutions, designing an interface that enables human interaction is a complex process that must bridge informative marketing requirements and the developed solution. A user-friendly interface in web marketing applications is crucial for a successful business. This paper introduces mkInfo, a software platform that implements informative marketing. Informative marketing is a new interpretation of marketing that places information at the center of every marketing action. The creative team includes software engineering researchers who have recently authored an article on automatic code generation. The authors created the mkInfo software platform to generate informative marketing web applications. For each web application, it is possible to automatically implement an opt-in page, a landing page, a sales page, and a thank-you page: one only needs to insert the content. mkInfo implements an autoresponder that sends mail according to a predetermined schedule. The mkInfo platform also includes e-commerce for a product or service. A stakeholder can access any opt-in page and get basic information about a product or service. To learn more, they provide an e-mail address and gain access to a landing page, which triggers an e-mail sequence that provides complete information about the product or service. From this point on, the stakeholder becomes a user and is able to purchase the product or related services through the mkInfo platform. This paper suggests a possible definition of informative marketing, illustrates its basic principles, and finally details the mkInfo platform that implements it. It also offers some informative marketing models, which are implemented in the mkInfo platform. Informative marketing can be applied to products or services; a web application is built for each product or service. The mkInfo platform enables the producer to send information concerning a specific product or service to all stakeholders. In conclusion, the technical contributions of this paper are: a different interpretation of marketing based on information; a modular architecture for web applications, particularly those with standard features such as information storage, exchange, and delivery; multiple models to implement informative marketing; and a software platform enabling the implementation of such models in a web application. Future research aims to enable stakeholders to provide information about a product or service, so that the gathered information includes both the producer's and the stakeholders' points of view. The purpose is to create an all-inclusive management system for the knowledge regarding a specific product or service: a system that includes everything about the product or service and is able to address even unexpected questions.
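
As a toy illustration of the autoresponder behaviour described (not the platform's actual implementation, which is not published in this abstract), the sketch below computes a predetermined mail schedule for a new subscriber.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class SequenceStep:
    subject: str
    days_after_opt_in: int

# Hypothetical informative-marketing sequence for one product.
SEQUENCE = [
    SequenceStep("Welcome: what this product is", 0),
    SequenceStep("How it works in detail", 2),
    SequenceStep("Customer stories", 5),
    SequenceStep("Offer and next steps", 8),
]

def schedule_for(email: str, opt_in_time: datetime):
    """Return (send_time, subject, recipient) tuples for the whole sequence."""
    return [(opt_in_time + timedelta(days=s.days_after_opt_in), s.subject, email)
            for s in SEQUENCE]

for when, subject, to in schedule_for("stakeholder@example.com", datetime(2024, 1, 1)):
    print(when.date(), "->", to, ":", subject)
```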

Keywords: informative marketing, opt in page, software platform, web application

Procedia PDF Downloads 112
14642 Comparative Evaluation of Equity Indicators in the Matikiw Community-Based Forest Management Project in Pakil, Laguna and the Minayutan and Bacong Sigsigan Community-Based Forest Management Project in Famy, Laguna

Authors: Katherine Arquio

Abstract:

Community-based Forest Management (CBFM) is one of the integrative programs that slowly turned the course of forest management from traditional corporate practice to community-based practice, resulting in community empowerment. As such, one of its goals is to promote socio-economic welfare, including social equity, among the people in the community. This study looks at the equity aspect of the program, particularly whether there are equity differences between two CBFM sites: Matikiw in Pakil, Laguna, and Minayutan and Bacong Sigsigan in Famy, Laguna. Equity indicators were identified first, since these formed the basis of the survey questions; the survey proper was then conducted, followed by the analysis. A two-tailed t-test was used as the statistical tool, since the difference between the two sites is the focus of the study. Statistical analysis was done using the STATA statistical software. There were 32 indicators identified, and the results showed that, of these, only 13 were found to differ significantly between the two sites. The 13 indicators were significantly observed only in Matikiw; the other 19 indicators were commonly observed in both areas and are suitable as equity indicators for the CBFM program.
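
A two-sample, two-tailed t-test of one indicator between the two sites could be run as below. The scores are invented, and Welch's variant is used here as a common default; the abstract does not state which variant was applied in STATA.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)

# Hypothetical household scores for one equity indicator at each site.
matikiw   = rng.normal(loc=3.8, scale=0.6, size=40)
minayutan = rng.normal(loc=3.1, scale=0.7, size=40)

t, p = ttest_ind(matikiw, minayutan, equal_var=False)  # two-tailed by default
print(f"t = {t:.2f}, p = {p:.4f}")
print("significant difference" if p < 0.05 else "no significant difference")
```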

Keywords: social equity, CBFM, social forestry, equity indicators

Procedia PDF Downloads 356
14641 Design and Optimization of Soil Nailing Construction

Authors: Fereshteh Akbari, Farrokh Jalali Mosalam, Ali Hedayatifar, Amirreza Aminjavaheri

Abstract:

Soil nailing is an effective method for stabilizing slopes and retaining structures, and the lateral and vertical displacements of retaining walls are important criteria for evaluating the safety risks to adjacent structures. This paper is devoted to the optimization of retaining walls using the ABAQUS software. The effects of various parameters, such as nail length, orientation, arrangement, horizontal spacing, and bond skin friction, on the lateral and vertical displacement of retaining walls are investigated. To ensure accuracy, the mobilized shear stress acting around the perimeter of the nail-soil interface is also modeled in ABAQUS. The observed trend of the results is compared to previous research.

Keywords: retaining walls, soil nailing, ABAQUS software, lateral displacement, vertical displacement

Procedia PDF Downloads 109
14640 Conduction Transfer Functions for the Calculation of Heat Demands in Heavyweight Facade Systems

Authors: Mergim Gasi, Bojan Milovanovic, Sanjin Gumbarevic

Abstract:

Better energy performance of the building envelope is one of the most important aspects of energy savings if the goals set by the European Union are to be achieved in the future. Dynamic heat transfer simulations are used to calculate building energy consumption because they give more realistic energy demands than stationary calculations, which do not take the building's thermal mass into account. Software used for these dynamic simulations relies on analytical models, since purely numerical models are impractical for longer periods. The analytical models used in this research fall into the category of conduction transfer functions (CTFs). The two methods for calculating CTFs covered by this research are the Laplace method and the State-Space method. The literature review showed that the main disadvantage of these methods is that they are inadequate for heavyweight façade elements and for the shorter sampling times used in the calculation. The algorithms for both the Laplace and State-Space methods are implemented in Mathematica, and the results are compared to results from EnergyPlus and TRNSYS, since these programs use similar algorithms to calculate a building's energy demand. This research aims to check the efficiency of the Laplace and State-Space methods for calculating the energy demand of heavyweight building elements with shorter sampling times, and it also provides the means for improving the algorithms used by these methods. The finite difference method (FDM) is used as the reference for the boundary heat flux density. Even though dynamic heat transfer simulations are superior to calculations based on stationary boundary conditions, they have their limitations and will give unsatisfactory results if not properly used.
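
As a reference solution of the kind mentioned, the boundary heat flux of a wall layer can be computed with an explicit finite difference scheme like the minimal sketch below. The material data and discretization are assumed; the authors' actual FDM setup is not given in the abstract.

```python
import numpy as np

# Assumed heavyweight concrete layer, 20 cm thick.
k, rho, cp = 1.8, 2400.0, 900.0      # W/mK, kg/m3, J/kgK
alpha = k / (rho * cp)               # thermal diffusivity
L, n = 0.20, 41
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha             # satisfies the explicit stability limit

T = np.full(n, 20.0)                 # initial temperature, degC
T_in, T_out = 20.0, 0.0              # fixed surface temperatures (Dirichlet)

for _ in range(int(24 * 3600 / dt)): # simulate one day
    T[0], T[-1] = T_in, T_out
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

# Boundary heat flux density at the inner surface (one-sided difference).
q_in = -k * (T[1] - T[0]) / dx
print(f"inner-surface heat flux ≈ {q_in:.1f} W/m2")
```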

Keywords: Laplace method, state-space method, conduction transfer functions, finite difference method

Procedia PDF Downloads 112
14639 Integrating Building Information Modeling into Facilities Management Operations

Authors: Mojtaba Valinejadshoubi, Azin Shakibabarough, Ashutosh Bagchi

Abstract:

Facilities such as residential buildings, office buildings, and hospitals house a high density of occupants. Therefore, a low-cost facility management program (FMP) should be used to provide a satisfactory built environment for these occupants. Facility management (FM) has recently been treated as a critical task in building projects and has been effective in reducing the operation and maintenance costs of these facilities. Information integration and visualization capabilities are critical for reducing the complexity and cost of FM. Building information modeling (BIM) can be used as a strong visual modeling tool and database in FM. The main objective of this study is to examine the applicability of BIM in the FM process during a building's operational phase. For this purpose, a seven-storey office building was modeled in Autodesk Revit software. The authors integrated the model with a cloud-based environment using the visual programming tool Dynamo, to provide real-time cloud-based communication between the facility managers and the participants involved in the project. An appropriate and effective integrated data source and visual model such as BIM can reduce a building's operational and maintenance costs by managing the building life cycle properly.

Keywords: building information modeling, facility management, operational phase, building life cycle

Procedia PDF Downloads 137
14638 Predicting Daily Patient Hospital Visits Using Machine Learning

Authors: Shreya Goyal

Abstract:

The study aims to build user-friendly software that helps understand patient arrival patterns and computes the number of potential patients who will visit a particular health facility in a given period, using a machine learning algorithm. The underlying algorithm used in this study is the Support Vector Machine (SVM). Accurate prediction of patient arrivals allows hospitals to operate more effectively, providing timely and efficient care while optimizing resources and improving the patient experience. It allows for better allocation of staff, equipment, and other resources: if a surge in patients is projected, additional staff or resources can be allocated to handle the influx, preventing bottlenecks or delays in care. Understanding patient arrival patterns can also help streamline processes to minimize waiting times and ensure timely access to care for patients in need. Another big advantage of this software is adherence to strict data protection regulations, such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, because the hospital does not have to share the data with any third party or upload it to the cloud; the software reads data locally from the machine. The data need to be arranged in a particular format, and the software will then read them and provide meaningful output. Keeping patient data within the hospital's local systems reduces the risk of unauthorized access or breaches associated with transmitting data over networks or storing it on external servers, which helps maintain the confidentiality and integrity of sensitive patient information. Historical patient data are used in this study. The input variables used to train the model include patient age, time of day, day of the week, seasonal variations, and local events. The algorithm uses a supervised learning method to optimize the objective function: it stores the values of the local minima after each iteration and, at the end, compares all the local minima to find the global minimum. The strength of this study is the transfer function used to calculate the number of patients. The model has an output accuracy of over 95%. The method proposed in this study could be used for better planning of personnel and medical resources.
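
A minimal sketch of an arrival-count model of this kind, using support vector regression (the regression variant of SVM) from scikit-learn, is shown below. The features, target, and kernel settings are invented for illustration; the abstract does not specify the exact encoding used.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)

# Hypothetical features: hour of day, day of week, month, local-event flag.
X = np.column_stack([
    rng.integers(0, 24, 1000),
    rng.integers(0, 7, 1000),
    rng.integers(1, 13, 1000),
    rng.integers(0, 2, 1000),
])
# Invented target: visit counts with weekly and seasonal structure plus noise.
y = (50 + 10 * np.sin(2 * np.pi * X[:, 1] / 7)
     + 5 * (X[:, 2] > 10) + rng.normal(0, 3, 1000))

# Scale features, then fit an RBF-kernel support vector regressor.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:800], y[:800])

# Evaluate on held-out records.
pred = model.predict(X[800:])
mape = np.mean(np.abs(pred - y[800:]) / y[800:]) * 100
print(f"held-out MAPE ≈ {mape:.1f}% (accuracy ≈ {100 - mape:.1f}%)")
```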

Keywords: machine learning, SVM, HIPAA, data

Procedia PDF Downloads 51
14637 Facial Recognition of University Entrance Exam Candidates using FaceMatch Software in Iran

Authors: Mahshid Arabi

Abstract:

In recent years, remarkable advancements in the fields of artificial intelligence and machine learning have led to the development of facial recognition technologies. These technologies are now employed in a wide range of applications, including security, surveillance, healthcare, and education. In education, identifying university entrance exam candidates has been one of the fundamental challenges. Traditional methods, such as ID cards and handwritten signatures, are not only inefficient and prone to fraud but also susceptible to errors. In this context, advanced technologies like facial recognition can be an effective and efficient way to increase the accuracy and reliability of identity verification in entrance exams. This article examines the use of FaceMatch software for recognizing the faces of university entrance exam candidates in Iran. The main objective of this research is to evaluate the efficiency and accuracy of FaceMatch software in identifying candidates, in order to prevent fraud and ensure the authenticity of individuals' identities; the research also investigates the advantages and challenges of using this technology in Iran's educational systems. The research was conducted using an experimental method and random sampling: 1000 university entrance exam candidates in Iran were selected as the sample, and their facial images were processed and analyzed using FaceMatch. The software's accuracy and efficiency were evaluated using several metrics, including accuracy rate, error rate, and processing time. The results indicated that FaceMatch could identify candidates with a precision of 98.5%, an error rate below 1.5%, and an average processing time of less than 2 seconds per candidate image, demonstrating high efficiency in facial recognition. Statistical evaluation of the results using precise statistical tests, including analysis of variance (ANOVA) and the t-test, showed that the observed differences were significant and that the software's identity verification accuracy is high. The findings suggest that FaceMatch can be used effectively to identify university entrance exam candidates in Iran. The technology not only enhances security and prevents fraud but also simplifies and streamlines exam administration; however, challenges such as preserving candidates' privacy and the costs of implementation must also be considered. Given the promising results of this research, it is recommended that this technology be more widely implemented and utilized in the country's educational systems.
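
The reported metrics can be reproduced from raw match results with a few lines; the records below are invented for illustration.

```python
# Hypothetical per-candidate results: (correctly_identified, seconds_to_process).
results = [(True, 1.4), (True, 1.1), (False, 1.9), (True, 0.9), (True, 1.6)]

n = len(results)
correct = sum(1 for ok, _ in results if ok)
accuracy = correct / n * 100
error_rate = 100 - accuracy
mean_time = sum(t for _, t in results) / n

print(f"accuracy = {accuracy:.1f}%, error rate = {error_rate:.1f}%, "
      f"mean processing time = {mean_time:.2f} s")
```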

Keywords: facial recognition, FaceMatch software, Iran, university entrance exam

Procedia PDF Downloads 23
14636 Spatially Referenced Checklist Model Dedicated to Professional Actors for a Good Evaluation and Management of Networks

Authors: Abdessalam Hijab, Hafida Boulekbache, Eric Henry

Abstract:

The objective of this article is to explain the use of geographic information systems (GIS) and information and communication technologies (ICTs) in the real-time processing and analysis of data on the status of an urban sanitation network, integrating professional sanitation actors for sustainable management in urban areas. It is, in effect, a smart geo-collaboration based on the complementarity of ICTs and GIS. This multi-actor reflection was developed with the objective of contributing to solutions complementary to existing technologies for better protecting the urban environment, with the help of a spatially referenced checklist, "E-Géo-LD", dedicated to professional-to-professional actors in sanitation, for intelligent monitoring of liquid sanitation networks in urban areas. In addition, this research provides a good understanding of the liquid sanitation schemes in the "Lamkansa" sampling area of the city of Casablanca and evaluates them spatially. Downstream, it constitutes a guide for assessing the environmental impacts of the liquid sanitation scheme.

Keywords: ICT, GIS, spatial checklist, liquid sanitation, environment

Procedia PDF Downloads 210
14635 Evaluation of the Architect-Friendliness of LCA-Based Environmental Impact Assessment Tools

Authors: Elke Meex, Elke Knapen, Griet Verbeeck

Abstract:

The focus of sustainable building is gradually shifting from energy efficiency towards the more global environmental impact of building design during all life-cycle stages. In this context, many tools have been developed that use an LCA approach to assess the environmental impact at the whole-building level. Since the building design strongly influences the final environmental performance, and the architect plays a key role in the design process, it is important that these tools are adapted to the architect's work method and support decision-making from the early design phases onward. Therefore, a comparative evaluation of the architect-friendliness of several building-level LCA tools was made, based on an evaluation framework specifically developed for the architect's viewpoint. To allow comparison of the results, a reference building was designed, documented for different design phases, and entered into all software tools. The evaluation according to the framework shows that the existing tools are not very architect-friendly. Suggestions for improvement are formulated.

Keywords: architect-friendliness, design supportive value, evaluation framework, tool comparison

Procedia PDF Downloads 524
14634 Usability Evaluation in Practice: Selecting the Appropriate Method

Authors: Hanan Hayat, Russell Lock

Abstract:

The importance of usability in ensuring software quality is well established in the literature and widely accepted by software development practitioners. Consequently, numerous usability evaluation methods have been developed. However, the availability of a large variety of evaluation methods, alongside insufficient studies that critically analyse them, has made the selection process ambiguous for practitioners who are not usability experts. This study investigates the factors affecting the selection of usability evaluation methods within a project by interviewing a software development team. The gathered data are then analysed and integrated into the development of a framework. The proposed framework offers a solution to the selection process for usability evaluation methods by adjusting to an individual project's resources and goals. It has the potential to be further evaluated to verify its applicability and usability within the domain of this study.

Keywords: usability evaluation, evaluating usability in non-user-centered designs, usability evaluation methods (UEM), usability evaluation in projects

Procedia PDF Downloads 142
14633 The Analysis of Personalized Low-Dose Computed Tomography Protocol Based on Cumulative Effective Radiation Dose and Cumulative Organ Dose for Patients with Breast Cancer with Regular Chest Computed Tomography Follow up

Authors: Okhee Woo

Abstract:

Purpose: The aim of this study is to evaluate the 2-year cumulative effective radiation dose and cumulative organ dose from regular follow-up computed tomography (CT) scans in patients with breast cancer, and to establish a personalized low-dose CT protocol. Methods and Materials: A retrospective study was performed on patients with breast cancer who were diagnosed and managed consistently on the basis of the routine breast cancer follow-up protocol between January 2012 and June 2016. Based on ICRP (International Commission on Radiological Protection) Publication 103, the cumulative effective radiation dose of each patient over the 2-year follow-up was analyzed using commercial radiation management software (Radimetrics, Bayer HealthCare). The personalized doses to each organ were analyzed in detail with the software's Monte Carlo simulation. Results: A total of 3822 CT scans in 490 patients were evaluated (mean age: 52.32 ± 10.69 years). The mean number of scans per patient was 7.8 ± 4.54, and each patient received 95.54 ± 63.24 mSv of radiation over the 2 years. The cumulative CT radiation dose was significantly higher in patients with lymph node metastasis (p = 0.00), and HER-2-positive patients were exposed to more radiation than estrogen or progesterone receptor-positive patients (p = 0.00). There was no difference in the cumulative effective radiation dose among age groups. Conclusion: Knowing how much radiation a patient has been exposed to is the starting point for managing radiation exposure in patients with long-term CT follow-up. A precise and personalized protocol, as well as iterative reconstruction, may reduce the hazard of unnecessary radiation exposure.
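
Effective dose under ICRP 103 is the sum of organ equivalent doses weighted by tissue weighting factors. The sketch below illustrates the bookkeeping with a small subset of organs; the organ doses are invented, and only a few of the ICRP 103 weights are shown.

```python
# ICRP 103 tissue weighting factors (subset; the full set sums to 1).
W_T = {"lung": 0.12, "breast": 0.12, "stomach": 0.12, "liver": 0.04, "thyroid": 0.04}

def effective_dose(organ_doses_mSv):
    """Weighted sum E = sum_T w_T * H_T over the organs provided."""
    return sum(W_T[organ] * h for organ, h in organ_doses_mSv.items())

# Hypothetical per-scan organ equivalent doses (mSv) from Monte Carlo output.
scans = [
    {"lung": 11.0, "breast": 12.5, "stomach": 6.0, "liver": 7.0, "thyroid": 2.0},
    {"lung": 10.2, "breast": 11.8, "stomach": 5.5, "liver": 6.4, "thyroid": 1.8},
]

cumulative = sum(effective_dose(s) for s in scans)
print(f"cumulative effective dose (partial-body subset) = {cumulative:.2f} mSv")
```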

Keywords: computed tomography, breast cancer, effective radiation dose, cumulative organ dose

Procedia PDF Downloads 171
14632 Bayes Estimation of Parameters of Binomial Type Rayleigh Class Software Reliability Growth Model using Non-informative Priors

Authors: Rajesh Singh, Kailash Kale

Abstract:

In this paper, software failures are considered to occur as a binomial-type process, and the failure intensity is characterized by a one-parameter Rayleigh-class Software Reliability Growth Model (SRGM). The proposed SRGM is a mathematical function of two parameters: the total number of failures, η₀, and the scale parameter, η₁. Assuming that very little or no information is available about either parameter, non-informative priors are adopted for both, and the Bayes estimators for η₀ and η₁ are obtained under the squared error loss function. The proposed Bayes estimators are compared with the corresponding maximum likelihood estimators on the basis of risk efficiencies obtained by the Monte Carlo simulation technique. It is concluded that both proposed Bayes estimators, of the total number of failures and of the scale parameter, perform well for a proper choice of execution time.
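
The risk-efficiency comparison by Monte Carlo can be illustrated on a stripped-down analogue of the problem: estimating only the Rayleigh scale parameter σ² from i.i.d. failure data, with the non-informative prior p(σ²) ∝ 1/σ². This is an illustrative analogue, not the authors' full binomial-process SRGM.

```python
import numpy as np

rng = np.random.default_rng(7)
true_sigma2 = 4.0
n, trials = 20, 20000

se_mle, se_bayes = 0.0, 0.0
for _ in range(trials):
    x = rng.rayleigh(scale=np.sqrt(true_sigma2), size=n)
    s = np.sum(x**2)
    mle = s / (2 * n)            # maximum likelihood estimator of sigma^2
    bayes = s / (2 * (n - 1))    # posterior mean under p(sigma^2) ~ 1/sigma^2
    se_mle += (mle - true_sigma2) ** 2
    se_bayes += (bayes - true_sigma2) ** 2

print(f"risk(MLE)   ≈ {se_mle / trials:.4f}")
print(f"risk(Bayes) ≈ {se_bayes / trials:.4f}")
print(f"risk efficiency (MLE/Bayes) ≈ {se_mle / se_bayes:.3f}")
```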

Keywords: binomial process, non-informative prior, maximum likelihood estimator (MLE), rayleigh class, software reliability growth model (SRGM)

Procedia PDF Downloads 374
14631 Determination of the Gain in Learning the Free-Fall Motion of Bodies by Applying the Resource of Previous Concepts

Authors: Ricardo Merlo

Abstract:

In this paper, we analyzed the different didactic proposals available online for teaching the free-fall motion of bodies. An important aspect was the interpretation of the direction and sense of the acceleration of gravity and of the falling velocity of a body; here we found different uses of the Cartesian reference system as well as different graphical presentations of the velocity as a function of time and of the distance traveled vertically by the body after being dropped from a height h₀. In this framework, a test of prior concepts was administered to a volunteer group of first-year Engineering students before and after the class on the subject. Hake's index was then determined to be 0.52, indicating an average learning gain from the meaningful use of the reference system and the respective graphs of v = f(t) and h = f(t).
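
For reference, Hake's normalized gain is conventionally computed from the mean pre-test and post-test scores, expressed as percentages of the maximum score:

\[ \langle g \rangle = \frac{\langle \mathrm{post} \rangle - \langle \mathrm{pre} \rangle}{100\% - \langle \mathrm{pre} \rangle} \]

The reported value of 0.52 therefore means the group achieved about half of the improvement available to it; by Hake's usual cut-offs, gains above 0.7 count as high and those below 0.3 as low.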

Keywords: didactic gain, free-fall, physics teaching, previous knowledge

Procedia PDF Downloads 147
14630 Evaluation of Neuroprotective Potential of Olea europaea and Malus domestica in Experimentally Induced Stroke Rat Model

Authors: Humaira M. Khan, Kanwal Asif

Abstract:

Ischemic stroke is a neurological disorder with a complex pathophysiology, associated with motor, sensory, and cognitive deficits. The major approaches developed to treat acute ischemic stroke fall into two categories: thrombolysis and neuroprotection. The objectives of this study were to evaluate the neuroprotective and anti-thrombolytic effects of Olea europaea (olive oil) and Malus domestica (apple cider vinegar), and of their combination, in a rat stroke model; histopathological analysis was also performed to assess the severity of ischemia in the treated and reference groups. Male albino rats (12 months of age) weighing 300-350 g were acclimatized and subjected to the middle cerebral artery occlusion method for stroke induction. Olea europaea and Malus domestica were administered orally at doses of 0.75 ml/kg and 3 ml/kg, respectively, and the combination at doses of 0.375 ml/kg and 1.5 ml/kg, prophylactically for 21 consecutive days. The negative control group was dosed with normal saline, whereas piracetam (250 mg/kg) was administered as the reference. The neuroprotective activity of standard piracetam, Olea europaea, Malus domestica, and their combination was evaluated with functional outcome tests, i.e., the cylinder, pasta, ladder run, pole, and water maze tests. After 21 days of treatment, rats were subjected to surgery for analysis of stroke recovery. Olea europaea and Malus domestica at individual doses of 0.75 ml/kg and 3 ml/kg, respectively, showed neuroprotection by significant improvement in the ladder run test (121.6 ± 0.92; 128.2 ± 0.73) compared to the reference (125.4 ± 0.74). Both test doses showed significant neuroprotection compared to the reference (9.60 ± 0.50) in the pasta test (8.40 ± 0.24; 9.80 ± 0.37), whereas in the cylinder test the experimental groups showed a significant increase in movements (6.60 ± 0.24; 8.40 ± 0.24) in contrast to the reference (7.80 ± 0.37). There was a decrease in the percentage of time taken to reach the hidden platform in the water maze test (56.80 ± 0.58; 61.80 ± 0.66) at doses of 0.75 ml/kg and 3 ml/kg, respectively, compared to piracetam (59.40 ± 1.07). Olea europaea and Malus domestica individually also showed a significant reduction in the duration of mobility (127.0 ± 0.44; 123.0 ± 0.44) in the pole test compared to piracetam (124.0 ± 0.70). Histopathological analysis revealed a significant extent of protection from ischemia after the prophylactic treatments. It is therefore concluded that Olea europaea and Malus domestica are effective neuroprotective agents individually, as compared to their combination.

Keywords: ischemia, Malus domestica, neuroprotection, Olea europaea

Procedia PDF Downloads 116
14629 Development of Concurrent Engineering through the Application of Software Simulations of Metal Production Processing and Analysis of the Effects of Application

Authors: D. M. Eric, D. Milosevic, F. D. Eric

Abstract:

Concurrent engineering technologies are a modern concept in manufacturing engineering. One of the key goals in designing modern technological processes is a further reduction of production costs, both in the prototype and preparatory stages and during serial production. Thanks to the many segments of concurrent engineering, these goals can be accomplished much more easily. In this paper, we give an overview of the advantages of using modern software simulations over the classical approach to designing technological processes for metal deformation. Significant savings are achieved thanks to electronic simulation and software detection of all possible irregularities in the working regime of the technological process. For the expected results to be optimal, the input parameters must be objective and reliably represent their values under real conditions. Since metal deformation processing is considered here, the particularly important parameters are the coefficient of friction between the working material and the tools, as well as the parameters of the flow curve of the processed material. The paper also presents the experimental determination of some of these parameters.
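
For context, a common empirical description of such a flow curve is the Hollomon power law, given here purely as an illustration (the abstract does not specify the paper's own material model):

\[ \sigma = K \, \varepsilon^{\,n} \]

where \( \sigma \) is the true flow stress, \( \varepsilon \) the true plastic strain, \( K \) the strength coefficient, and \( n \) the strain-hardening exponent. \( K \) and \( n \) are precisely the kind of flow-curve parameters that must be determined experimentally as inputs to the simulation.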

Keywords: production technologies, metal processing, software simulations, effects of application

Procedia PDF Downloads 219
14628 An Improved Two-dimensional Ordered Statistical Constant False Alarm Detection

Authors: Weihao Wang, Zhulin Zong

Abstract:

Two-dimensional ordered-statistic constant false alarm rate (CFAR) detection is a widely used method for detecting weak target signals in radar signal processing applications. The method analyzes the statistical characteristics of the noise and clutter present in the radar signal and uses this information to set an appropriate detection threshold. The reference cells surrounding the cell under test are divided into several reference subunits, which are used to estimate the noise level and adjust the detection threshold with the aim of minimizing the false alarm rate. The detection process involves several steps: filtering the input radar signal to remove noise and clutter, estimating the noise level from the statistical characteristics of the reference subunits, and finally setting the detection threshold based on the estimated noise level. By using an ordered-statistic approach, the method effectively suppresses the influence of clutter and noise, so that weak target signals can be detected even in the presence of strong clutter while a low false alarm rate is maintained.
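
A one-dimensional ordered-statistic CFAR pass, shown here instead of the full two-dimensional version and with an assumed scaling factor and window geometry, can be sketched as follows:

```python
import numpy as np

def os_cfar(signal, n_ref=16, n_guard=2, k=12, alpha=7.0):
    """Ordered-statistic CFAR: threshold = alpha * k-th order statistic
    of the reference cells surrounding the cell under test."""
    half = n_ref // 2
    detections = []
    for i in range(half + n_guard, len(signal) - half - n_guard):
        left = signal[i - n_guard - half : i - n_guard]
        right = signal[i + n_guard + 1 : i + n_guard + 1 + half]
        ref = np.sort(np.concatenate([left, right]))
        noise_level = ref[k - 1]          # k-th smallest reference sample
        if signal[i] > alpha * noise_level:
            detections.append(i)
    return detections

rng = np.random.default_rng(3)
x = rng.exponential(1.0, 500)             # noise-plus-clutter power samples
x[250] += 40.0                            # weak target buried at index 250
print("detections at indices:", os_cfar(x))
```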

Keywords: two-dimensional, ordered statistical, constant false alarm, detection, weak target signals

Procedia PDF Downloads 59
14627 Non-Invasive Characterization of the Mechanical Properties of Arterial Walls

Authors: Bruno Ramaël, Gwenaël Page, Catherine Knopf-Lenoir, Olivier Baledent, Anne-Virginie Salsac

Abstract:

No routine technique currently exists for clinicians to measure the mechanical properties of vascular walls non-invasively. Most of the data available in the literature come from traction or dilatation tests conducted ex vivo on native blood vessels. The objective of the study is to develop a non-invasive characterization technique based on Magnetic Resonance Imaging (MRI) measurements of the deformation of vascular walls under pulsating blood flow conditions. The goal is to determine the mechanical properties of the vessels by inverse analysis, coupling imaging measurements with numerical simulations of the fluid-structure interactions. The hyperelastic properties are identified using SolidWorks and ANSYS Workbench (ANSYS Inc.) through an optimization procedure. The vessel of interest in the study is the common carotid artery. In vivo MRI measurements of the vessel anatomy and inlet velocity profiles were acquired along the facial vascular network in a cohort of 30 healthy volunteers: - The time-evolution of the blood vessel contours, and thus of the cross-sectional area, was measured by 3D angiography sequences of phase-contrast MRI. - The blood flow velocity was measured using a 2D CINE phase-contrast MRI (PC-MRI) method. Reference arterial pressure waveforms were simultaneously measured in the brachial artery using a sphygmomanometer. The three-dimensional (3D) geometry of the arterial network was reconstructed by first creating an STL file from the raw MRI data using the open-source imaging software ITK-SNAP. The resulting geometry was then transformed with SolidWorks into volumes compatible with the ANSYS software. Tetrahedral meshes of the wall and fluid domains were built using the ANSYS Meshing software, with near-wall mesh refinement for the fluid domain to improve the accuracy of the flow calculations. ANSYS Structural was used for the numerical simulation of the vessel deformation and ANSYS CFX for the simulation of the blood flow. The fluid-structure interaction simulations showed that the systolic and diastolic blood pressures of the common carotid artery could be taken as reference pressures for identifying the mechanical properties of the different arteries of the network. The coefficients of the hyperelastic law were identified for the common carotid using the ANSYS design tools. Under large deformations, a stiffness of 800 kPa is measured, which is of the same order of magnitude as the Young's modulus of collagen fibers. Areas of maximum deformation were highlighted near bifurcations. This study is a first step towards patient-specific characterization of the mechanical properties of the facial vessels. The method is currently being applied to patients suffering from facial vascular malformations and to patients scheduled for facial reconstruction. Information on the blood flow velocity as well as on the vessel anatomy and deformability will be key to improving surgical planning for such vascular pathologies.

Keywords: identification, mechanical properties, arterial walls, MRI measurements, numerical simulations

Procedia PDF Downloads 305
14626 Development of a Suitable Model for Energy Storage in Residential Buildings in Ahvaz Using Energy Plus Software

Authors: Farideh Azimi, Sam Vahedi Tafreshi

Abstract:

This research studies residential buildings in Ahvaz, the materials commonly used, and the impact of passive energy storage methods, one of the most effective ways to reduce energy consumption in residential complexes, in order to derive construction patterns for residential buildings that reduce energy consumption under Ahvaz conditions. After studying Ahvaz's conditions, the components of an existing building were simulated in the EnergyPlus software, and the climatic data of the Ahvaz station were introduced into the software. Then, to reach optimal energy consumption under Ahvaz conditions, each of the residential building elements was optimized. The simulation results showed that using passive materials and design features, including double glazing, external wall insulation, and an inverted roof, can reduce energy consumption in the hot and dry climate of Ahvaz. Among the parameters investigated, the inverted roof was the most effective energy-saving pattern. According to the simulation of the entire building with the most optimal parameters, energy consumption can be reduced by an average of 12.51% in Ahvaz buildings, and the obtained pattern can also be used in similar climates.

Keywords: residential buildings, thermal comfort, energy storage, Energy Plus software, Ahvaz

Procedia PDF Downloads 339
14625 Linkage Disequilibrium and Haplotype Blocks Study from Two High-Density Panels and a Combined Panel in Nelore Beef Cattle

Authors: Priscila A. Bernardes, Marcos E. Buzanskas, Luciana C. A. Regitano, Ricardo V. Ventura, Danisio P. Munari

Abstract:

Genotype imputation has been used to reduce genomic selection costs. In order to increase haplotype detection accuracy in methods that consider linkage disequilibrium, another approach can be used: combining genotype data from different panels. Therefore, this study evaluated the linkage disequilibrium and haplotype blocks in two high-density panels, before and after imputation to a combined panel, in Nelore beef cattle. A total of 814 animals were genotyped with the Illumina BovineHD BeadChip (IHD), of which 93 animals (23 bulls and 70 progenies) were also genotyped with the Affymetrix Axiom Genome-Wide BOS 1 Array Plate (AHD). After quality control, 809 IHD animals (509,107 SNPs) and 93 AHD animals (427,875 SNPs) remained for analysis. The combined genotype panel (CP) was constructed by merging both panels after quality control, resulting in 880,336 SNPs. Imputation was conducted using the software FImpute v.2.2b, with reference (CP) and target (IHD) populations consisting of 23 bulls and 786 animals, respectively. The linkage disequilibrium and haplotype block analyses were carried out for IHD, AHD, and the imputed CP. Two linkage disequilibrium measures were considered: the correlation coefficient between alleles at two loci (r²) and |D′|. Both measures were calculated using the software PLINK, and the haplotype blocks were estimated using the software Haploview. The r² measure presented a different decay compared to |D′|, whereas AHD and IHD had almost the same decay. For r², even with possible overestimation due to the AHD sample size (93 animals), IHD presented higher values than AHD at shorter distances, but with increasing distance both panels presented similar values. The r² measure is influenced by the minor allele frequencies of the pair of SNPs, which can explain the observed difference between the r² and |D′| decays. Since the CP sums the combinations of the Illumina and Affymetrix panels, it presented a decay equivalent to the mean of these combinations. The numbers of haplotype blocks detected for IHD, AHD, and CP were 84,529, 63,967, and 140,336, respectively, with mean block lengths of 137.70 ± 219.05 kb, 102.10 ± 155.47 kb, and 107.10 ± 169.14 kb, respectively. The majority of the haplotype blocks in these three panels were composed of fewer than 10 SNPs, with only 3,882 (IHD), 193 (AHD), and 8,462 (CP) blocks composed of 10 SNPs or more. There was an increase in the number of chromosomes covered by long haplotypes when CP was used, as well as an increase in haplotype coverage for short chromosomes (23-29), which can benefit studies that explore haplotype blocks. In general, using the CP could be an alternative for increasing the density and number of haplotype blocks, increasing the probability of obtaining a marker close to a quantitative trait locus of interest.
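
For reference, the two measures compared here are conventionally defined, for alleles A and B at two loci with frequencies \( p_A \) and \( p_B \) and haplotype frequency \( p_{AB} \), as:

\[ D = p_{AB} - p_A p_B, \qquad r^2 = \frac{D^2}{p_A(1-p_A)\,p_B(1-p_B)}, \qquad |D'| = \frac{|D|}{D_{\max}} \]

where \( D_{\max} = \min\{p_A p_B,\,(1-p_A)(1-p_B)\} \) for \( D < 0 \) and \( \min\{p_A(1-p_B),\,(1-p_A)p_B\} \) for \( D > 0 \). The allele frequencies in the denominator of r² are what make it sensitive to minor allele frequency, consistent with the difference in decay noted above.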

Keywords: Bos taurus indicus, decay, genotype imputation, single nucleotide polymorphism

Procedia PDF Downloads 258
14624 Fast Switching Mechanism for Multicasting Failure in OpenFlow Networks

Authors: Alaa Allakany, Koji Okamura

Abstract:

Multicast is an efficient and scalable technology for data distribution that optimizes network resources. However, in IP networks, responsibility for the management of multicast groups is distributed among network routers, which causes limitations such as delays in processing group events, high bandwidth consumption, and redundant tree calculation. Software-Defined Networking (SDN), represented by OpenFlow, has been presented as a solution to many of these problems: in SDN, the control plane and data plane are separated by shifting control and management to a remote centralized controller, and the routers act as forwarders only. In this paper, we propose a fast switching mechanism for handling link failure in the multicast tree, based on the Tabu Search heuristic algorithm and on modified OpenFlow switch functions that switch quickly to the backup subtree rather than reporting to the controller. We implement a multicasting OpenFlow controller, the core of our approach, which is responsible for (1) constructing the multicast tree and (2) handling multicast group events and maintaining multicast state; we also modify the OpenFlow switch functions for fast switching to backup paths. Forwarders forward multicast packets based on routing entries generated by the centralized controller. Tabu Search is used as the heuristic algorithm for constructing a near-optimum multicast tree and for keeping the tree near optimum when members join or leave the multicast group (group events).
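
The essential loop of a tabu search, shown here on a toy binary optimization problem rather than the multicast-tree problem itself, looks like the following sketch (tenure, neighborhood, and objective are all invented for illustration):

```python
import random

def tabu_search(cost, n_bits=12, iters=200, tabu_tenure=7):
    """Minimal tabu search over bit vectors: flip one bit per move,
    forbid re-flipping a recently flipped bit for `tabu_tenure` moves."""
    random.seed(0)
    current = [random.randint(0, 1) for _ in range(n_bits)]
    best, best_cost = current[:], cost(current)
    tabu = {}  # bit index -> iteration until which flipping it is tabu
    for it in range(iters):
        candidates = []
        for i in range(n_bits):
            neighbor = current[:]
            neighbor[i] ^= 1
            c = cost(neighbor)
            # Aspiration: allow a tabu move if it beats the best known.
            if tabu.get(i, -1) < it or c < best_cost:
                candidates.append((c, i, neighbor))
        c, i, current = min(candidates)
        tabu[i] = it + tabu_tenure
        if c < best_cost:
            best, best_cost = current[:], c
    return best, best_cost

# Toy objective standing in for tree cost: distance to a hidden target vector.
target = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
solution, value = tabu_search(lambda s: sum(a != b for a, b in zip(s, target)))
print("best cost:", value)  # 0 means the optimum was found
```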

Keywords: multicast tree, software defined networks, tabu search, OpenFlow

Procedia PDF Downloads 244
14623 Sharing Tacit Knowledge: The Essence of Knowledge Management

Authors: Ayesha Khatun

Abstract:

In the 21st century, where markets are unstable, technologies proliferate rapidly, competitors multiply, products and services become obsolete almost overnight, and customers demand low-cost, high-value products, leveraging and harnessing knowledge is not just a potential source of competitive advantage but a necessity in technology-based and information-intensive industries. Knowledge management focuses on leveraging the available knowledge and sharing it among the individuals in the organization so that employees can make the best use of it towards achieving organizational goals. Knowledge is not a discrete object; it is embedded in people and so difficult to transfer outside the immediate context that it becomes a major competitive advantage. Internal transfer of knowledge among employees is nevertheless essential to maximize the use of knowledge that is available in the organization only in an unstructured manner. But as knowledge is a source of competitive advantage for the organization, it is also a source of competitive advantage for individuals: people think that knowledge is power and that sharing it may cost them their competitive position. Moreover, the very nature of tacit knowledge poses many difficulties for sharing it. Yet sharing tacit knowledge is the vital part of the knowledge management process, because it is tacit knowledge that is inimitable. Knowledge management has been made synonymous with the use of software and technology, leading to the management of explicit knowledge only and ignoring personal interaction and the formation of informal networks, which are considered the most successful means of sharing tacit knowledge. The factors responsible for effective sharing of tacit knowledge are grouped into individual, organizational, and technological factors, and different factors under each category have been identified. Creating a positive organizational culture, encouraging personal interaction, and practicing reward systems are some of the strategies that can help overcome many of the barriers to effective sharing of tacit knowledge. The methodology applied here is entirely secondary: an extensive review of the relevant literature has been undertaken for the purpose.

Keywords: knowledge, tacit knowledge, knowledge management, sustainable competitive advantage, organization, knowledge sharing

Procedia PDF Downloads 380
14622 A Novel Approach towards Test Case Prioritization Technique

Authors: Kamna Solanki, Yudhvir Singh, Sandeep Dalal

Abstract:

Software testing is a time- and cost-intensive process. Scrutiny of the code and rigorous testing are required to identify and rectify putative bugs. The process of bug identification and its consequent correction is continuous in nature, and often some bugs are removed after the software has been launched in the market. This process of validating the altered software during the maintenance phase is termed regression testing. Regression testing ubiquitously faces resource constraints; therefore, deducing an appropriate set of test cases from the entire gamut of test cases is a critical issue for regression test planning. This paper presents a novel method for designing a suitable prioritization process to optimize the fault detection rate and the performance of regression testing under predefined constraints. The proposed test case prioritization method, m-ACO, alters the food-source selection criteria of natural ants and is essentially a modified version of Ant Colony Optimization (ACO). The proposed m-ACO approach has been coded in Perl, and the results are validated on three examples by computing the Average Percentage of Faults Detected (APFD) metric.
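
The APFD metric used for validation has a standard closed form; the sketch below computes it for an invented fault matrix (the paper's own three examples are not reproduced here).

```python
def apfd(order, faults_detected_by):
    """APFD = 1 - (TF1 + ... + TFm) / (n*m) + 1/(2n), where TFi is the
    1-based position in `order` of the first test that reveals fault i."""
    n = len(order)
    m = len(faults_detected_by)
    tf_sum = 0
    for detecting_tests in faults_detected_by:
        tf_sum += min(order.index(t) + 1 for t in detecting_tests)
    return 1 - tf_sum / (n * m) + 1 / (2 * n)

# Hypothetical suite of 5 tests; each entry lists the tests exposing one fault.
faults = [{"t1", "t3"}, {"t2"}, {"t4", "t5"}, {"t3"}]

print(f"unprioritized: {apfd(['t1', 't2', 't3', 't4', 't5'], faults):.3f}")
print(f"prioritized:   {apfd(['t3', 't2', 't4', 't1', 't5'], faults):.3f}")
```

A higher APFD means faults are revealed earlier in the execution order, which is exactly what a prioritization technique such as m-ACO aims to maximize.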

Keywords: regression testing, software testing, test case prioritization, test suite optimization

Procedia PDF Downloads 315
14621 In Agile Projects - Arithmetic Sequence is More Effective than Fibonacci Sequence to Use for Estimating the Implementation Effort of User Stories

Authors: Khaled Jaber

Abstract:

The estimation of effort in software development is a complex task. The traditional Waterfall approach to developing software systems requires a lot of time to estimate the effort needed to implement user requirements. The Agile approach, however, is currently used more in industry than Waterfall for developing software systems. In Agile, a user requirement is referred to as a user story. Agile teams mostly use the Fibonacci sequence 1, 2, 3, 5, 8, 13, etc., to estimate the effort needed to implement a user story. This work shows through analysis that an arithmetic sequence, e.g., 3, 6, 9, 12, etc., is more effective than the Fibonacci sequence for estimating user stories. The paper proves the effectiveness of the arithmetic sequence over the Fibonacci sequence mathematically and visually.
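
The structural difference between the two scales can be made concrete in a few lines: Fibonacci steps grow multiplicatively (roughly 60% per step), while arithmetic steps stay constant in absolute terms. The snippet below merely illustrates that gap-width contrast, not the paper's full analysis.

```python
fibonacci = [1, 2, 3, 5, 8, 13, 21]
arithmetic = [3, 6, 9, 12, 15, 18, 21]

for name, seq in [("fibonacci", fibonacci), ("arithmetic", arithmetic)]:
    gaps = [b - a for a, b in zip(seq, seq[1:])]
    growth = [f"{(b - a) / a:.0%}" for a, b in zip(seq, seq[1:])]
    print(f"{name:10s} gaps={gaps} relative growth={growth}")
```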

Keywords: agile, scrum, estimation, fibonacci sequence

Procedia PDF Downloads 181