Search results for: software selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6983

6353 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphics Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPUs for High Performance Computing (HPC) can be categorized as hardware or software oriented in nature. Understanding how these factors affect performance is essential to developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of the challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as its computational core; the characteristics and results of this core reflect many other HPC applications via the sparse matrix system used for the solution of the linear system of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.
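
The computational core named above, the repeated solution of a sparse linear system arising from an unstructured finite element discretization, can be illustrated with a minimal, hedged sketch. The example below uses SciPy's conjugate-gradient solver on a synthetic sparse system; it is a CPU stand-in for the GPU-accelerated kernels discussed in the abstract, and the problem setup is an illustrative assumption rather than the authors' code.

```python
# Minimal sketch: solving a sparse SPD system A x = b with conjugate gradients,
# the kind of kernel that dominates the resin-infusion flow model described above.
# On a GPU-enabled system the same pattern applies with a GPU sparse library;
# plain SciPy is used here so the example stays self-contained.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 10_000                                   # number of unknowns (illustrative)
# 1D Laplacian stencil as a stand-in for an FE stiffness matrix (sparse, SPD).
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
A = sp.diags([off, main, off], [-1, 0, 1], format="csr")
b = np.ones(n)

x, info = spla.cg(A, b)                      # iterative Krylov solve
print("converged" if info == 0 else f"cg returned {info}",
      "residual:", np.linalg.norm(A @ x - b))
```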

Keywords: graphics processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 363
6352 Molecular and Phytochemical Fingerprinting of Anti-Cancer Drug Yielding Plants in South India

Authors: Alexis John de Britto

Abstract:

Studies were performed to select superior genotypes based on intra-specific variations, caused by phytogeographical, climatic and edaphic parameters, of three anti-cancer drug yielding mangrove plants, namely Acanthus ilicifolius L., Calophyllum inophyllum L. and Excoecaria agallocha L., using ISSR (Inter Simple Sequence Repeat) markers and phytochemical analyses such as preliminary phytochemical tests, TLC, HPTLC, HPLC and antioxidant tests. The plants were collected from five different geographical locations on the East Coast of south India. Genetic heterozygosity, Nei's gene diversity, Shannon's information index and the percentage of polymorphism between the populations were calculated using POPGENE software. Cluster analysis was performed using the UPGMA algorithm. AMOVA and correlations between genetic diversity and soil factors were analyzed. Combining the molecular and phytochemical variations, superior genotypes were selected. Conservation constraints and methods for efficient exploitation of the species are discussed.
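
As a hedged illustration of the population-genetic summaries mentioned above (Nei's gene diversity, Shannon's information index, UPGMA clustering), the sketch below computes them in Python from made-up ISSR band frequencies; POPGENE performs the actual analysis in the study, and the frequency values here are illustrative assumptions only.

```python
# Minimal sketch of the population-genetic summaries named above, computed from
# illustrative allele-frequency data rather than the authors' ISSR dataset.
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import pdist

# Rows: populations (e.g., five coastal locations); columns: ISSR band frequencies.
freq = np.array([
    [0.80, 0.10, 0.55, 0.30],
    [0.75, 0.20, 0.50, 0.35],
    [0.40, 0.60, 0.20, 0.70],
    [0.85, 0.05, 0.60, 0.25],
    [0.45, 0.55, 0.25, 0.65],
])

def nei_gene_diversity(p):
    """Nei's gene diversity for a two-allele locus: h = 2p(1-p), averaged over loci."""
    return np.mean(2.0 * p * (1.0 - p), axis=1)

def shannon_index(p):
    """Shannon's information index I = -[p ln p + (1-p) ln(1-p)], averaged over loci."""
    q = 1.0 - p
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = -(p * np.log(p) + q * np.log(q))
    return np.mean(np.nan_to_num(terms), axis=1)

print("Nei's h per population:", nei_gene_diversity(freq).round(3))
print("Shannon's I per population:", shannon_index(freq).round(3))

# UPGMA (average-linkage) clustering of populations on Euclidean band-frequency distance.
Z = linkage(pdist(freq, metric="euclidean"), method="average")
print("UPGMA leaf order:", dendrogram(Z, no_plot=True)["ivl"])
```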

Keywords: anti-cancer drug yielding plants, DNA fingerprinting, phytochemical analysis, selection of superior genotypes

Procedia PDF Downloads 330
6351 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference

Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade

Abstract:

In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion for which the limit in probability is identical to that of the normalized log-likelihood. This includes common special cases such as AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverages close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC invoking Lindeberg-Feller type conditions for triangular arrays and are thus able to similarly calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparative performance metrics hold as for the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics. Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is to account for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI). We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. This is attempted in the IID case by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, and the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.
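
The quantile step described above, computing an upper quantile for the minimum of jointly Gaussian GIC statistics via multivariate normal rectangle probabilities, can be sketched as follows, using SciPy in place of the R package mvtnorm; the mean vector and covariance matrix are illustrative assumptions, not values derived from any real candidate-model set.

```python
# Hedged sketch: once the candidate-model GIC vector is treated as (asymptotically)
# jointly Gaussian, an upper quantile for the minimum GIC follows from multivariate
# normal rectangle probabilities.  Means and covariance below are illustrative only.
import numpy as np
from scipy.stats import multivariate_normal
from scipy.optimize import brentq

mu = np.array([0.0, 0.3, 0.8])                       # GIC means for 3 candidate models
Sigma = np.array([[1.0, 0.6, 0.4],
                  [0.6, 1.0, 0.5],
                  [0.4, 0.5, 1.0]])                  # asymptotic covariance

def cdf_min(t, mu, Sigma):
    """P(min_i X_i <= t) = 1 - P(X_i > t for all i) for X ~ N(mu, Sigma)."""
    # P(all X_i > t) = P(all (-X_i) < -t), and -X ~ N(-mu, Sigma).
    upper_tail = multivariate_normal(mean=-mu, cov=Sigma).cdf(np.full(len(mu), -t))
    return 1.0 - upper_tail

alpha = 0.95
# Invert the CDF numerically to obtain the alpha-quantile of the minimum GIC.
q = brentq(lambda t: cdf_min(t, mu, Sigma) - alpha, -10.0, 10.0)
print(f"{alpha:.0%} upper quantile of the minimum GIC: {q:.3f}")
```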

Keywords: model selection inference, generalized information criteria, post-model selection inference, asymptotic theory

Procedia PDF Downloads 88
6350 Diffusion Adaptation Strategies for Distributed Estimation Based on the Family of Affine Projection Algorithms

Authors: Mohammad Shams Esfand Abadi, Mohammad Ranjbar, Reza Ebrahimpour

Abstract:

This work addresses the distributed estimation problem in a diffusion network based on the adapt-then-combine (ATC) and combine-then-adapt (CTA) selective partial update normalized least mean squares (SPU-NLMS) algorithms. We also extend this approach to the dynamic selection affine projection algorithm (DS-APA), establishing ATC-DS-APA and CTA-DS-APA. The purpose of the ATC-SPU-NLMS and CTA-SPU-NLMS algorithms is to reduce computational complexity by updating only selected blocks of weight coefficients at every iteration. In CTA-DS-APA and ATC-DS-APA, the number of input vectors is selected dynamically. Diffusion cooperation strategies based on these algorithms have been shown to provide good performance. The good performance of the introduced algorithms is illustrated with various experimental results.
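
A minimal sketch of the adapt-then-combine diffusion structure underlying these algorithms is given below, using a plain NLMS adaptation step; the selective-partial-update and dynamic-selection refinements that are the paper's contribution are only indicated by a placeholder mask, and the network topology, step size, and data model are illustrative assumptions.

```python
# Minimal sketch of an adapt-then-combine (ATC) diffusion NLMS update over a small
# network; selective partial update would zero out part of the gradient via the mask.
import numpy as np

rng = np.random.default_rng(0)
M, N, T = 8, 5, 2000                      # filter length, number of nodes, iterations
w_true = rng.standard_normal(M)

# Ring topology with self-loops; row-normalized (uniform) combination weights.
A = np.zeros((N, N))
for k in range(N):
    for l in (k - 1, k, (k + 1) % N):
        A[k, l % N] = 1.0
A /= A.sum(axis=1, keepdims=True)

w = np.zeros((N, M))                      # per-node estimates
mu, eps = 0.5, 1e-6

for _ in range(T):
    psi = np.empty_like(w)
    for k in range(N):                    # --- adapt step at every node ---
        u = rng.standard_normal(M)
        d = u @ w_true + 0.05 * rng.standard_normal()
        e = d - u @ w[k]
        g = mu * e * u / (u @ u + eps)    # NLMS correction
        update_mask = np.ones(M)          # SPU would select only a block of coefficients here
        psi[k] = w[k] + update_mask * g
    w = A @ psi                           # --- combine step: weighted neighbour average ---

print("mean squared deviation per node:",
      np.round(((w - w_true) ** 2).mean(axis=1), 5))
```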

Keywords: selective partial update, affine projection, dynamic selection, diffusion, adaptive distributed networks

Procedia PDF Downloads 707
6349 Method for Selecting and Prioritising Smart Services in Manufacturing Companies

Authors: Till Gramberg, Max Kellner, Erwin Gross

Abstract:

This paper presents a comprehensive investigation into the topic of smart services and IIoT-Platforms, focusing on their selection and prioritization in manufacturing organizations. First, a literature review is conducted to provide a basic understanding of the current state of research in the area of smart services. Based on discussed and established definitions, a definition approach for this paper is developed. In addition, value propositions for smart services are identified based on the literature and expert interviews. Furthermore, the general requirements for the provision of smart services are presented. Subsequently, existing approaches for the selection and development of smart services are identified and described. In order to determine the requirements for the selection of smart services, expert opinions from successful companies that have already implemented smart services are collected through semi-structured interviews. Based on the results, criteria for the evaluation of existing methods are derived. The existing methods are then evaluated according to the identified criteria. Furthermore, a novel method for the selection of smart services in manufacturing companies is developed, taking into account the identified criteria and the existing approaches. The developed concept for the method is verified in expert interviews. The method includes a collection of relevant smart services identified in the literature. The actual relevance of the use cases in the industrial environment was validated in an online survey. The required data and sensors are assigned to the smart service use cases. The value proposition of the use cases is evaluated in an expert workshop using different indicators. Based on this, a comparison is made between the identified value proposition and the required data, leading to a prioritization process. The prioritization process follows an established procedure for evaluating technical decision-making processes. In addition to the technical requirements, the prioritization process includes other evaluation criteria such as the economic benefit, the conformity of the new service offering with the company strategy, or the customer retention enabled by the smart service. Finally, the method is applied and validated in an industrial environment. The results of these experiments are critically reflected upon and an outlook on future developments in the area of smart services is given. This research contributes to a deeper understanding of the selection and prioritization process as well as the technical considerations associated with smart service implementation in manufacturing organizations. The proposed method serves as a valuable guide for decision makers, helping them to effectively select the most appropriate smart services for their specific organizational needs.
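
As a hedged illustration of the prioritisation step, the sketch below ranks candidate smart services by a weighted score over evaluation criteria such as economic benefit, data availability, strategy fit, and customer retention; the services, criteria, weights, and scores are invented for illustration and do not come from the validated method or its use-case catalogue.

```python
# Hedged sketch of the prioritisation step: each candidate smart service is scored
# against weighted criteria and ranked.  All names and numbers are illustrative.
criteria_weights = {
    "economic_benefit": 0.35,
    "data_and_sensor_availability": 0.25,
    "strategy_fit": 0.25,
    "customer_retention": 0.15,
}

# Scores on a 1-5 scale, e.g. elicited in an expert workshop.
use_cases = {
    "predictive maintenance": {"economic_benefit": 5, "data_and_sensor_availability": 3,
                               "strategy_fit": 4, "customer_retention": 4},
    "remote condition monitoring": {"economic_benefit": 3, "data_and_sensor_availability": 5,
                                    "strategy_fit": 4, "customer_retention": 3},
    "energy optimisation": {"economic_benefit": 4, "data_and_sensor_availability": 2,
                            "strategy_fit": 3, "customer_retention": 2},
}

def weighted_score(scores, weights):
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(use_cases.items(),
                 key=lambda kv: weighted_score(kv[1], criteria_weights),
                 reverse=True)
for name, scores in ranking:
    print(f"{name:30s} {weighted_score(scores, criteria_weights):.2f}")
```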

Keywords: smart services, IIoT, industrie 4.0, IIoT-platform, big data

Procedia PDF Downloads 88
6348 The Role of Knowledge Management in Global Software Engineering

Authors: Samina Khalid, Tehmina Khalil, Smeea Arshad

Abstract:

Knowledge management is an essential ingredient of successful coordination in globally distributed software engineering. Various frameworks, knowledge management systems (KMSs), and tools have been proposed to foster coordination and communication between virtual teams, but practical implementations of these solutions are rarely found, and organizations face challenges in implementing a knowledge management system. For this purpose, a literature review was first conducted to investigate the challenges that prevent organizations from implementing a KMS; taking these challenges into account, the need for an integrated solution was identified, in the form of a standardized KMS that can easily store tacit and explicit knowledge and thereby facilitate coordination and collaboration among virtual teams. The literature review has already shown that knowledge is a complex perception with profound meanings and one of the most important resources contributing to the competitive advantage of an organization. In order to meet the various challenges caused by not properly managing project-related knowledge among virtual teams in global software engineering (GSE), we suggest making use of the cloud computing model. In this research, a distributed architecture to support KM storage is proposed, called the conceptual framework of KM as a service in the cloud. The presented framework is enhanced, and the conceptual framework of KM is embedded into it to store project-related knowledge for future use.

Keywords: management, global software development, global software engineering

Procedia PDF Downloads 526
6347 Relay Node Selection Algorithm for Cooperative Communications in Wireless Networks

Authors: Sunmyeng Kim

Abstract:

IEEE 802.11a/b/g standards support multiple transmission rates. Even though the use of multiple transmission rates increases the WLAN capacity, this feature leads to the performance anomaly problem. Cooperative communication was introduced to relieve the performance anomaly problem: data packets are delivered to the destination much faster through a relay node at a high rate than through direct transmission to the destination at a low rate. In the legacy cooperative protocols, a source node chooses a relay node based only on the transmission rate. Therefore, they are not very feasible in multi-flow environments since they do not consider the effect of other flows. To alleviate this effect, we propose a new relay node selection algorithm based on both the transmission rate and the channel contention level. Performance evaluation is conducted using simulation and shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and delay.
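
The abstract does not spell out the exact selection metric, so the sketch below shows one plausible way to combine the two-hop effective rate with a contention penalty when choosing a relay; the metric and the numbers are illustrative assumptions, not the algorithm evaluated in the paper.

```python
# Hedged sketch of relay selection that weighs both link rates and contention.
# The effective two-hop rate is the harmonic combination of the source-relay and
# relay-destination rates, discounted by the relay's observed contention level.
def effective_two_hop_rate(r_sr, r_rd):
    """Mbps delivered when the same packet crosses two links back-to-back."""
    return 1.0 / (1.0 / r_sr + 1.0 / r_rd)

def select_relay(direct_rate, candidates):
    """candidates: list of (name, r_sr, r_rd, contention) with contention in [0, 1)."""
    best_name, best_rate = None, direct_rate          # fall back to direct transmission
    for name, r_sr, r_rd, contention in candidates:
        rate = effective_two_hop_rate(r_sr, r_rd) * (1.0 - contention)
        if rate > best_rate:
            best_name, best_rate = name, rate
    return best_name, best_rate

candidates = [("relay A", 54.0, 54.0, 0.6),   # fast links but busy channel
              ("relay B", 36.0, 48.0, 0.1),   # slightly slower, lightly loaded
              ("relay C", 11.0, 54.0, 0.0)]   # slow first hop, idle channel
print(select_relay(direct_rate=11.0, candidates=candidates))
```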

Keywords: cooperative communications, MAC protocol, relay node, WLAN

Procedia PDF Downloads 331
6346 New Advanced Medical Software Technology Challenges and Evolution of the Regulatory Framework in Expert Software, Artificial Intelligence, and Machine Learning

Authors: Umamaheswari Shanmugam, Silvia Ronchi, Radu Vornicu

Abstract:

Software, artificial intelligence, and machine learning can improve healthcare through innovative and advanced technologies that are able to use the large amount and variety of data generated during healthcare services every day. As we read the news, over 500 machine learning or other artificial intelligence medical devices have now received FDA clearance or approval, the first ones even preceding the year 2000. One of the big advantages of these new technologies is the ability to gain experience and knowledge from real-world use and to continuously improve their performance. Healthcare systems and institutions can benefit greatly because the use of advanced technologies improves both the efficiency and the efficacy of healthcare. Software defined as a medical device is stand-alone software intended to be used for one or more specific medical purposes: diagnosis, prevention, monitoring, prediction, prognosis, treatment or alleviation of a disease or other health conditions; replacing or modifying any part of a physiological or pathological process; or managing information received from in vitro specimens derived from the human body, without achieving its principal intended action by pharmacological, immunological or metabolic means. Software qualified as a medical device must comply with the general safety and performance requirements applicable to medical devices. These requirements are necessary to ensure high performance and quality and also to protect patients' safety. The evolution and continuous improvement of software used in healthcare must take into consideration the increase in regulatory requirements, which are becoming more complex in each market. The gap between these advanced technologies and the new regulations is the biggest challenge for medical device manufacturers. Regulatory requirements can be considered a market barrier, as they can delay or obstruct device approval, but they are necessary to ensure performance, quality, and safety; at the same time, they can be a business opportunity if the manufacturer is able to define the appropriate regulatory strategy in advance. This abstract provides an overview of the current regulatory framework, the evolution of international requirements, and the standards applicable to medical device software in potential markets all over the world.

Keywords: artificial intelligence, machine learning, SaMD, regulatory, clinical evaluation, classification, international requirements, MDR, 510(k), PMA, IMDRF, cybersecurity, healthcare systems

Procedia PDF Downloads 89
6345 B4A Is One of the Best Programming Software for Surveyor Engineers

Authors: Ali Mohammadi

Abstract:

Many engineers rely on programs installed on a computer, but with the arrival of the mobile phone and the possibility of designing apps, many Android programs can be built that mirror desktop programs, so that the phone, beyond telephone communication and photography, becomes a more practical working tool. Engineers are one of the groups that can use specialized apps to reduce their dependence on the office computer, and B4A can be considered one of the simplest software packages for designing such apps. This article introduces a number of surveying apps designed using B4A and the impact that using these apps has on productivity in this field of engineering.

Keywords: app, tunnel, total station, map

Procedia PDF Downloads 48
6344 Analytic Hierarchy Process

Authors: Hadia Rafi

Abstract:

Making any decision in any work, task, or project involves many factors that need to be considered. The Analytic Hierarchy Process (AHP) is based on the judgments of experts to derive the required results: the technique measures intangibles and then, with the help of judgment and software analysis, makes pairwise comparisons that show how much one element or unit dominates another. AHP also addresses how an inconsistent judgment can be made consistent and how the judgment can be improved when possible. The priority scales are obtained by multiplying them by the priority of their parent node, after which they are summed.
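
A minimal sketch of these AHP steps follows: a pairwise comparison matrix yields local priorities via its principal eigenvector, a consistency ratio flags inconsistent judgments, and child priorities are weighted by the parent node's priority; the judgments themselves are illustrative.

```python
# Minimal AHP sketch: local priorities from the principal eigenvector of a
# Saaty-scale pairwise comparison matrix, a consistency check, and weighting
# by the parent node's priority.
import numpy as np

# Pairwise comparisons of three criteria (A vs B, A vs C, B vs C).
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(P)
k = np.argmax(eigvals.real)
priorities = np.abs(eigvecs[:, k].real)
priorities /= priorities.sum()                 # local priority scale

n = P.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)                # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
CR = CI / RI                                   # consistency ratio

print("local priorities:", priorities.round(3), " CR =", round(CR, 3))

# Global priorities: multiply by the parent node's priority; in a full hierarchy
# the weighted child priorities are then summed across branches.
parent_priority = 0.6
print("global priorities under this parent:", (parent_priority * priorities).round(3))
```

A consistency ratio below 0.10 is conventionally accepted; otherwise the pairwise judgments should be revised, which is the consistency improvement the abstract refers to.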

Keywords: AHP, priority scales, parent node, software analysis

Procedia PDF Downloads 406
6343 Investigation of the Effects of Processing Parameters on PLA-Based 3D Printed Tensile Samples

Authors: Saifullah Karimullah

Abstract:

Additive manufacturing techniques are becoming more common with the latest technological advancements, and they are poised to bring a revolution in the way products are designed, planned, manufactured, and distributed to end users. Fused deposition modeling (FDM) based 3D printing is one of the promising techniques that have revolutionized prototyping processes. The purpose of this design and study project was to build a customized laboratory-scale FDM-based 3D printer from locally available sources. After the fabrication, a tensile test specimen was designed in SolidWorks or Creo computer-aided design (CAD) software. A .stl file of the tensile test specimen was generated through slicing software, and the G-codes were sent from a computer for the test specimen to be printed. The parameters under study were printing speed, layer thickness and infill density of the printed object, while other parameters such as temperature, extrusion rate and raster orientation were kept constant. Different tensile test specimens were printed for different sets of parameters of the FDM-based 3D printer and were subjected to tensile tests using a universal testing machine (UTM). Design-Expert software was used for the analyses, and different results were obtained for the different tensile test specimens. The best, average and worst specimens were also observed under a compound microscope to investigate the layer bonding between them.
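
As a small illustration of how the varied parameters translate into a print plan, the sketch below enumerates a full-factorial set of specimens over printing speed, layer thickness, and infill density; the level values are assumptions for illustration, not the settings used in the study.

```python
# Hedged sketch: generating a full-factorial run plan over the three varied
# print parameters.  Level values are illustrative, not the study's settings.
from itertools import product

printing_speed_mm_s = [40, 60, 80]
layer_thickness_mm = [0.1, 0.2, 0.3]
infill_density_pct = [20, 50, 80]

runs = [
    {"speed_mm_s": s, "layer_mm": t, "infill_pct": d}
    for s, t, d in product(printing_speed_mm_s, layer_thickness_mm, infill_density_pct)
]
print(f"{len(runs)} tensile specimens to print; first run: {runs[0]}")
```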

Keywords: additive manufacturing techniques, 3D printing, CAD software, UTM machine

Procedia PDF Downloads 103
6342 Adaptive Multipath Mitigation Acquisition Approach for Global Positioning System Software Receivers

Authors: Animut Meseret Simachew

Abstract:

The Parallel Code Phase Search Acquisition (PCSA) algorithm has been considered a promising method in GPS software receivers for the detection and estimation of the accurate correlation peak between the received Global Positioning System (GPS) signal and locally generated replicas. GPS signal acquisition in highly dense multipath environments is the main research challenge. In this work, we propose a robust variable step-size (RVSS) PCSA algorithm based on a fast Fourier transform (FFT) filtering technique to mitigate short-time-delay multipath signals. Simulation results reveal the effectiveness of the proposed algorithm over the conventional PCSA algorithm. The proposed RVSS-PCSA algorithm equalizes the received carrier wiped-off signal with the locally generated C/A code.
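
The FFT-based correlation at the heart of parallel code phase search can be sketched as follows; a random ±1 sequence stands in for a real C/A code, and neither multipath nor the proposed variable step-size mechanism is modeled, so this is only a baseline illustration of the acquisition step.

```python
# Minimal sketch of FFT-based parallel code phase search: the carrier-wiped
# received signal is circularly correlated with the local code replica in the
# frequency domain, and the code phase is read off the correlation peak.
import numpy as np

rng = np.random.default_rng(1)
N = 1023                                    # one C/A code period (chips)
ca_code = rng.choice([-1.0, 1.0], size=N)   # placeholder for a real PRN sequence

true_phase = 357                            # unknown code phase to be recovered
received = np.roll(ca_code, true_phase) + 0.5 * rng.standard_normal(N)

# Circular cross-correlation via FFT: corr = IFFT( FFT(x) * conj(FFT(code)) ).
corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(ca_code)))
estimated_phase = int(np.argmax(np.abs(corr)))
print("estimated code phase:", estimated_phase, " true:", true_phase)
```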

Keywords: adaptive PCSA, detection and estimation, GPS signal acquisition, GPS software receiver

Procedia PDF Downloads 117
6341 Factors Influencing Student's Decision to Pursue a Hospitality and Tourism Program

Authors: Zeenath Solih

Abstract:

The aim of the study is to analyze the factors that influence the decision of students in the Maldives to pursue a hospitality and tourism program when considering higher education options. The research further explores the implications for, and the relationship between, universities and students. A quantitative research method is used to test the hypotheses and achieve the objectives of the research: a questionnaire consisting of 30 closed questions, analyzed with SPSS 18 software. Ten public schools and three private schools offering secondary education, three universities with higher education facilities, and a total of 500 students participated in the survey. The findings cover the selection criteria for higher-study decisions, namely the university's reputation and the excellence and quality of its educational programs; the preference for pursuing further studies at public rather than private universities; and the academic, cultural and socio-demographic factors that influence a student's choice of program and university. Finally, the study provides valuable insight into how universities need to market their programs to attract the right students.

Keywords: choice criteria, higher education, hospitality and tourism studies, information sources

Procedia PDF Downloads 269
6340 Merging Sequence Diagrams Based Slicing

Authors: Bouras Zine Eddine, Talai Abdelouaheb

Abstract:

The need to merge software artifacts is inherent to modern software development. Distributing development over several teams and breaking tasks into smaller, more manageable pieces are effective means of dealing with this kind of complexity. In each case, the separately developed artifacts need to be assembled as efficiently as possible into a consistent whole in which the parts still function as described. Moreover, the earlier changes are introduced into the life cycle, the easier they are for designers to manage. Interaction-based specifications such as UML sequence diagrams have been found effective in this regard. As a result, sequence diagrams can be used not only for capturing system behaviors but also for merging changes in order to create a new version. The objective of this paper is to suggest a new approach to the problem of software merging at the level of sequence diagrams, using the concept of dependence analysis, which formally captures all mappings and differences between the elements of sequence diagrams and serves as the key concept for creating a new version of a sequence diagram.
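
As a loose illustration only, the sketch below reduces two versions of a sequence diagram to their ordered message lists and reports which interactions were kept, added, or removed; this generic sequence comparison is a stand-in for the dependence analysis proposed in the paper, not its slicing algorithm.

```python
# Loose illustration: two versions of a sequence diagram reduced to ordered
# message lists and compared, exposing shared, added, and removed interactions.
from difflib import SequenceMatcher

base = ["user->ui:login()", "ui->auth:check()", "auth->db:query()", "auth-->ui:ok"]
variant = ["user->ui:login()", "ui->auth:check()", "auth->cache:lookup()",
           "auth->db:query()", "auth-->ui:ok", "ui-->user:welcome()"]

for op, i1, i2, j1, j2 in SequenceMatcher(None, base, variant).get_opcodes():
    if op == "equal":
        continue                                      # unchanged interactions
    print(f"{op:7s} base{base[i1:i2]} -> variant{variant[j1:j2]}")
```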

Keywords: system behaviors, sequence diagram merging, dependence analysis, sequence diagram slicing

Procedia PDF Downloads 340
6339 Reliability-Based Design of an Earth Slope Taking into Account Unsaturated Soil Properties

Authors: A. T. Siacara, A. T. Beck, M. M. Futai

Abstract:

This paper shows how accurately and efficiently reliability analyses of geotechnical installations can be performed by directly coupling geotechnical software with a reliability solver. An earth slope is used as the study object. The limit equilibrium method of Morgenstern-Price is used to calculate factors of safety and find the critical slip surface. The deterministic software packages Seep/W and Slope/W are coupled with the StRAnD reliability software. Reliability indexes of critical probabilistic surfaces are evaluated by the first-order reliability method (FORM). By means of sensitivity analysis, the effective cohesion (c') is found to be the most relevant uncertain geotechnical parameter for slope equilibrium. The slope was tested using different geometries, taking into account unsaturated soil properties. Finally, the critical slip surface, identified in terms of the minimum factor of safety, is shown not to be the critical surface in terms of reliability index.
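
A generic, hedged illustration of the reliability calculation is sketched below: a simple infinite-slope factor-of-safety expression stands in for the coupled Seep/W-Slope/W model, and Monte Carlo sampling replaces FORM; all parameter values are assumptions for illustration and are unrelated to the slope analysed in the study.

```python
# Generic illustration: failure probability and reliability index beta = -Phi^{-1}(pf)
# for a toy infinite-slope factor of safety with random c' and phi'.  The real study
# used Morgenstern-Price surfaces and FORM via StRAnD, not this Monte Carlo toy model.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 200_000
gamma, H, slope = 18.0, 10.0, np.radians(30)                 # kN/m3, m, slope angle

c = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n)      # effective cohesion c' (kPa)
phi = np.radians(rng.normal(loc=30.0, scale=3.0, size=n))    # friction angle phi' (rad)

# Infinite-slope factor of safety (dry case): cohesive term plus frictional term.
fs = (c / (gamma * H * np.sin(slope) * np.cos(slope))
      + np.tan(phi) / np.tan(slope))

pf = np.mean(fs < 1.0)                        # probability of failure
beta = -norm.ppf(pf) if pf > 0 else np.inf    # reliability index
print(f"P(FS < 1) = {pf:.4%},  beta = {beta:.2f}")
```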

Keywords: slope, unsaturated, reliability, safety, seepage

Procedia PDF Downloads 149
6338 Using Wiki for Enhancing the Knowledge Transfer to Newcomers: An Experience Report

Authors: Hualter Oliveira Barbosa, Raquel Feitosa do Vale Cunha, Erika Muniz dos Santos, Fernanda Belmira Souza, Fabio Sousa, Luis Henrique Pascareli, Franciney de Oliveira Lima, Ana Cláudia Reis da Silva, Christiane Moreira de Almeida

Abstract:

Software development is an intrinsically human-based, knowledge-intensive activity. Due to globalization, software development has become a complex challenge, and we usually face barriers related to knowledge management, team building, and costly testing processes, especially in distributed settings. For this reason, several approaches have been proposed to minimize the barriers caused by geographic distance. In this paper, we present how we used experimental studies to improve our knowledge management process using a Wiki system. According to the results, it was possible to identify the learning preferences of our software project leaders, to organize and improve the learning experience offered by our Wiki, and to facilitate collaboration by newcomers, who enrich the Wiki with new content.

Keywords: mobile product, knowledge transfer, knowledge management process, wiki, GSD

Procedia PDF Downloads 175
6337 Optimising Apparel Digital Production in Industrial Clusters

Authors: Minji Seo

Abstract:

Fashion stakeholders are becoming increasingly aware of technological innovation in manufacturing. In 2020, the COVID-19 pandemic caused transformations in working patterns, such as working remotely rather than commuting. To enable smooth remote working, 3D fashion design software is being adopted as the latest trend in design and production. The majority of fashion designers, however, are still resistant to this change. Previous studies on 3D fashion design software solely highlighted the beneficial and detrimental factors of adopting design innovations; they lacked research on the relationship between resistance factors and the adoption of innovation, and they fell short of exploring the perspectives of the users of these innovations. This paper aims to investigate the key drivers of and barriers to employing 3D fashion design software, as well as to explore the challenges faced by designers. It also touches on the governmental support for digital manufacturing in Seoul, South Korea, and London, the United Kingdom. By conceptualising local support, this study aims to provide a new path for industrial clusters to optimise digital apparel manufacturing. The study uses a mixture of quantitative and qualitative approaches. Initially, it reports a survey of 350 fashion designers on the innovation resistance factors of 3D fashion design software and the effectiveness of local support. In-depth interviews with 30 participants provide a better understanding of designers' views of the benefits and obstacles of employing 3D fashion design software. The key findings of this research are the main barriers to employing 3D fashion design software in fashion production. The cultural characteristics and interview results are used to interpret the survey results. The quantitative data identify the dominant resistance factors to adopting design innovations: the cost of the software and its complexity, lack of customer interest in innovation, lack of qualified personnel, and lack of knowledge. The main difference between Seoul and London lies in attitudes towards government support. Compared to fashion designers in the UK, South Korean designers emphasise that government support is highly relevant to employing 3D fashion design software. The top-down versus bottom-up policy implementation approach distinguishes the perception of government support: in contrast to the top-down approach in South Korea, British fashion designers, accustomed to bottom-up approaches, are reluctant to receive government support. The findings of this research will contribute to generating solutions for local government and to optimising the use of 3D fashion design software in fashion industrial clusters.

Keywords: digital apparel production, industrial clusters, innovation resistance, 3D fashion design software, manufacturing, innovation, technology, digital manufacturing, innovative fashion design process

Procedia PDF Downloads 102
6336 3D Classification Optimization of Low-Density Airborne Light Detection and Ranging Point Cloud by Parameters Selection

Authors: Baha Eddine Aissou, Aichouche Belhadj Aissa

Abstract:

Light detection and ranging (LiDAR) is an active remote sensing technology used for several applications. Airborne LiDAR is becoming an important technology for the acquisition of highly accurate, dense point clouds. The classification of an airborne laser scanning (ALS) point cloud is a very important task that still remains a real challenge for many scientists. The support vector machine (SVM) is one of the most used kernel-based statistical learning algorithms. SVM is a non-parametric method, and it is recommended in cases where the data distribution cannot be well modeled by a standard parametric probability density function. Using a kernel, it performs a robust non-linear classification of samples. Often, the data are not linearly separable; SVMs are able to map the data into a higher-dimensional space where they become linearly separable, while performing all the computations in the original space. This is one of the main reasons SVMs are well suited to high-dimensional classification problems. Only a few training samples, called support vectors, are required. SVM has also shown its potential to cope with uncertainty in data caused by noise and fluctuation, and it is computationally efficient compared to several other methods. Such properties are particularly suited to remote sensing classification problems and explain their recent adoption. In this poster, an SVM classification of ALS LiDAR data is proposed. Firstly, connected component analysis is applied to cluster the point cloud. Secondly, the resulting clusters are fed into the SVM classifier. The radial basis function (RBF) kernel is used due to the small number of parameters (C and γ) that need to be chosen, which decreases the computation time. In order to optimize the classification rates, parameter selection is explored: it consists of finding the parameters (C and γ) that lead to the best overall accuracy using grid search and 5-fold cross-validation. The exploited LiDAR point cloud is provided by the German Society for Photogrammetry, Remote Sensing, and Geoinformation. The ALS data used are characterized by a low density (4-6 points/m²) and cover an urban area located in residential parts of the city of Vaihingen in southern Germany. The ground class and three other classes belonging to roof superstructures are considered, i.e., a total of 4 classes. The training and test sets were selected randomly several times. The obtained results demonstrate that parameter selection can orient the search within a restricted interval of (C and γ) that can be further explored, but does not systematically lead to the optimal rates. The SVM classifier with tuned hyper-parameters is compared with the classifiers most used in the literature for LiDAR data: random forest, AdaBoost, and decision trees. The comparison showed the superiority of the SVM classifier with parameter selection for LiDAR data over the other classifiers.
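
The parameter-selection step described above can be sketched with scikit-learn: an RBF-kernel SVM tuned over a (C, γ) grid with 5-fold cross-validation; synthetic features stand in for the per-cluster attributes of the Vaihingen point cloud, so the sketch shows the tuning procedure rather than the study's actual pipeline.

```python
# Sketch of RBF-SVM parameter selection: grid search over (C, gamma) with
# 5-fold cross-validation on synthetic 4-class data standing in for LiDAR clusters.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=1200, n_features=8, n_informative=6,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

param_grid = {"svc__C": [0.1, 1, 10, 100],
              "svc__gamma": [0.001, 0.01, 0.1, 1]}

search = GridSearchCV(make_pipeline(StandardScaler(), SVC(kernel="rbf")),
                      param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print("best (C, gamma):", search.best_params_,
      " CV accuracy:", round(search.best_score_, 3))
```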

Keywords: classification, airborne LiDAR, parameters selection, support vector machine

Procedia PDF Downloads 147
6335 Methods Used to Perform Requirements Elicitation for Healthcare Software Development

Authors: Tang Jiacheng, Fang Tianyu, Liu Yicen, Xiang Xingzhou

Abstract:

The proportion of healthcare services is increasing throughout the globe. The convergence of mobile technology is driving new business opportunities, innovations in healthcare service delivery, and the promise of a better life for different populations with various healthcare needs. One of the most important phases in combining healthcare and mobile applications is to elicit requirements correctly. In this paper, four articles from different research directions, covering four healthcare topics, were analyzed in detail and summarized. We identified the underlying problems in the guidance for developing mobile applications that provide healthcare services for older adults, women in menopause, and patients with COVID-19. These case studies cover several elicitation methods: survey, prototyping, focus group interview and questionnaire. The effectiveness of these methods was analyzed along with their advantages and limitations, which is beneficial for adapting the elicitation methods to future software development processes.

Keywords: healthcare, software requirement elicitation, mobile applications, prototyping, focus group interview

Procedia PDF Downloads 148
6334 Surgical Hip Dislocation of Femoroacetabular Impingement: Survivorship and Functional Outcomes at 10 Years

Authors: L. Hoade, O. O. Onafowokan, K. Anderson, G. E. Bartlett, E. D. Fern, M. R. Norton, R. G. Middleton

Abstract:

Aims: Femoroacetabular impingement (FAI) was first recognised as a potential driver of hip pain at the turn of the last millennium. While there is an increasing trend towards surgical management of FAI by arthroscopic means, open surgical hip dislocation and debridement (SHD) remains the gold standard of care in terms of reported outcome measures (1). Long-term functional and survivorship outcomes of SHD as a treatment for FAI are yet to be sufficiently reported in the literature; this study sets out to help address this imbalance. Methods: We undertook a retrospective review of our institutional database for all patients who underwent SHD for FAI between January 2003 and December 2008. A total of 223 patients (241 hips) were identified and underwent a ten-year review with standardised radiographs and a patient-reported outcome measures questionnaire. The primary outcome measure of interest was survivorship, defined as progression to total hip arthroplasty (THA). Negative predictive factors were analysed. Secondary outcome measures of interest were survivorship to further (non-arthroplasty) surgery, functional outcomes as reflected by patient-reported outcome measure (PROM) scores, and whether a learning curve could be identified. Results: The final cohort consisted of 131 females and 110 males, with a mean age of 34 years. There was an overall native hip joint survival rate of 85.4% at ten years. Those who underwent a THA were significantly older at initial surgery and had radiographic evidence of preoperative osteoarthritis and of pre- and post-operative acetabular undercoverage. In those who had not progressed to THA, the average Non-Arthritic Hip Score and Oxford Hip Score at ten-year follow-up were 72.3% and 36/48, respectively, and 84% still deemed their surgery worthwhile. A learning curve was found to exist that was predicated on case selection rather than surgical technique. Conclusion: This is only the second study to evaluate the long-term outcomes (beyond ten years) of SHD for FAI and the first outside the originating centre. Our results suggest that, with correct patient selection, this remains an operation with worthwhile outcomes at ten years. How the results of open surgery compare to those of arthroscopy remains to be answered. While these results precede the advent of collision software modelling tools, these data help set a benchmark for future comparison of other techniques' effectiveness at the ten-year mark.

Keywords: femoroacetabular impingement, hip pain, surgical hip dislocation, hip debridement

Procedia PDF Downloads 84
6333 A Survey on the Status of Test Automation

Authors: Andrei Contan, Richard Torkar

Abstract:

Aim: The process of test automation and its practices in industry have to be better understood, both for the industry itself and for the research community. Method: We conducted a quantitative industry survey by asking IT professionals to answer questions related to the area of test automation. Results: Test automation needs and practices vary greatly between organizations at different stages of the software development life cycle. Conclusions: Most of the findings are general test automation challenges, specific to small- to medium-sized companies developing software applications in the web, desktop or mobile domains.

Keywords: survey, testing, test automation, status of test automation

Procedia PDF Downloads 658
6332 Implementation of Big Data Concepts Led by the Business Pressures

Authors: Snezana Savoska, Blagoj Ristevski, Violeta Manevska, Zlatko Savoski, Ilija Jolevski

Abstract:

Big data is widely accepted by pharmaceutical companies as a result of business demands created through legal pressure. Pharmaceutical companies face many legal as well as standards-related demands and have to adapt their procedures to the legislation. To cope with these demands, they have to standardize the usage of current information technology and use the latest software tools. This paper highlights some important aspects of the experience with big data project implementation in a Macedonian pharmaceutical company. These projects improved its business processes with the help of new software tools selected to comply with legal and business demands. The company uses IT as a strategic tool to obtain a competitive advantage on the market and to reengineer its processes towards the new Internet economy and its quality demands. The company is required to manage vast amounts of structured as well as unstructured data. For these reasons, it implements projects for emerging and appropriate software tools that can deal with the big data concepts accepted in the company.

Keywords: big data, unstructured data, SAP ERP, documentum

Procedia PDF Downloads 271
6331 CFD Analysis of an Aft Sweep Wing in Subsonic Flow and Making Analogy with Roskam Methods

Authors: Ehsan Sakhaei, Ali Taherabadi

Abstract:

In this study, an aft-swept wing with specific characteristics was analysed with a CFD method in the Fluent software. In this analysis, the wing's aerodynamic coefficients were calculated at different rake angles, and the slope of the wing lift curve with respect to rake angle was obtained. The wing section was selected from the NACA 6-series airfoils. The sweep angle of the wing is 15 degrees, the aspect ratio 8, and the taper ratio 0.4. The wing was designed and modeled in the CATIA software, meshed in the Gambit software, and its three-dimensional analysis was carried out in Fluent. The CFD method used here was based on a pressure-based algorithm. The SIMPLE technique was used for solving the Navier-Stokes equations, and the Spalart-Allmaras model was utilized to simulate the three-dimensional wing in air. The Roskam method is one of the most common and widely used methods for determining aerodynamic parameters in the field of airplane design. In this study, besides the CFD analysis, an advanced aircraft analysis tool was used to calculate the aerodynamic coefficients with the Roskam method. The CFD results were compared with the data acquired from the Roskam method, and the validity of the relations was evaluated. The results and comparison showed that, in the linear region of the lift curve, there is only a minor difference between the aerodynamic parameters acquired from CFD and the relations presented by Roskam.
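
In the spirit of the handbook comparison above, the sketch below evaluates the classical DATCOM/Polhamus estimate of a swept wing's lift-curve slope for the stated geometry (aspect ratio 8, 15° sweep); treating the 15° sweep as the half-chord sweep, taking a 2π section slope, and choosing Mach 0.2 are assumptions for illustration, and this handbook formula is only analogous to, not identical with, Roskam's procedure.

```python
# Hedged handbook-style estimate: DATCOM/Polhamus lift-curve slope of a swept wing.
import math

def wing_lift_curve_slope(AR, sweep_half_chord_deg, mach=0.2, cl_alpha_2d=2 * math.pi):
    """Per-radian wing lift-curve slope from the DATCOM/Polhamus formula."""
    beta = math.sqrt(1.0 - mach ** 2)                 # Prandtl-Glauert factor
    kappa = cl_alpha_2d / (2.0 * math.pi)             # section-slope ratio
    lam = math.radians(sweep_half_chord_deg)
    term = (AR * beta / kappa) ** 2 * (1.0 + (math.tan(lam) / beta) ** 2)
    return 2.0 * math.pi * AR / (2.0 + math.sqrt(term + 4.0))

cl_alpha = wing_lift_curve_slope(AR=8.0, sweep_half_chord_deg=15.0)
print(f"CL_alpha ~ {cl_alpha:.3f} per rad ({math.radians(1) * cl_alpha:.4f} per deg)")
```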

Keywords: aft sweep wing, CFD method, fluent, Roskam, Spalart-Allmaras model

Procedia PDF Downloads 504
6330 Flow Conservation Framework for Monitoring Software Defined Networks

Authors: Jesús Antonio Puente Fernández, Luis Javier Garcia Villalba

Abstract:

New trends in video streaming, such as series or films, create a high demand for network resources. This results in a huge problem within traditional IP networks due to the rigidity of their architecture. In this context, Software Defined Networking (SDN) is a new network architecture concept that aims to be more flexible and that simplifies network management with respect to existing networks. These aspects are possible thanks to the separation of the control plane (controller) and the data plane (switches). Taking advantage of this separated control, it is easy to deploy a monitoring tool independent of device vendors, whereas existing tools depend on the installation of specialized and expensive hardware. In this paper, we propose a framework that optimizes traffic monitoring in SDN networks by decreasing the number of monitoring queries, which improves network traffic and also reduces the overhead. The experiments performed (with and without the optimization), using video streaming delivery between two hosts, demonstrate the feasibility of our monitoring proposal.
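
One way a monitoring framework can reduce the number of queries is to adapt the per-flow polling interval to how much the counters actually change; the sketch below shows that idea with a hypothetical `fetch_flow_bytes` stand-in for a controller's flow-statistics request, and the adaptation rule is an assumption, not the framework proposed in the paper.

```python
# Hedged sketch: adapt the polling interval per flow based on byte-counter deltas.
# `fetch_flow_bytes` is a hypothetical placeholder for a real controller stats query.
import random

def fetch_flow_bytes(flow_id):
    """Hypothetical stats query; replace with the controller's real API."""
    return random.randint(0, 10_000)

def next_interval(prev_bytes, curr_bytes, interval, lo=1.0, hi=30.0):
    """Back off when traffic is stable, poll faster when it changes a lot."""
    delta = abs(curr_bytes - prev_bytes)
    if delta < 1_000:
        return min(interval * 2.0, hi)     # quiet flow: query less often
    if delta > 5_000:
        return max(interval / 2.0, lo)     # bursty flow: query more often
    return interval

interval, last = 5.0, fetch_flow_bytes("flow-1")
for _ in range(5):
    current = fetch_flow_bytes("flow-1")
    interval = next_interval(last, current, interval)
    last = current
    print(f"next poll in {interval:.1f}s")
```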

Keywords: optimization, monitoring, software defined networking, statistics, query

Procedia PDF Downloads 331
6329 True Single SKU Script: Applying the Automated Test to Set Software Properties in a Global Software Development Environment

Authors: Antonio Brigido, Maria Meireles, Francisco Barros, Gaspar Mota, Fernanda Terra, Lidia Melo, Marcelo Reis, Camilo Souza

Abstract:

As the globalization of the software process advances, companies are increasingly committed to improving software development technologies across multiple locations. On the other hand, working with teams distributed in different locations also raises new challenges. In this sense, automated processes can help to improve the quality of process execution. Therefore, this work presents the development of a tool called TSS Script that automates the sample preparation process for carrier requirements validation tests. The objective of the work is to obtain significant gains in execution time and to reduce errors in scenario preparation. To estimate the gains over time, the executions performed in an automated and in a manual way were timed. In addition, a questionnaire-based survey was developed to discover new requirements and improvements to include in this automated support. The results show an average gain of 46.67% of the total hours worked on sample preparation. The use of the tool avoids human errors, and for this reason, it adds greater quality and speed to the process. Another relevant factor is that the tester can perform other activities in parallel with sample preparation.

Keywords: Android, GSD, automated testing tool, mobile products

Procedia PDF Downloads 317
6328 Simulation of Government Management Model to Increase Financial Productivity System Using Govpilot

Authors: Arezou Javadi

Abstract:

The use of algorithmic models dependent on software calculations, and the simulation of new government management assays with the help of specialized software, have recently increased the productivity and efficiency of the government management system. This has caused the management approach to change from the old break-and-fix model, which has low efficiency and limited usefulness, to a capable management model with higher efficiency called the partnership-with-resident model. Using the GovPilot software, the relationship between the people in a system and the government was examined. The method of two-tailed interaction was the outsourcing of a goal in a system, formed in the order of goals, qualified executive people, an optimal executive model, and finally a summary of additional activities at different statistical levels. The results showed that the participation of people in a financial implementation system with a statistical potential of P ≥ 5% caused a significant increase in investment and initial capital in the government system, with maximum project implementation in a smart government.

Keywords: machine learning, financial income, statistical potential, govpilot

Procedia PDF Downloads 88
6326 PMEL Marker Identification of Dark and Light Feather Colours in Local Canary

Authors: Mudawamah Mudawamah, Muhammad Z. Fadli, Gatot Ciptadi, Aulanni’am

Abstract:

Canary breeders, spread throughout the regions of Indonesia, serve low- and middle-income society and provide a source of income for it. An interesting phenomenon of the canary market is that feather colour is one of the factors determining the price. The contribution of this research is a molecular database to serve as a basis for selection and mating for Indonesian canary breeders. The research method was experimental, with the genome obtained from canary blood isolation. The genome was PCR-amplified with the PMEL marker, followed by sequencing. Twenty-four canaries with light and dark coloured feathers were used. Research data were analysed using the BioEdit and Network 4.6.0.0 software. The results showed that all samples amplified with the PMEL gene, giving a 500 bp fragment. At base position 40, cytosine (C) was found in the light-colour canaries, while thymine (T) was obtained in the dark-colour canaries at the same position. The sequencing results gave 286-415 bp fragments and 10 haplotypes. The conclusion is that the PMEL gene (a gene of white pigment) is likely to be usable for detecting molecular genetic variation between dark- and light-colour feathered canaries.
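
The reported marker can be illustrated with a tiny sketch that classifies aligned PMEL sequences by the base at alignment position 40 (C observed in light-coloured birds, T in dark ones); the sequences below are synthetic placeholders that only demonstrate the indexing, not real canary data.

```python
# Minimal illustration: classify aligned PMEL sequences by the base at position 40.
# The sequences are synthetic placeholders constructed to show the indexing logic.
samples = {
    "canary_light_01": "ACGT" * 9 + "ACGC" + "ACGT",   # C at alignment position 40
    "canary_dark_01":  "ACGT" * 11,                    # T at alignment position 40
}

for name, seq in samples.items():
    base = seq[39]                      # position 40 (1-based) -> index 39
    colour = {"C": "light", "T": "dark"}.get(base, "unknown")
    print(f"{name}: base {base} at position 40 -> predicted {colour} feather colour")
```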

Keywords: canary, haplotype, PMEL, sequence

Procedia PDF Downloads 237
6325 Drape Simulation by Commercial Software and Subjective Assessment of Virtual Drape

Authors: Evrim Buyukaslan, Simona Jevsnik, Fatma Kalaoglu

Abstract:

The simulation of fabrics is more difficult than any other simulation due to the complex mechanics of fabrics. Most virtual garment simulation software uses a mass-spring model and incorporates fabric mechanics into the simulation models. The accuracy and fidelity of this virtual garment simulation software remain a question mark. Drape is a subjective phenomenon, and the evaluation of drape has been studied since the 1950s; fabric and garment simulation, on the other hand, is relatively new. Understanding the drape perception of subjects when looking at fabric simulations is critical as virtual try-on becomes more of an issue with growing online apparel sales. The projected future of online apparel retailing is that users may view their avatars and try the garment on their avatars in the virtual environment. It is a well-known fact that users will not be eager to accept this innovative technology unless it is realistic enough. Therefore, it is essential to understand what users see when fabrics are displayed in a virtual environment: are they able to distinguish the differences between various fabrics? The purpose of this study is to investigate human perception when looking at a virtual fabric and to determine the most visually noticeable drape parameter. To this end, five different fabrics were mechanically tested, and their drape simulations were generated by commercial garment simulation software (Optitex®). The simulation images were processed by image analysis software to calculate drape parameters, namely the drape coefficient, node severity, and peak angles. A questionnaire was developed to evaluate drape properties subjectively in a virtual environment. The drape simulation images were shown to 27 subjects, who were asked to rank the samples according to the questioned drape property. The answers were compared to the calculated drape parameters. The results show that subjects are quite sensitive to changes in the drape coefficient, while they are not very sensitive to changes in node dimensions and node distributions.
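
Of the drape parameters listed above, the drape coefficient has a standard definition based on projected areas (Cusick-style), sketched below; the 18 cm support disk and 30 cm specimen follow the conventional drape-tester geometry and are assumed here for illustration rather than taken from the study.

```python
# Sketch of the standard (Cusick-style) drape coefficient computed from projected
# areas, the kind of quantity extracted from the simulation images described above.
import math

def drape_coefficient(draped_area, support_radius=0.09, specimen_radius=0.15):
    """DC (%) = (Ad - A1) / (A2 - A1) * 100, with A1 the support-disk area and
    A2 the undraped specimen area; a lower DC means a more fluid drape."""
    a1 = math.pi * support_radius ** 2
    a2 = math.pi * specimen_radius ** 2
    return 100.0 * (draped_area - a1) / (a2 - a1)

# Example: a projected draped area of 0.045 m^2 measured from a simulation image.
print(f"DC = {drape_coefficient(0.045):.1f} %")
```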

Keywords: drape simulation, drape evaluation, fabric mechanics, virtual fabric

Procedia PDF Downloads 338
6324 Performance and Emission Prediction in a Biodiesel Engine Fuelled with Honge Methyl Ester Using RBF Neural Networks

Authors: Shiva Kumar, G. S. Vijay, Srinivas Pai P., Shrinivasa Rao B. R.

Abstract:

In the present study, RBF neural networks were used for predicting the performance and emission parameters of a biodiesel engine. Engine experiments were carried out in a four-stroke diesel engine using blends of diesel and Honge methyl ester as the fuel. Performance parameters such as BTE, BSEC and Tech, along with the emissions from the engine, were measured. These experimental results were used for ANN modeling. RBF centre initialization was done by random selection and by using clustering techniques. The network was trained using fixed and varying widths for the RBF units. It was observed that the RBF results were in good agreement with the experimental results. Networks trained using the clustering technique gave better results than those using random selection of centres, in terms of reduced MRE and increased prediction accuracy. The average MRE for the performance parameters was 3.25% with a prediction accuracy of 98%, and for the emissions it was 10.4% with a prediction accuracy of 80%.
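
A hedged sketch of the modelling approach follows: an RBF network whose centres are chosen by clustering (k-means here) and whose output weights are fitted by least squares; the synthetic inputs and outputs merely stand in for the measured engine parameters (blend ratio, load, BTE, emissions and so on), and the fixed width is an illustrative choice.

```python
# Hedged sketch: RBF network with cluster-initialized centres and least-squares
# output weights, trained on synthetic data standing in for the engine measurements.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))                     # e.g. blend fraction, load
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.standard_normal(200)

def rbf_design(X, centers, width):
    """Gaussian RBF design matrix: one column per centre."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

centers = KMeans(n_clusters=15, n_init=10, random_state=0).fit(X).cluster_centers_
width = 0.2                                              # fixed RBF width (illustrative)
Phi = rbf_design(X, centers, width)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)              # output-layer weights

y_hat = rbf_design(X, centers, width) @ w
print("training RMSE:", round(float(np.sqrt(np.mean((y - y_hat) ** 2))), 4))
```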

Keywords: radial basis function networks, emissions, performance parameters, fuzzy c means

Procedia PDF Downloads 558