Search results for: software domains
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5443

4933 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. 
The expected contributions of this research include advancements in mathematical and statistical sciences, improved data analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.

Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making

Procedia PDF Downloads 49
4932 Cognitive Decline in People Living with HIV in India and Correlation with Neurometabolites Using 3T Magnetic Resonance Spectroscopy (MRS): A Cross-Sectional Study

Authors: Kartik Gupta, Virendra Kumar, Sanjeev Sinha, N. Jagannathan

Abstract:

Introduction: A significant number of patients with human immunodeficiency virus (HIV) infection show a neurocognitive decline (NCD) ranging from minor cognitive impairment to severe dementia. The possible causes of NCD in HIV-infected patients include brain injury by HIV before cART, neurotoxic viral proteins, and metabolic abnormalities. In the present study, we compared the level of NCD in asymptomatic HIV-infected patients with changes in brain metabolites measured using magnetic resonance spectroscopy (MRS). Methods: 43 HIV-positive patients (30 males and 13 females) attending the ART centre of the hospital and HIV-seronegative healthy subjects were recruited for the study. All the participants completed MRI and MRS examinations, detailed clinical assessments, and a battery of neuropsychological tests. All the MR investigations were carried out on a 3.0T MRI scanner (Ingenia/Achieva, Philips, Netherlands). The MRI examination protocol included the acquisition of T2-weighted images in axial, coronal and sagittal planes, and T1-weighted, FLAIR, and DWI images in the axial plane. Patients who showed any apparent lesion on MRI were excluded from the study. T2-weighted images in three orthogonal planes were used to localize the voxel in left frontal lobe white matter (FWM) and left basal ganglia (BG) for single-voxel MRS. Single-voxel MRS spectra were acquired with a point-resolved spectroscopy (PRESS) localization pulse sequence at an echo time (TE) of 35 ms and a repetition time (TR) of 2000 ms with 64 or 128 scans. Automated preprocessing and determination of absolute concentrations of metabolites were performed using LCModel with the water-scaling method, and the Cramér-Rao lower bounds for all metabolites analyzed in the study were below 15%. Levels of total N-acetyl aspartate (tNAA), total choline (tCho), glutamate + glutamine (Glx), and total creatine (tCr) were measured. Cognition was tested using a battery of tests validated for the Indian population.
The cognitive domains tested were memory, attention-information processing, abstraction-executive function, and simple and complex perceptual motor skills. Z-scores normalized according to age, sex, and education standards were used to calculate dysfunction in these individual domains. NCD was defined as dysfunction with a Z-score ≤ 2 in at least two domains. One-way ANOVA was used to compare the difference in brain metabolites between the patients and healthy subjects. Results: NCD was found in 23 (53%) patients. There was no significant difference in age, CD4 count, or viral load between the two groups. Maximum impairment was found in the domains of memory and simple motor skills, i.e., 19/43 (44%). The prevalence of deficits in attention-information processing, complex perceptual motor skills, and abstraction-executive function was 37%, 35%, and 33%, respectively. Subjects with NCD had a higher level of glutamate in the frontal region (8.03 ± 2.30 vs. 10.26 ± 5.24, p-value 0.001). Conclusion: Among newly diagnosed, ART-naïve retroviral disease patients from India, cognitive decline was found in 53% of patients using tests validated for this population. Those with neurocognitive decline had a significantly higher level of glutamate in the left frontal region. There was no significant difference in age, CD4 count, or viral load at initiation of ART between the two groups.
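The NCD criterion above can be written as a short rule. This is an illustrative sketch, not the study's code; the negative cutoff is an assumption (the abstract writes "Z-score ≤ 2", but impairment is conventionally expressed as a negative Z):

```python
# Illustrative sketch: classify neurocognitive decline (NCD) from normalized
# domain Z-scores using the "dysfunction in at least two domains" rule.
DYSFUNCTION_CUTOFF = -2.0  # assumed sign convention: lower Z = worse performance

def has_ncd(domain_z_scores, cutoff=DYSFUNCTION_CUTOFF, min_domains=2):
    """True if at least `min_domains` domains fall at or below the cutoff."""
    impaired = [d for d, z in domain_z_scores.items() if z <= cutoff]
    return len(impaired) >= min_domains

patient = {"memory": -2.3, "attention": -1.1, "motor_simple": -2.0,
           "motor_complex": -0.4, "executive": -1.8}
print(has_ncd(patient))  # memory and simple motor skills impaired → True
```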

Keywords: HIV, neurocognitive decline, neurometabolites, magnetic resonance spectroscopy

Procedia PDF Downloads 174
4931 Information Extraction for Short-Answer Question for the University of the Cordilleras

Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo

Abstract:

Checking short-answer questions and essays, whether on paper or in electronic form, is a tiring and tedious task for teachers. Evaluating a student's output requires knowledge across a wide array of domains, and scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing-assessment software, but these have received negative feedback from teachers and students alike because of unreliable scoring, lack of feedback, and other shortcomings. This study aims to create an application that can check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) in which a chunk of text (technically known as unstructured text) is broken down to gather the necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system shall extract keywords or phrases from an individual's answer and match them against a corpus of words defined by the instructor, which shall be the basis for evaluating the answer. The proposed system shall also enable the teacher to provide feedback and re-evaluate the student's output for writing elements that the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, the teacher's time spent checking and evaluating student output shall be lessened, making the teacher more productive.
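A minimal sketch of the weighted keyword matching the abstract describes might look as follows (the function names, rubric, and weighting scheme are illustrative, not the authors' implementation):

```python
import re

def score_answer(answer, keyword_weights):
    """Score a short answer against instructor-defined keywords/phrases.

    keyword_weights: dict mapping a keyword or phrase to its weight.
    Returns (earned, possible) so the teacher can re-evaluate or add feedback.
    """
    text = answer.lower()
    earned = 0.0
    for phrase, weight in keyword_weights.items():
        # \b boundaries give whole-word matches: "extract" will not match "extraction"
        if re.search(r"\b" + re.escape(phrase.lower()) + r"\b", text):
            earned += weight
    return earned, sum(keyword_weights.values())

rubric = {"natural language processing": 2.0, "unstructured text": 1.5, "keywords": 1.0}
ans = "NLP breaks unstructured text into keywords for analysis."
print(score_answer(ans, rubric))  # → (2.5, 4.5)
```

A real checker would also need stemming or synonym handling so that "NLP" can credit "natural language processing".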

Keywords: information extraction, short-answer question, natural language processing, application

Procedia PDF Downloads 408
4930 An Analysis of OpenSim Graphical User Interface Effectiveness

Authors: Sina Saadati

Abstract:

OpenSim is well-known software in biomechanical studies. It implements worthy algorithms for the modeling and simulation of human motion. In this research, we analyze the OpenSim application from the computer science perspective. It is important that every application have a user-friendly interface: an effective user interface can decrease the time, cost, and energy needed to learn how to use a program. In this paper, we survey the user interface of OpenSim as an important factor of the software. Finally, we infer that there are many challenges to be addressed in the development of OpenSim.

Keywords: biomechanics, computer engineering, graphical user interface, modeling and simulation, interface effectiveness

Procedia PDF Downloads 62
4929 Transcriptional Evidence for the Involvement of MyD88 in Flagellin Recognition: Genomic Identification of Rock Bream MyD88 and Comparative Analysis

Authors: N. Umasuthan, S. D. N. K. Bathige, W. S. Thulasitha, I. Whang, J. Lee

Abstract:

MyD88 is an evolutionarily conserved host-expressed adaptor protein that is essential for proper TLR/IL1R immune-response signaling. A previously identified complete cDNA (1626 bp) of OfMyD88 comprised an ORF of 867 bp encoding a protein of 288 amino acids (32.9 kDa). The gDNA (3761 bp) of OfMyD88 revealed a quinquepartite genome organization composed of 5 exons (with sizes of 310, 132, 178, 92 and 155 bp) separated by 4 introns. All the introns displayed splice signals consistent with the consensus GT/AG rule. A bipartite domain structure with two domains, namely a death domain (residues 24-103) coded by the 1st exon and a TIR domain (residues 151-288) coded by the last 3 exons, was identified through in silico analysis. Moreover, homology modeling of these two domains revealed a similar quaternary folding nature between the human and rock bream homologs. A comprehensive comparison of vertebrate MyD88 genes showed that they possess a 5-exonic structure. In this structure, the last three exons were strongly conserved, suggesting that a rigid structure has been maintained during vertebrate evolution. A cluster of TATA box-like sequences was found 0.25 kb upstream of the cDNA starting position. In addition, the putative 5'-flanking region of OfMyD88 was predicted to have TFBS implicated in TLR signaling, including copies of NFB1, APRF/STAT3, Sp1, IRF1 and 2, and Stat1/2. Using the qPCR technique, mRNA expression was detected ubiquitously, including in liver and blood. Furthermore, significantly up-regulated transcriptional expression of OfMyD88 was detected in head kidney (12-24 h; >2-fold), spleen (6 h; 1.5-fold), liver (3 h; 1.9-fold) and intestine (24 h; ~2-fold) post-flagellin (Fla) challenge. These data suggest a crucial role for MyD88 in the antibacterial immunity of teleosts.

Keywords: MyD88, innate immunity, flagellin, genomic analysis

Procedia PDF Downloads 391
4928 Comparison of Analytical Method and Software for Analysis of Flat Slab Subjected to Various Parametric Loadings

Authors: Hema V. Vanar, R. K. Soni, N. D. Shah

Abstract:

Slabs supported directly on columns without beams are known as flat slabs. Flat slabs are highly versatile elements widely used in construction, providing minimum depth, fast construction, and flexible column grids. The main objective of this thesis is the comparison of an analytical method and software for the analysis of a flat slab subjected to various parametric loadings. The study presents the analysis of a flat slab performed under different types of gravity loading.

Keywords: flat slab, parametric load, analysis, software

Procedia PDF Downloads 457
4927 A Comparative Analysis on QRS Peak Detection Using BIOPAC and MATLAB Software

Authors: Chandra Mukherjee

Abstract:

The present paper represents work done in the field of ECG signal analysis using the MATLAB 7.1 platform. An accurate and simple ECG feature-extraction algorithm is presented, and the developed algorithm is validated using BIOPAC software. To detect the QRS peak, the ECG signal is processed through the following stages: first derivative, second derivative, and squaring of the second derivative. The efficiency of the developed algorithm is tested on ECG samples from different databases and on real-time ECG signals acquired using the BIOPAC system. First, a lead-wise threshold value is specified; samples above that value are marked, and points in the original signal where these marked samples show a change of slope are spotted as R-peaks. On the left and right sides of the R-peak, changes of slope are identified as the Q and S peaks, respectively. The built-in detection algorithm of the BIOPAC software is then run on the same samples, and both outputs are compared. ECG baseline-modulation correction is done after detecting the characteristic points. The efficiency of the algorithm is tested using validation parameters such as sensitivity and positive predictivity, and satisfactory values of these parameters were obtained.
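The derivative-and-squaring stages can be sketched as follows (a simplified illustration with a toy signal and an invented threshold, not the validated MATLAB implementation):

```python
def qrs_feature_signal(ecg):
    """First derivative -> second derivative -> squaring, as in the pipeline above."""
    d1 = [ecg[i + 1] - ecg[i] for i in range(len(ecg) - 1)]
    d2 = [d1[i + 1] - d1[i] for i in range(len(d1) - 1)]
    return [x * x for x in d2]

def mark_candidates(feature, threshold):
    """Indices whose squared second derivative exceeds a lead-specific threshold."""
    return [i for i, v in enumerate(feature) if v > threshold]

# Toy beat: flat baseline with one sharp R-like spike.
ecg = [0, 0, 0, 1, 5, 1, 0, 0, 0]
feat = qrs_feature_signal(ecg)
print(mark_candidates(feat, threshold=9))  # → [3]
```

On real signals the marked candidates would then be traced back to slope reversals in the original ECG to place the R, Q, and S points.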

Keywords: first derivative, variable threshold, slope reversal, baseline modulation correction

Procedia PDF Downloads 382
4926 Developing a Toolkit of Undergraduate Nursing Students' Desirable Characteristics (TNDC): An Application of Item Response Theory

Authors: Parinyaporn Thanaboonpuang, Siridej Sujiva, Shotiga Pasiphul

Abstract:

The higher education reform integrated nursing programmes into the higher education system. Learning outcomes represent one of the essential building blocks for transparency within higher-education systems and qualifications. The purpose of this study is to develop a toolkit for assessing undergraduate nursing students' desirable characteristics under the Thai Qualifications Framework for Higher Education and to test the psychometric properties of this instrument. The toolkit builds on a computer multimedia test. Three skills are examined: cognitive skill; responsibility and interpersonal skill; and information technology skill. The study was conducted in four phases. In Phase 1, a measurement model and the computer multimedia test were developed. In Phase 2, two rounds of focus groups were conducted to determine the content validity of the measurement model and the toolkit. In Phase 3, data were collected using multistage random sampling; 1,156 senior undergraduate nursing students were recruited to test the psychometric properties. In Phase 4, data analysis was conducted using descriptive statistics, item analysis, inter-rater reliability, exploratory factor analysis, and confirmatory factor analysis. The resulting TNDC consists of 74 items across four domains: cognitive skill, interpersonal skill, responsibility, and information technology skill. The values of Cronbach's alpha for the four domains were .781, .807, .831, and .865, respectively. The final model in confirmatory factor analysis fit the empirical data quite well. The TNDC was found to be appropriate, both theoretically and statistically. Based on these results, it is recommended that the toolkit be used in future studies for nursing programmes in Thailand.
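Cronbach's alpha, reported above for each domain, can be computed from item scores as in this sketch (the data are invented):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: list of items, each a list of respondent scores.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = len(item_scores)
    item_vars = sum(pvariance(item) for item in item_scores)
    totals = [sum(vals) for vals in zip(*item_scores)]  # each respondent's total
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Three perfectly consistent items -> alpha = 1.0
items = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
print(round(cronbach_alpha(items), 3))  # → 1.0
```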

Keywords: toolkit, nursing students' desirable characteristics, Thai qualifications framework

Procedia PDF Downloads 508
4925 Impact of Experiential Learning on Executive Function, Language Development, and Quality of Life for Adults with Intellectual and Developmental Disabilities (IDD)

Authors: Mary Deyo, Zmara Harrison

Abstract:

This study reports the outcomes of an 8-week experiential learning program for 6 adults with Intellectual and Developmental Disabilities (IDD) at a day habilitation program. The intervention foci for this program include executive function; language learning in the domains of expressive, receptive, and pragmatic language; and quality of life. Interprofessional collaboration aimed at supporting adults with IDD in reaching person-centered, functional goals across skill domains is critical. This study is a significant addition to the speech-language pathology literature in that it examines a therapy method that potentially meets this need while targeting domains within the speech-language pathology scope of practice. Communication therapy was provided during highly valued and meaningful hands-on learning experiences, referred to as the Garden Club, which incorporated all aspects of planting and caring for a garden as well as related journaling, sensory, cooking, art, and technology-based activities. Direct care staff and an undergraduate research assistant were trained by the SLP to be impactful language guides during their interactions with participants in the Garden Club. The SLP also provided direct therapy and modeling during Garden Club. Research methods used in this study included a mixed-methods design: a literature review, a quasi-experimental implementation of communication therapy in the context of experiential learning activities, quality-of-life participant surveys, quantitative pre- and post-intervention data collection with linear mixed-model analysis, and qualitative data collection with content analysis and coding for themes. Outcomes indicated overall positive changes in expressive vocabulary, following multi-step directions, sequencing, problem-solving, planning, skills for building and maintaining meaningful social relationships, and participant perception of the Garden Project's impact on their own quality of life.
Implementation of this project also highlighted supports and barriers that must be taken into consideration when planning similar projects. Overall findings support the use of experiential learning projects in day habilitation programs for adults with IDD, as well as additional research to deepen understanding of best practices, supports, and barriers for implementation of experiential learning with this population. This research provides an important contribution to research in the fields of speech-language pathology and other professions serving adults with IDD by describing an interprofessional experiential learning program with positive outcomes for executive function, language learning, and quality of life.

Keywords: experiential learning, adults, intellectual and developmental disabilities, expressive language, receptive language, pragmatic language, executive function, communication therapy, day habilitation, interprofessionalism, quality of life

Procedia PDF Downloads 88
4924 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods

Authors: Devatha Kalyan Kumar, R. Poovarasan

Abstract:

In this paper, we take certain important factors and health parameters of diabetes patients, especially children diabetic from birth (pediatric congenital), and use the three methods above to assess the importance of each attribute in the dataset, thereby determining the attribute most strongly correlated with diabetes among young patients. We use cost-optimization, control-chart, and Spearman methodologies for the real-time application of finding data efficiency in this diabetes dataset. The Spearman methodology is a correlation methodology also used in the software development process to identify the complexity between the various modules of the software; identifying complexity is important because higher complexity means a higher chance of risk occurring in the software. With the control chart, the mean, variance, and standard deviation of the data are calculated. With the cost-optimization model, we optimize the variables. Hence, we choose the Spearman, control-chart, and cost-optimization methods to assess data efficiency in diabetes datasets.
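Spearman's correlation, the rank-based method used here, is Pearson's correlation computed on ranks; a minimal stdlib sketch with invented attribute values:

```python
def ranks(values):
    """1-based ranks; ties receive the average rank of their group."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average 1-based rank of the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def spearman(x, y):
    return pearson(ranks(x), ranks(y))

hba1c = [5.1, 6.4, 7.2, 8.0, 9.5]    # hypothetical attribute values
glucose = [90, 120, 140, 155, 210]
print(spearman(hba1c, glucose))       # strictly monotone relation → 1.0
```

Because only ranks matter, the statistic captures any monotonic relationship, which is why the keywords mention "monotonic function" and "ranking samples".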

Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric

Procedia PDF Downloads 236
4923 Processes and Application of Casting Simulation and Its Software’s

Authors: Surinder Pal, Ajay Gupta, Johny Khajuria

Abstract:

Casting simulation helps visualize mold filling and casting solidification; predict related defects like cold shuts, shrinkage porosity, and hard spots; and optimize the casting design to achieve the desired quality with high yield. Flow and solidification of molten metals are, however, very complex phenomena that are difficult to simulate correctly by conventional computational techniques, especially when the part geometry is intricate and the required inputs (like thermo-physical properties and heat transfer coefficients) are not available. Simulation software is based on the process of modeling a real phenomenon with a set of mathematical formulas. It is, essentially, a program that allows the user to observe an operation through simulation without actually performing that operation. Simulation software is used widely to design equipment so that the final product will be as close to design specs as possible without expensive in-process modification. Simulation software with real-time response is often used in gaming, but it also has important industrial applications. When the penalty for improper operation is costly, as for airplane pilots, nuclear power plant operators, or chemical plant operators, a mockup of the actual control panel is connected to a real-time simulation of the physical response, giving valuable training experience without fear of a disastrous outcome. Each casting simulation package has its own requirements and strengths; Magma, for example, is best suited to crack simulation. The latest-generation software AutoCAST, developed at IIT Bombay, provides a host of functions to support method engineers, including part thickness visualization, core design, multi-cavity mold design with common gating and feeding, application of various feed aids (feeder sleeves, chills, padding, etc.), simulation of mold filling and casting solidification, automatic optimization of feeders and gating driven by the desired quality level, and what-if cost analysis.
IIT Bombay has developed a set of applications for the foundry industry to improve casting yield and quality. Casting simulation is a fast and efficient advanced process tool, the result of more than 20 years of collaboration with major industrial partners and academic institutions around the world. In this paper, the process of casting simulation is studied.

Keywords: casting simulation software, simulation techniques, casting simulation, processes

Procedia PDF Downloads 452
4922 Electrical Equivalent Analysis of Micro Cantilever Beams for Sensing Applications

Authors: B. G. Sheeparamatti, J. S. Kadadevarmath

Abstract:

Microcantilevers are basic MEMS devices that can be used as sensors and actuators, and electronics can easily be built into them. The detection principle of microcantilever sensors is based on measuring the change in cantilever deflection or the change in its resonance frequency. The objective of this work is to explore the analogies between the mechanical and electrical equivalents of microcantilever beams. Normally, scientists and engineers working in MEMS use expensive software like CoventorWare, IntelliSuite, ANSYS/Multiphysics, etc. This paper indicates the need to develop the electrical equivalent of a MEMS structure; with it, one can gain better insight into the important parameters of the structure and their interrelations. In this work, considering the mechanical model of the microcantilever, the equivalent electrical circuit is drawn and, using the force-voltage analogy, analyzed with circuit simulation software. By doing so, one gains access to a powerful set of intellectual tools developed for understanding electrical circuits. The analysis is then repeated using ANSYS/Multiphysics, software based on the finite element method (FEM). It is observed that the mechanical- and electrical-domain results for a rectangular microcantilever are in agreement with each other.
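Under the force-voltage analogy, the cantilever's lumped equation m·x″ + c·x′ + k·x = F maps onto a series RLC circuit L·q″ + R·q′ + (1/C)·q = V, so one and the same solver reproduces both domains. A sketch with invented parameters (not the paper's model):

```python
def simulate(m, c, k, force, dt=1e-7, steps=20000):
    """Semi-implicit Euler integration of m*x'' + c*x' + k*x = force (step input)."""
    x = v = 0.0
    for _ in range(steps):
        a = (force - c * v - k * x) / m   # Newton's second law / Kirchhoff voltage law
        v += a * dt
        x += v * dt
    return x

# Lumped cantilever with invented parameters (near-critical damping).
x_mech = simulate(m=1e-9, c=2e-4, k=10.0, force=1e-3)   # mass, damping, stiffness, force
# Force-voltage analogy: L <- m, R <- c, 1/C <- k, V <- F; charge q plays the role of x.
q_elec = simulate(m=1e-9, c=2e-4, k=10.0, force=1e-3)   # L, R, 1/C, V
assert x_mech == q_elec   # identical equations give identical trajectories
print(x_mech)             # settles near the static deflection F/k = 1e-4
```

The point of the analogy is exactly this term-by-term identity: once the mapping is made, every circuit-analysis tool applies to the mechanical structure unchanged.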

Keywords: electrical equivalent circuit analogy, FEM analysis, micro cantilevers, micro sensors

Procedia PDF Downloads 373
4921 Normalized P-Laplacian: From Stochastic Game to Image Processing

Authors: Abderrahim Elmoataz

Abstract:

More and more contemporary applications involve data in the form of functions defined on irregular and topologically complicated domains (images, meshes, point clouds, networks, etc.). Such data are not organized as familiar digital signals and images sampled on regular lattices. However, they can be conveniently represented as graphs where each vertex represents measured data and each edge represents a relationship (connectivity, or certain affinities or interactions) between two vertices. Processing and analyzing these types of data is a major challenge for both the image and machine learning communities. Hence, it is very important to transfer to graphs and networks many of the mathematical tools that were initially developed on usual Euclidean spaces and proven to be efficient for many inverse problems and applications dealing with usual image and signal domains. Historically, the main tools for the study of graphs or networks come from combinatorics and graph theory. In recent years there has been an increasing interest in the investigation of one of the major mathematical tools for signal and image analysis: Partial Differential Equations (PDEs) and variational methods on graphs. The normalized p-Laplacian operator has recently been introduced to model a stochastic game called the tug-of-war game with noise. Part of the interest in this class of operators arises from the fact that it includes, as particular cases, the infinity Laplacian, the mean curvature operator, and the traditional Laplacian operator, which have been extensively used to model and solve problems in image processing. The purpose of this paper is to introduce and study a new class of normalized p-Laplacians on graphs. The introduction is based on the extension of the p-harmonious functions, introduced as discrete approximations for both the infinity-Laplacian and p-Laplacian equations. Finally, we propose to use these operators as a framework for solving many inverse problems in image processing.
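One common form of the p-harmonious iteration can be sketched on a small graph (an illustrative formulation, not necessarily the paper's exact operator): each interior vertex is repeatedly replaced by a convex combination of the min/max and the mean of its neighbors, with alpha = 0 recovering ordinary Laplacian averaging and alpha = 1 the infinity Laplacian.

```python
def p_harmonious(neighbors, boundary, alpha=0.5, iters=500):
    """Value iteration for u(x) = (alpha/2)(max + min) + (1-alpha) * mean over neighbors."""
    u = {v: boundary.get(v, 0.0) for v in neighbors}
    for _ in range(iters):
        for v in neighbors:
            if v in boundary:
                continue  # boundary (seed) values stay fixed
            vals = [u[w] for w in neighbors[v]]
            u[v] = alpha / 2 * (max(vals) + min(vals)) + (1 - alpha) * sum(vals) / len(vals)
    return u

# Path graph 0-1-2-3 with boundary values u(0)=0, u(3)=1.
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
u = p_harmonious(nbrs, boundary={0: 0.0, 3: 1.0})
print(round(u[1], 3), round(u[2], 3))  # → 0.333 0.667
```

In image processing the boundary set would be user-supplied seeds (e.g., segmentation labels) and the graph a pixel or patch adjacency structure.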

Keywords: normalized p-laplacian, image processing, stochastic game, inverse problems

Procedia PDF Downloads 488
4920 Calculation of Organ Dose for Adult and Pediatric Patients Undergoing Computed Tomography Examinations: A Software Comparison

Authors: Aya Al Masri, Naima Oubenali, Safoin Aktaou, Thibault Julien, Malorie Martin, Fouad Maaloul

Abstract:

Introduction: The increased number of performed 'Computed Tomography (CT)' examinations raises public concerns regarding the associated stochastic risk to patients. In its Publication 102, the 'International Commission on Radiological Protection (ICRP)' emphasized the importance of managing patient dose, particularly from repeated or multiple examinations. We developed a Dose Archiving and Communication System that gives multiple dose indexes (organ dose, effective dose, and skin-dose mapping) for patients undergoing radiological imaging exams. The aim of this study is to compare the organ dose values given by our software for patients undergoing CT exams with those of another software package named 'VirtualDose'. Materials and methods: Our software uses Monte Carlo simulations to calculate organ doses for patients undergoing computed tomography examinations. The general calculation principle consists of simulating: (1) the scanner machine with all its technical specifications and associated irradiation cases (kVp, field collimation, mAs, pitch, ...); (2) detailed geometric and compositional information for dozens of well-identified organs of computational hybrid phantoms that contain the necessary anatomical data. The mass as well as the elemental composition of the tissues and organs that constitute our phantoms correspond to the recommendations of the international organizations (namely the ICRP and the ICRU). Their body dimensions correspond to reference data developed in the United States. Simulated data were verified by clinical measurements. To perform the comparison, 270 adult patients and 150 pediatric patients were used, whose data correspond to exams carried out in French hospital centers. The comparison dataset of adult patients includes adult males and females for three different scanner machines and three different acquisition protocols (Head, Chest, and Chest-Abdomen-Pelvis).
The comparison sample of pediatric patients includes the exams of thirty patients for each of the following age groups: newborn, 1-2 years, 3-7 years, 8-12 years, and 13-16 years. The comparison for pediatric patients was performed on the 'Head' protocol. The percentage dose difference was calculated for organs receiving a significant dose according to the acquisition protocol (80% of the maximal dose). Results: Adult patients: for organs that are completely covered by the scan range, the maximum percentage dose difference between the two software packages is 27%. However, there are three organs situated at the edges of the scan range that show a slightly higher dose difference. Pediatric patients: the percentage dose difference between the two software packages does not exceed 30%. These dose differences may be due to the use of two different generations of hybrid phantoms by the two software packages. Conclusion: This study shows that our software provides reliable dosimetric information for patients undergoing computed tomography exams.
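The comparison step (percent dose difference, restricted to organs receiving a significant dose, taken here as 80% of the maximal organ dose) can be sketched as follows; the organ names and dose values are invented:

```python
def compare_doses(doses_a, doses_b, significance=0.8):
    """Percent difference per organ, keeping only organs whose dose in
    software A reaches `significance` * (maximal organ dose in A)."""
    cutoff = significance * max(doses_a.values())
    out = {}
    for organ, da in doses_a.items():
        if da >= cutoff:
            db = doses_b[organ]
            out[organ] = abs(da - db) / da * 100
    return out

a = {"brain": 40.0, "eye_lens": 38.0, "thyroid": 5.0}   # mGy, illustrative
b = {"brain": 35.0, "eye_lens": 41.8, "thyroid": 6.0}
print(compare_doses(a, b))  # thyroid excluded (below 80% of the maximal dose)
```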

Keywords: adult and pediatric patients, computed tomography, organ dose calculation, software comparison

Procedia PDF Downloads 134
4919 Explore and Reduce the Performance Gap between Building Modelling Simulations and the Real World: Case Study

Authors: B. Salehi, D. Andrews, I. Chaer, A. Gillich, A. Chalk, D. Bush

Abstract:

With the rapid increase of energy consumption in buildings in recent years, especially with the rise in population and growing economies, the importance of energy savings in buildings becomes more critical. One of the key factors in ensuring energy consumption is controlled and kept to a minimum is to utilise building energy modelling at the very early stages of design, so building modelling and simulation is a growing discipline. During the design phase of construction, modelling software can be used to estimate a building's projected energy consumption, as well as building performance. The growth in the use of building modelling software packages opens the door for improvements in the design and also in the modelling itself, by introducing novel methods such as building information modelling-based software packages which promote conventional building energy modelling into the digital building design process. To understand the most effective implementation tools, research projects undertaken should include elements of real-world experiments and not just rely on theoretical and simulated approaches. Upon review of the related studies undertaken, it is evident that they are mostly based on modelling and simulation, which can be due to various reasons, such as the more expensive and time-consuming nature of studies based on real-time data. Taking into account the recent rise of building energy modelling software packages and the increasing number of studies utilising these methods, the accuracy and reliability of these packages has become even more critical. The Energy Performance Gap refers to the discrepancy between the predicted energy savings and the realised actual savings, especially after buildings implement energy-efficient technologies. There are many different software packages available, which are either free or have commercial versions.
In this study, IES VE (Integrated Environmental Solutions Virtual Environment) is used, as it is a commonly used Building Energy Modelling and Simulation software package in the UK. This paper describes a study that compares real-time results with those from a virtual model to illustrate this gap. The subject of the study is a north-west (345°) facing, naturally ventilated conservatory within a domestic building in London, which is monitored during summer to capture real-time data. These results are then compared to the virtual results from IES VE. In this project, the effect of the wrong position of blinds on overheating is studied, providing new evidence of the Performance Gap. Furthermore, the challenges of entering solar shading products as inputs in IES VE are considered.

Keywords: building energy modelling and simulation, integrated environmental solutions virtual environment, IES VE, performance gap, real time data, solar shading products

Procedia PDF Downloads 111
4918 Multisignature Schemes for Reinforcing Trust in Cloud Software-As-A-Service Services

Authors: Mustapha Hedabou, Ali Azougaghe, Ahmed Bentajer, Hicham Boukhris, Mourad Eddiwani, Zakaria Igarramen

Abstract:

Software-as-a-service (SaaS) is emerging as a dominant approach to delivering software. It encompasses a range of business and technical opportunities, issues, and challenges. Trust in cloud services, regarding the security and privacy of the delivered data, is the most critical issue with the SaaS model. In this paper, we survey the security concerns related to the SaaS model, and we propose the design of a trusted SaaS model that gives users more confidence in SaaS services by leveraging trust in a neutral source-code certifying authority. The proposed design is based on the use of a multisignature mechanism for signing the source code of the application service. In our model, the cloud provider acts as a root of trust by ensuring the integrity of the application service while it is running on its platform. The proposed design prevents insiders from tampering with the application service both before and after it is launched on the cloud provider's platform.
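As an illustration of the idea only (not the authors' actual cryptographic scheme), a minimal Python sketch shows code being accepted only when every required party has signed its digest; HMAC with per-party secrets stands in for real public-key multisignatures, and all names are hypothetical:

```python
import hashlib
import hmac

# Hypothetical signers: the developer, the certifying authority, the cloud provider.
# HMAC stands in for a real multisignature scheme purely for illustration.
KEYS = {"developer": b"dev-secret", "authority": b"ca-secret", "provider": b"cloud-secret"}

def sign(party: str, source_code: bytes) -> str:
    digest = hashlib.sha256(source_code).digest()
    return hmac.new(KEYS[party], digest, hashlib.sha256).hexdigest()

def verify_all(source_code: bytes, signatures: dict) -> bool:
    # The service is trusted only if every required party's signature checks out.
    return all(
        hmac.compare_digest(signatures.get(p, ""), sign(p, source_code))
        for p in KEYS
    )

code = b"def service(): return 'hello'"
sigs = {p: sign(p, code) for p in KEYS}
assert verify_all(code, sigs)             # intact code: accepted
assert not verify_all(code + b"#", sigs)  # tampered code: rejected
```

Any single insider modifying the source after signing invalidates the joint check, which is the property the trusted SaaS design relies on.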

Keywords: cloud computing, SaaS Platform, TPM, trustiness, code source certification, multi-signature schemes

Procedia PDF Downloads 250
4917 A Framework for Blockchain Vulnerability Detection and Cybersecurity Education

Authors: Hongmei Chi

Abstract:

The blockchain has become a necessity for many different societal industries and ordinary lives, including cryptocurrency technology, supply chain, health care, public safety, education, etc. Therefore, training our future blockchain developers to recognize blockchain programming vulnerabilities, and educating I.T. students in cyber security, is in high demand. In this work, we propose a framework including learning modules and hands-on labs to guide future I.T. professionals towards developing secure blockchain programming habits and mitigating source code vulnerabilities at the early stages of the software development lifecycle, following the concept of the Secure Software Development Life Cycle (SSDLC). In this research, our goal is to make blockchain programmers and I.T. students aware of the vulnerabilities of blockchains. In summary, we develop a framework that will (1) improve students' skills and awareness of blockchain source code vulnerabilities, detection tools, and mitigation techniques, (2) integrate concepts of blockchain vulnerabilities for I.T. students, and (3) improve future I.T. workers’ ability to master the concepts of blockchain attacks.
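To give a flavour of the kind of hands-on lab such a framework envisions (a toy illustration, not a tool from the paper), a few lines of Python can flag a classic reentrancy-prone pattern in Solidity source, where an external call precedes the state update:

```python
# Toy static check: flag code where an external call (.call{value: ...})
# appears before the sender's balance is zeroed -- the classic reentrancy
# smell. Real static analysis tools used in such labs are far more thorough.
def flag_reentrancy(solidity_src: str) -> bool:
    call = solidity_src.find(".call{value:")
    update = solidity_src.find("balances[msg.sender] = 0")
    return call != -1 and (update == -1 or call < update)

vulnerable = """
function withdraw() public {
    (bool ok, ) = msg.sender.call{value: balances[msg.sender]}("");
    balances[msg.sender] = 0;
}
"""
safer = """
function withdraw() public {
    uint amount = balances[msg.sender];
    balances[msg.sender] = 0;
    (bool ok, ) = msg.sender.call{value: amount}("");
}
"""
assert flag_reentrancy(vulnerable)
assert not flag_reentrancy(safer)
```

A lab exercise built on this pattern lets students see how reordering two statements removes the vulnerability, before moving on to industrial-strength analyzers.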

Keywords: software vulnerability detection, hands-on lab, static analysis tools, vulnerabilities, blockchain, active learning

Procedia PDF Downloads 60
4916 Module Valuations and Quasi-Valuations

Authors: Shai Sarussi

Abstract:

Suppose F is a field with valuation v and valuation domain Oᵥ, and R is an Oᵥ-algebra. It is known that there exists a filter quasi-valuation on R; the existence of a quasi-valuation yields several important connections between Oᵥ and R, in particular with respect to their prime spectra. In this paper, the notion of a module valuation is introduced. It is shown that any torsion-free module over Oᵥ has an induced module valuation. Moreover, several results connecting the filter quasi-valuation and module valuations are presented.

Keywords: valuations, quasi-valuations, prime spectrum, algebras over valuation domains

Procedia PDF Downloads 200
4915 A Robust Implementation of a Building Resources Access Rights Management System

Authors: Eugen Neagoe, Victor Balanica

Abstract:

A Smart Building Controller (SBC) is server software that offers secured access to a pool of building-specific resources, executes monitoring tasks, and performs automatic administration of a building, thus optimizing the operating cost and maximizing comfort. This paper brings into discussion the issues that arise with the secure exploitation of the SBC-administered resources and proposes a technical solution that implements a robust secure access system based on roles, individual rights, and privileges (special rights).
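A minimal sketch of such a roles-plus-individual-rights check (an assumption about the design, not the authors' implementation; all role and action names are hypothetical) might look like:

```python
# Effective permissions = role rights + individual rights + privileges
# (special rights), mirroring the access model described above.
ROLE_RIGHTS = {
    "tenant": {"read_temperature"},
    "manager": {"read_temperature", "set_temperature", "view_energy_report"},
}

class User:
    def __init__(self, name, role, individual_rights=(), privileges=()):
        self.name = name
        self.role = role
        self.individual_rights = set(individual_rights)
        self.privileges = set(privileges)

    def can(self, action: str) -> bool:
        # An action is allowed if any of the three permission sources grants it.
        return action in (ROLE_RIGHTS.get(self.role, set())
                          | self.individual_rights
                          | self.privileges)

alice = User("alice", "tenant", privileges={"override_hvac"})
assert alice.can("read_temperature")   # granted via role
assert alice.can("override_hvac")      # granted via special privilege
assert not alice.can("set_temperature")
```

Separating the three permission sources keeps routine role administration independent of the per-user exceptions that building operators inevitably need.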

Keywords: smart building controller, software security, access rights, access authorization

Procedia PDF Downloads 419
4914 The Challenges of Scaling Agile to Large-Scale Distributed Development: An Overview of the Agile Factory Model

Authors: Bernard Doherty, Andrew Jelfs, Aveek Dasgupta, Patrick Holden

Abstract:

Many companies have moved to agile and hybrid agile methodologies, where portions of the Software Development Life-cycle (SDLC) and Software Test Life-cycle (STLC) can be time-boxed in order to enhance delivery speed and quality and to increase flexibility to changes in software requirements. Despite the widespread proliferation of agile practices, implementation often fails due to a lack of adequate project management support, decreased motivation, or fear of increased interaction. Consequently, few organizations adopt agile processes effectively, and tailoring is often required to integrate the agile methodology into large-scale environments. This paper provides an overview of the challenges in implementing an innovative, large-scale, tailored realization of the agile methodology termed the Agile Factory Model (AFM), with the aim of comparing and contrasting issues of specific importance to organizations undertaking large-scale agile development. The conclusions demonstrate that agile practices can be effectively translated to a globally distributed development environment.

Keywords: agile, agile factory model, globally distributed development, large-scale agile

Procedia PDF Downloads 270
4913 Developing Rice Disease Analysis System on Mobile via iOS Operating System

Authors: Rujijan Vichivanives, Kittiya Poonsilp, Canasanan Wanavijit

Abstract:

This research aims to create mobile tools to analyze rice disease quickly and easily. The principles of object-oriented software engineering and the Objective-C language were used as the software development methodology, and the decision tree technique was used as the analysis method. Application users can select the features of a rice disease, or the color that appears on the rice leaves, and view the recognition analysis results on the iOS mobile screen. After completing the software development, unit testing and integration testing were used to check program validity. In addition, three plant experts and forty farmers assessed the usability and benefit of this system. Overall user satisfaction was found to be at a good level (57%). The plant experts suggested adding various disease symptoms to the database for more precise analysis results. For further research, it is suggested that an image processing system be developed as a tool that allows users to search for and analyze rice diseases more conveniently and with greater accuracy.
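The decision-tree style of analysis described above can be sketched in a few lines of Python (an illustrative toy with made-up symptoms and diagnoses, not the app's actual classification rules):

```python
# Toy decision tree keyed on leaf symptoms; the diseases and branching rules
# here are illustrative only, not those used in the paper's application.
def classify(leaf_color: str, lesion_shape: str) -> str:
    if leaf_color == "yellow-orange":
        return "possible bacterial leaf blight"
    if leaf_color == "brown":
        if lesion_shape == "diamond":
            return "possible rice blast"
        return "possible brown spot"
    return "no match; consult an expert"

assert classify("brown", "diamond") == "possible rice blast"
assert classify("green", "round") == "no match; consult an expert"
```

Each question the user answers in the app corresponds to descending one branch of such a tree, which is why the technique suits a simple mobile questionnaire interface.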

Keywords: rice disease, data analysis system, mobile application, iOS operating system

Procedia PDF Downloads 267
4912 Quality of Life among Female Sex Workers of Selected Organization of Pokhara: A Methodological Triangulation

Authors: Sharmila Dahal Paudel

Abstract:

Background: There are around twenty-four thousand to twenty-eight thousand female sex workers (FSWs) in Nepal. FSWs are a group vulnerable to sexually transmitted infections (STIs) and human immunodeficiency virus (HIV) infection, which directly and indirectly reduce the quality of life of such groups. Due to their highly marginalized status, FSWs in Nepal have limited access to information about reproductive health and safe sex practices. The objectives of the study are to assess the quality of life of female sex workers and the factors affecting it. Materials and Methods: A descriptive cross-sectional study with methodological triangulation was conducted among 108 FSWs on the basis of the service records of a selected organization of Pokhara valley. Complete enumerative sampling was used to select the FSWs. A structured interview schedule, the WHOQOL-BREF, and an in-depth questionnaire were used to collect the data. Descriptive and inferential statistics were used to interpret the results. Results: The mean age of participants was 23.44 years, and the mean quality of life score was 174.06, ranging from 56.54 to 370.78. Among the domain scores, the mean score was highest in the social domain (55.89), followed by the physical (45.42), psychological (39.27), and environmental (34.23) domains. Regarding the association of QOL with socio-demographic, occupational, and health-related variables, multiple linear regression suggests that satisfaction with occupation was highly significantly associated with the total QOL score (B = -50.50, SE = 10.46; p < 0.001), and QOL was negatively related to feelings of exploitation and to facing STI problems; that is, those who felt exploited had significantly lower QOL than those who did not. In correlation analysis, all the domains were positively correlated with one another, significant at the 1% level of significance. 
Conclusion: The highest mean score was in the social domain and the lowest in the environmental domain, which suggests that the items included in the environmental domain could not be utilized or that hindrances were present.

Keywords: FSWs, HIV, QOL, WHOQOL-BREF

Procedia PDF Downloads 146
4911 Agile Software Effort Estimation Using Regression Techniques

Authors: Mikiyas Adugna

Abstract:

Effort estimation is among the activities carried out in software development processes. An accurate estimation model leads to project success. Agile effort estimation is a complex task because of the dynamic nature of software development, and researchers are still conducting studies on it to enhance prediction accuracy. For these reasons, we investigated and propose a model based on LASSO and Elastic Net regression to enhance estimation accuracy. The proposed model has four major components: preprocessing, train-test split, training with default parameters, and cross-validation. During the preprocessing phase, the entire dataset is normalized. After normalization, a train-test split is performed on the dataset, with the training set at 80% and the testing set at 20%. Following the train-test split, the two algorithms (Elastic Net and LASSO regression) are trained in two phases. In the first phase, the two algorithms are trained using their default parameters and evaluated on the testing data. In the second phase, the grid search technique (used to tune and select the optimum parameters) is combined with 5-fold cross-validation to obtain the final trained model. Finally, the final trained model is evaluated on the testing set. The experimental work is applied to an agile story point dataset of 21 software projects collected from six firms. The results show that both Elastic Net and LASSO regression outperformed the compared models. Of the two, LASSO regression achieved the better predictive performance, acquiring PRED (8%) and PRED (25%) results of 100.0, an MMRE of 0.0491, an MMER of 0.0551, an MdMRE of 0.0593, an MdMER of 0.063, and an MSE of 0.0007. The result implies that the trained LASSO regression model is the most acceptable and delivers higher estimation performance than existing models in the literature.
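The two-phase pipeline described above can be sketched with scikit-learn; the dataset below is synthetic, since the 21-project story-point data is not public, and the parameter grid is an assumption:

```python
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.preprocessing import MinMaxScaler

# Synthetic stand-in for the story-point dataset (the real data is private).
rng = np.random.default_rng(42)
X = rng.random((200, 4))
y = 5.0 + X @ np.array([3.0, 1.5, 0.0, 0.5]) + rng.normal(0, 0.1, 200)

# Preprocessing: normalize, then the paper's 80/20 train-test split.
X = MinMaxScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Phase 1: default parameters, evaluated on the held-out test set.
for model in (Lasso(), ElasticNet()):
    model.fit(X_tr, y_tr)
    model.score(X_te, y_te)

# Phase 2: grid search with 5-fold cross-validation for the final model.
grid = GridSearchCV(Lasso(), {"alpha": [0.001, 0.01, 0.1, 1.0]}, cv=5)
grid.fit(X_tr, y_tr)

# MMRE: mean magnitude of relative error on the test set.
mmre = float(np.mean(np.abs(grid.predict(X_te) - y_te) / np.abs(y_te)))
```

The same skeleton works for Elastic Net by also searching over its `l1_ratio` parameter.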

Keywords: agile software development, effort estimation, elastic net regression, LASSO

Procedia PDF Downloads 31
4910 A Survey on Requirements and Challenges of Internet Protocol Television Service over Software Defined Networking

Authors: Esmeralda Hysenbelliu

Abstract:

Over the last years, the demand for high-bandwidth services, such as live (IPTV) and on-demand video streaming, has steadily and rapidly increased. It has been predicted that video traffic (IPTV, VoD, and Web TV) will account for more than 90% of the global Internet Protocol traffic crossing the globe in 2016. Consequently, the requirements and challenges that service providers face today in supporting users’ requests for entertainment video across the various IPTV services, through virtualization over Software Defined Networks (SDN), deserve the highest attention. What is required is to deliver optimized live and on-demand services, such as IPTV, with low cost and good quality, while strictly fulfilling the essential requirements of clients and ISPs (Internet Service Providers) at the same time. The aim of this study is to present an overview of the important requirements and challenges of the IPTV service, together with two network trends for solving those challenges through virtualization (SDN and Network Function Virtualization). This paper provides an overview of research published in the last five years.

Keywords: challenges, IPTV service, requirements, software defined networking (SDN)

Procedia PDF Downloads 247
4909 Selecting Graduates for the Interns’ Award by Using Multisource Feedback Process: Does It Work?

Authors: Kathryn Strachan, Sameer Otoom, Amal AL-Gallaf, Ahmed Al Ansari

Abstract:

Introduction: Introducing a reliable method to select graduates for an award in higher education can be challenging, but it is not impossible. Multisource feedback (MSF) is a popular assessment tool that relies on evaluations by different groups of people, including physicians and non-physicians. It is useful for assessing several domains, including professionalism, communication, and collaboration, and may be useful for selecting the best interns to receive a university award. Methods: 16 graduates responded to an invitation to participate in the student award, which was conducted by the Royal College of Surgeons in Ireland – Medical University of Bahrain (RCSI Bahrain) using the MSF process. Five individuals from each of the following categories rated each participant: physicians, nurses, and fellow students. RCSI Bahrain graduates were assessed in the following domains: professionalism, communication, and collaboration. Mean and standard deviation were calculated, and the award was given to the graduate who scored highest among his/her colleagues. Cronbach's coefficient was used to determine the questionnaire's internal consistency and reliability. Factor analysis was conducted to examine construct validity. Results: 16 graduates participated in the RCSI Bahrain interns' award based on the MSF process, giving a 16.5% response rate. The instrument was found to be suitable for factor analysis and showed a three-factor solution representing 79.3% of the total variance. Reliability analysis using Cronbach's α indicated that the full scale of the instrument had high internal consistency (Cronbach's α = 0.98). Conclusion: This study found the MSF process to be reliable and valid for selecting the best graduates for the interns' award. However, the low response rate may suggest that the process is not feasible for allowing the majority of the students to participate in the selection process. 
Further research may be required to support the feasibility of the MSF process in selecting graduates for the university award.
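The internal-consistency statistic reported above can be computed directly from a rater-by-item score matrix; a small NumPy sketch (with made-up ratings, not the study's data) applies the standard formula α = k/(k-1) · (1 - Σ s²ᵢ / s²ₜ):

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for an (n_raters x k_items) score matrix."""
    k = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1)       # per-item variances
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Made-up MSF ratings: 6 raters scoring 3 domains (professionalism,
# communication, collaboration) on a 1-5 scale.
scores = np.array([[4, 5, 4], [5, 5, 5], [3, 4, 3],
                   [4, 4, 4], [2, 3, 2], [5, 4, 5]])
alpha = cronbach_alpha(scores)  # close to 1 when domains move together
```

Values near 1, such as the 0.98 reported in the study, indicate that the domain items measure a common underlying construct.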

Keywords: MSF, RCSI, validity, Bahrain

Procedia PDF Downloads 317
4908 AFM Probe Sensor Designed for Cellular Membrane Components

Authors: Sarmiza Stanca, Wolfgang Fritzsche, Christoph Krafft, Jürgen Popp

Abstract:

Independent of the cell type, a thin layer a few nanometers thick surrounds the cell interior as the cellular membrane. The transport of ions and molecules through the membrane is achieved in a very precise way by pores. Understanding the process of opening and closing the pores due to an electrochemical gradient across the membrane requires knowledge of the pores' constitutive proteins. Recent reports prove that the molecular level of the cellular membrane can be accessed by atomic force microscopy (AFM). This technique also permits an electrochemical study in the immediate vicinity of the tip, so specific molecules can be electrochemically localized in the natural cellular membrane. Our work aims to recognize the protein domains of the pores using an AFM probe as a miniaturized amperometric sensor, and to follow the protein behavior while changing the applied potential. The intensity of the current produced between the surface and the AFM probe is amplified and detected simultaneously with the surface imaging. The AFM probe plays the role of the working electrode, and the substrate, a conductive glass on which the cells are grown, represents the counter electrode. For better control of the electric potential on the probe, a third electrode, an Ag/AgCl wire, is mounted in the circuit as a reference electrode. The working potential is applied between the electrodes with a programmable source, and the current intensity in the circuit is recorded with a multimeter. The applied potential accounts for the overpotential at the electrode surface and the potential drop due to the current flow through the system. The reported method permits a highly resolved electrochemical study of the protein domains on the living cell membrane. The amperometric map identifies areas of different current intensities on the pore, depending on the applied potential. 
The reproducibility of this method is limited by the tip shape, the uncontrollable capacitance that occurs at the apex, and potential local charge separation.

Keywords: AFM, sensor, membrane, pores, proteins

Procedia PDF Downloads 289
4907 Numerical Static and Seismic Evaluation of Pile Group Settlement: A Case Study

Authors: Seyed Abolhassan Naeini, Hamed Yekehdehghan

Abstract:

Shallow foundations cannot be used when the bedding soil is soft. A suitable method for constructing foundations on soft soil is to employ pile groups to transfer the load to the bottom layers. The present research used results from tests carried out in northern Iran (Langarud) and the FLAC3D software to model a pile group for investigating the effects of various parameters on pile cap settlement under static and seismic conditions. According to the results, changes in the strength parameters of the soil, groundwater level, and the length of and distance between the piles affect settlement differently.

Keywords: FLAC3D software, pile group, settlement, soil

Procedia PDF Downloads 100
4906 Cyber Security Enhancement via Software Defined Pseudo-Random Private IP Address Hopping

Authors: Andre Slonopas, Zona Kostic, Warren Thompson

Abstract:

Obfuscation is one of the most useful tools to prevent network compromise. Previous research focused on the obfuscation of the network communications between external-facing edge devices. This work proposes the use of two edge devices, one external-facing and one internal-facing, which communicate via private IPv4 addresses in a software-defined pseudo-random IP hopping scheme. This methodology does not require additional IP addresses and/or resources to implement. Statistical analyses demonstrate that the hopping surface must be at least 1e3 IP addresses in size, with a broad standard deviation, to minimize the possibility of coincidence between monitored and communication IPs. Breaking the hopping algorithm requires a collection of at least 1e6 samples, which for large hopping surfaces will take years to collect. The probability of dropped packets is controlled via memory buffers and the frequency of hops, and can be reduced to levels acceptable for video streaming. This methodology provides an impenetrable layer of security ideal for information systems and supervisory control and data acquisition systems.
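A toy sketch of the shared-seed hopping idea (an illustration of the concept, not the authors' algorithm; the secret, slot length, and hop-space size are all assumptions): both endpoints derive the same private address for each time slot from a shared secret, so the rendezvous address changes without any coordination traffic.

```python
import hashlib
import ipaddress
import time

# Both edge devices share this secret out of band; the slot length trades
# hop frequency against the dropped-packet risk discussed above.
SHARED_SECRET = b"example-secret"
SLOT_SECONDS = 5
HOP_SPACE = 2**20  # well above the 1e3-address minimum from the paper

def current_hop_address(secret: bytes, now: float) -> ipaddress.IPv4Address:
    slot = int(now // SLOT_SECONDS)
    digest = hashlib.sha256(secret + slot.to_bytes(8, "big")).digest()
    offset = int.from_bytes(digest[:4], "big") % HOP_SPACE
    # Hop inside the private 10.0.0.0/8 block.
    return ipaddress.IPv4Address("10.0.0.0") + offset

# Both sides compute the same address for the same slot, no signalling needed.
t = time.time()
assert current_hop_address(SHARED_SECRET, t) == current_hop_address(SHARED_SECRET, t)
```

Because the mapping from slot to address is keyed by the secret, an observer who captures a handful of hops cannot predict the next address without recovering the key.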

Keywords: moving target defense, cybersecurity, network security, hopping randomization, software defined network, network security theory

Procedia PDF Downloads 163
4905 Numerical Simulation of Multiple Arrays Arrangement of Micro Hydro Power Turbines

Authors: M. A. At-Tasneem, N. T. Rao, T. M. Y. S. Tuan Ya, M. S. Idris, M. Ammar

Abstract:

River flow over micro hydro power (MHP) turbines in a multiple-array arrangement is simulated with computational fluid dynamics (CFD) software to obtain the flow characteristics. In this paper, CFD software is used to simulate the water flow over MHP turbines as they are placed in a river. Arranging MHP turbines in multiple arrays can generate a large amount of power. In this study, a river model is created and simulated in CFD software to obtain the water flow characteristics. The process then continues by simulating different types of array arrangements in the river model. An MHP turbine model consists of a turbine outer body with a static propeller blade inside it. Five types of arrangement are used, namely parallel, series, triangular, square, and rhombus, with different spacing sizes. The velocity profiles of each MHP turbine are identified at the mouth of each turbine body. This study aims to identify the arrangement and spacing size that produce the highest power density from the water flow variation.

Keywords: micro hydro power, CFD, arrays arrangement, spacing sizes, velocity profile, power

Procedia PDF Downloads 332
4904 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes

Authors: Ritwik Dutta, Marylin Wolf

Abstract:

This paper describes the trade-offs and the design from scratch of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script makes use of the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and corresponding health data can be fed automatically, via sensor APIs or manually, as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system is implemented to allow trend analysis of selected health metrics over custom time intervals. Available on the GitHub repository system, the project is free to use for academic purposes of learning and experimenting, or practical purposes by building on it.
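The back-end's daily snapshot (each tracked metric compared against its target goal) can be sketched with a plain dict standing in for the Mongo document; the field names here are assumptions, not the project's actual schema:

```python
import json

# Stand-in for one user's document in the Mongo store (JSON format).
profile = json.loads("""{
  "user": "demo",
  "goals": {"steps": 8000, "sleep_hours": 8},
  "today": {"steps": 9120, "sleep_hours": 6.5}
}""")

def daily_snapshot(doc: dict) -> dict:
    # Compare each tracked metric against its target goal, mirroring the
    # dashboard's "daily snapshot of user health metrics against goals".
    return {
        metric: {"value": value,
                 "goal": doc["goals"].get(metric),
                 "met": doc["goals"].get(metric) is not None
                        and value >= doc["goals"][metric]}
        for metric, value in doc["today"].items()
    }

snapshot = daily_snapshot(profile)
assert snapshot["steps"]["met"] is True
assert snapshot["sleep_hours"]["met"] is False
```

In the described system, the same comparison would run over documents fetched via the PyMongo driver; the JSON layout makes adding custom metrics a matter of inserting new key-value pairs.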

Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver

Procedia PDF Downloads 365