Search results for: secure software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5294

4784 The Relationship between Mothers’ Attachment Style, Mindful Parenting and Perception of the Child

Authors: Brigitta Szabo, Miklosi Monika

Abstract:

Background/Aims: In early childhood, the context of development is the caregiver-child relationship. Maternal attachment style plays a major role in the intergenerational transmission of psychopathology. The aim of this study was to explore the relationship between the mothers’ attachment style, mindful parenting, and perception of the child. Method: Data were collected from 144 non-clinical mothers who have a child below the age of 3 years. Mothers completed self-report questionnaires, including the following scales: a demographic questionnaire, the Attachment Style Questionnaire (ASQ), the Interpersonal Mindfulness in Parenting Scale (IMP), and the Mothers’ Object Relations Scale (MORS-SF). K-means cluster analysis was used to identify the mothers’ attachment styles. Mediation analyses were conducted with the Mothers’ Object Relations Scale (MORS-SF) positive emotions and dominance subscales as dependent variables, mothers’ attachment style (ASQ) as the independent variable, and mindful parenting (IMP) as the mediator. Results: Four attachment styles (secure, preoccupied, fearful, dismissing) were identified. The relationship between mothers’ attachment style and mindful parenting was significant (R2 = .51; F(4,139) = 36.60; p < .001). Compared to the secure attachment style as a reference group, both preoccupied and dismissing styles were related to lower levels of mindful parenting; however, this relationship was the strongest in the case of the fearful style. In the mediation analyses, the direct effects of mothers’ attachment style on the perception of the child were not significant (MORS positive emotions: R2 = .29; F(5,138) = 11.22; p < .001; MORS dominance: R2 = .39; F(5,138) = 17.54, p < .001). However, indirect effects through mindful parenting were significant; higher levels of mindful parenting were associated with higher levels of MORS positive emotions and lower levels of MORS dominance. Conclusions: These findings suggest that attachment styles are related to the perception of the child through mindful parenting. Mindfulness-based parenting training might be useful in cases of attachment-related problems to improve the parent-child relationship.
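As a rough illustration of the analysis pipeline described above, the minimal Python sketch below clusters ASQ-style scores with K-means and estimates indirect effects with two regressions (a Baron-and-Kenny-style simplification of the study's mediation analysis); all variable names and data are placeholders, not the study's dataset.

```python
# Hedged sketch: K-means clustering of ASQ subscale scores followed by a simple
# regression-based mediation check. Data and column meanings are hypothetical.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
asq = rng.normal(size=(144, 5))          # ASQ subscale scores (placeholder)
imp = rng.normal(size=144)               # mindful parenting (IMP) total (placeholder)
mors_pos = rng.normal(size=144)          # MORS-SF positive emotions (placeholder)

# Step 1: identify attachment styles via K-means (four clusters).
styles = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(asq)
X = np.eye(4)[styles][:, 1:]             # dummy-code styles, first cluster as reference

# Step 2: mediation via two regressions (simplified).
a_path = LinearRegression().fit(X, imp)                                # style -> mindful parenting
b_path = LinearRegression().fit(np.column_stack([X, imp]), mors_pos)   # style + IMP -> outcome
indirect = a_path.coef_ * b_path.coef_[-1]                             # crude indirect-effect estimates
print("indirect effects per style dummy:", indirect)
```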

Keywords: mindfulness, mindful parenting, attachment, perception

Procedia PDF Downloads 178
4783 An Analysis of OpenSim Graphical User Interface Effectiveness

Authors: Sina Saadati

Abstract:

OpenSim is well-known software in biomechanical studies. Valuable algorithms have been developed in this program and are used for the modeling and simulation of human motion. In this research, we analyze the OpenSim application from the computer science perspective. It is important that every application have a user-friendly interface. An effective user interface can decrease the time, costs, and energy needed to learn how to use a program. In this paper, we survey the user interface of OpenSim as an important aspect of the software. Finally, we infer that there are many challenges to be addressed in the development of OpenSim.

Keywords: biomechanics, computer engineering, graphical user interface, modeling and simulation, interface effectiveness

Procedia PDF Downloads 62
4782 Comparison of Analytical Method and Software for Analysis of Flat Slab Subjected to Various Parametric Loadings

Authors: Hema V. Vanar, R. K. Soni, N. D. Shah

Abstract:

Slabs supported directly on columns without beams are known as flat slabs. Flat slabs are highly versatile elements widely used in construction, providing minimum depth, fast construction, and flexible column grids. The main objective of this thesis is the comparison of an analytical method and software for the analysis of a flat slab subjected to various parametric loadings. The study presents the analysis of a flat slab performed under different types of gravity loading.

Keywords: flat slab, parametric load, analysis, software

Procedia PDF Downloads 457
4781 A Comparative Analysis on QRS Peak Detection Using BIOPAC and MATLAB Software

Authors: Chandra Mukherjee

Abstract:

This paper presents work done in the field of ECG signal analysis on the MATLAB 7.1 platform. An accurate and simple ECG feature extraction algorithm is presented, and the developed algorithm is validated using BIOPAC software. To detect the QRS peak, the ECG signal is processed in the following stages: first derivative, second derivative, and then squaring of the second derivative. The efficiency of the developed algorithm is tested on ECG samples from different databases and on real-time ECG signals acquired using the BIOPAC system. First, a lead-wise threshold value is specified; the samples above that value are marked, and the points in the original signal where these marked samples undergo a change of slope are identified as R-peaks. On the left and right sides of each R-peak, the points where the slope changes are identified as the Q and S peaks, respectively. The built-in detection algorithm of the BIOPAC software is then run on the same samples, and both outputs are compared. ECG baseline modulation correction is done after detecting the characteristic points. The efficiency of the algorithm is tested using validation parameters such as sensitivity and positive predictivity, and satisfactory values of these parameters were obtained.
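The following Python sketch illustrates the derivative-squaring-threshold idea described above; the threshold fraction, the 200 ms refractory period, and the mapping back to original-signal indices are assumptions for illustration, not the paper's BIOPAC-validated settings.

```python
# Minimal sketch of R-peak detection via first/second derivative, squaring,
# thresholding, and slope-reversal checks. Parameters are illustrative only.
import numpy as np

def detect_r_peaks(ecg, fs, threshold_frac=0.6):
    """Return candidate R-peak sample indices from a 1-D ECG signal."""
    d1 = np.diff(ecg)                  # first derivative
    d2 = np.diff(d1)                   # second derivative
    feature = d2 ** 2                  # squaring emphasises steep QRS slopes
    threshold = threshold_frac * feature.max()   # lead-wise threshold (assumed fraction)
    marked = np.flatnonzero(feature > threshold)

    peaks = []
    for i in marked:
        j = min(i + 2, len(ecg) - 2)   # approximate index in the original signal
        # slope reversal (rising then falling) marks a candidate R-peak
        if ecg[j] - ecg[j - 1] > 0 and ecg[j + 1] - ecg[j] < 0:
            if not peaks or j - peaks[-1] > int(0.2 * fs):   # 200 ms refractory period
                peaks.append(j)
    return peaks
```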

Keywords: first derivative, variable threshold, slope reversal, baseline modulation correction

Procedia PDF Downloads 382
4780 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods

Authors: Devatha Kalyan Kumar, R. Poovarasan

Abstract:

In this paper, we take certain important factors and health parameters of diabetes patients, especially children with diabetes by birth (pediatric congenital), and use the three methods above to assess the importance of each attribute in the dataset, thereby determining the attribute most responsible for, and most correlated with, diabetes among young patients. We use the cost optimization, control chart, and Spearman methodologies for the real-time application of finding the data efficiency in this diabetes dataset. The Spearman methodology is a correlation methodology, also used in the software development process to identify the complexity between the various modules of a software system. Identifying the complexity is important because if the complexity is higher, then there is a higher chance of occurrence of risk in the software. With the use of control charts, the mean, variance, and standard deviation of the data are calculated. With the use of the cost optimization model, we optimize the variables. Hence, we choose the Spearman, control chart, and cost optimization methods to assess the data efficiency in diabetes datasets.
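A minimal Python/SciPy sketch of two of the assessments mentioned above, Spearman rank correlation and control-chart statistics (mean, variance, 3-sigma limits), on placeholder readings; the attribute values and outcome flags are invented.

```python
# Sketch: Spearman correlation between one attribute and the outcome, plus
# Shewhart-style control-chart limits for that attribute. Data is placeholder.
import numpy as np
from scipy.stats import spearmanr

glucose = np.array([5.1, 6.8, 7.9, 9.2, 11.4, 6.3, 8.8])   # placeholder readings
outcome = np.array([0,   0,   1,   1,   1,    0,   1  ])    # placeholder diagnosis flag

rho, p_value = spearmanr(glucose, outcome)     # monotonic (rank) correlation
mean = glucose.mean()
var = glucose.var(ddof=1)
sd = glucose.std(ddof=1)
ucl, lcl = mean + 3 * sd, mean - 3 * sd        # control-chart limits

print(f"Spearman rho={rho:.2f} (p={p_value:.3f}); mean={mean:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}")
```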

Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric

Procedia PDF Downloads 236
4779 Processes and Application of Casting Simulation and Its Software’s

Authors: Surinder Pal, Ajay Gupta, Johny Khajuria

Abstract:

Casting simulation helps visualize mold filling and casting solidification; predict related defects like cold shut, shrinkage porosity, and hard spots; and optimize the casting design to achieve the desired quality with high yield. Flow and solidification of molten metals are, however, very complex phenomena that are difficult to simulate correctly by conventional computational techniques, especially when the part geometry is intricate and the required inputs (like thermo-physical properties and heat transfer coefficients) are not available. Simulation software is based on the process of modeling a real phenomenon with a set of mathematical formulas. It is, essentially, a program that allows the user to observe an operation through simulation without actually performing that operation. Simulation software is used widely to design equipment so that the final product will be as close to design specs as possible without expensive in-process modification. Simulation software with real-time response is often used in gaming, but it also has important industrial applications. When the penalty for improper operation is costly, such as for airplane pilots, nuclear power plant operators, or chemical plant operators, a mockup of the actual control panel is connected to a real-time simulation of the physical response, giving valuable training experience without fear of a disastrous outcome. Every casting simulation package has its own strengths and requirements; Magma, for example, is best suited for crack simulation. The latest-generation software AutoCAST, developed at IIT Bombay, provides a host of functions to support method engineers, including part thickness visualization, core design, multi-cavity mold design with common gating and feeding, application of various feed aids (feeder sleeves, chills, padding, etc.), simulation of mold filling and casting solidification, automatic optimization of feeders and gating driven by the desired quality level, and what-if cost analysis. IIT Bombay has developed a set of applications for the foundry industry to improve casting yield and quality. Casting simulation is a fast and efficient process solution and an advanced tool that is the result of more than 20 years of collaboration with major industrial partners and academic institutions around the world. In this paper, the process of casting simulation is studied.

Keywords: casting simulation software, simulation techniques, casting simulation, processes

Procedia PDF Downloads 452
4778 Electrical Equivalent Analysis of Micro Cantilever Beams for Sensing Applications

Authors: B. G. Sheeparamatti, J. S. Kadadevarmath

Abstract:

Microcantilevers are basic MEMS devices that can be used as sensors and actuators, and electronics can easily be built into them. The detection principle of microcantilever sensors is based on the measurement of a change in cantilever deflection or a change in its resonance frequency. The objective of this work is to explore the analogies between the mechanical and electrical equivalents of microcantilever beams. Normally, scientists and engineers working in MEMS use expensive software like CoventorWare, IntelliSuite, ANSYS/Multiphysics, etc. This paper indicates the need to develop an electrical equivalent of the MEMS structure, with which one can gain better insight into the important parameters of the MEMS structure and their interrelations. In this work, considering the mechanical model of the microcantilever, the equivalent electrical circuit is drawn and, using a force-voltage analogy, it is analyzed with circuit simulation software. By doing so, one gains access to a powerful set of intellectual tools that have been developed for understanding electrical circuits. The analysis is then performed using ANSYS/Multiphysics, software based on the finite element method (FEM). It is observed that the mechanical- and electrical-domain results for a rectangular microcantilever are in agreement with each other.
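The force-voltage analogy mentioned above maps effective mass, damping, and compliance (the inverse of stiffness) to series inductance, resistance, and capacitance, so both domains give the same resonance frequency; the sketch below checks this with illustrative parameter values that are not taken from the paper.

```python
# Hedged sketch of the force-voltage analogy: m -> L, damping -> R, 1/k -> C,
# so the mechanical and series-RLC resonance frequencies coincide.
import math

m_eff = 5.0e-12       # effective mass (kg)          -> inductance L (H)    [assumed value]
c_damp = 1.0e-9       # damping coefficient (N*s/m)  -> resistance R (ohm)  [assumed value]
k_stiff = 0.5         # stiffness (N/m)              -> 1/C, i.e. C = 1/k   [assumed value]

L, R, C = m_eff, c_damp, 1.0 / k_stiff

f_mech = math.sqrt(k_stiff / m_eff) / (2 * math.pi)   # mechanical resonance
f_elec = 1.0 / (2 * math.pi * math.sqrt(L * C))       # series RLC resonance
q_factor = math.sqrt(L / C) / R                       # same quality factor in both domains

print(f"f_mech = {f_mech:.3e} Hz, f_elec = {f_elec:.3e} Hz, Q = {q_factor:.1f}")
```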

Keywords: electrical equivalent circuit analogy, FEM analysis, micro cantilevers, micro sensors

Procedia PDF Downloads 373
4777 Calculation of Organ Dose for Adult and Pediatric Patients Undergoing Computed Tomography Examinations: A Software Comparison

Authors: Aya Al Masri, Naima Oubenali, Safoin Aktaou, Thibault Julien, Malorie Martin, Fouad Maaloul

Abstract:

Introduction: The increased number of performed 'Computed Tomography (CT)' examinations raises public concerns regarding the associated stochastic risk to patients. In its Publication 102, the 'International Commission on Radiological Protection (ICRP)' emphasized the importance of managing patient dose, particularly from repeated or multiple examinations. We developed a Dose Archiving and Communication System that gives multiple dose indexes (organ dose, effective dose, and skin-dose mapping) for patients undergoing radiological imaging exams. The aim of this study is to compare the organ dose values given by our software for patients undergoing CT exams with those of another software package named "VirtualDose". Materials and methods: Our software uses Monte Carlo simulations to calculate organ doses for patients undergoing computed tomography examinations. The general calculation principle consists of simulating: (1) the scanner machine with all its technical specifications and associated irradiation cases (kVp, field collimation, mAs, pitch ...), and (2) detailed geometric and compositional information for dozens of well-identified organs of computational hybrid phantoms that contain the necessary anatomical data. The mass as well as the elemental composition of the tissues and organs that constitute our phantoms correspond to the recommendations of the international organizations (namely the ICRP and the ICRU). Their body dimensions correspond to reference data developed in the United States. Simulated data were verified by clinical measurement. To perform the comparison, 270 adult patients and 150 pediatric patients were used, whose data correspond to exams carried out in French hospital centers. The comparison dataset of adult patients includes adult males and females for three different scanner machines and three different acquisition protocols (Head, Chest, and Chest-Abdomen-Pelvis). The comparison sample of pediatric patients includes the exams of thirty patients for each of the following age groups: newborn, 1-2 years, 3-7 years, 8-12 years, and 13-16 years. The comparison for pediatric patients was performed on the "Head" protocol. The percentage dose difference was calculated for organs receiving a significant dose according to the acquisition protocol (80% of the maximal dose). Results: Adult patients: for organs that are completely covered by the scan range, the maximum percentage dose difference between the two software packages is 27%. However, three organs situated at the edges of the scan range show a slightly higher dose difference. Pediatric patients: the percentage dose difference between the two software packages does not exceed 30%. These dose differences may be due to the use of two different generations of hybrid phantoms by the two software packages. Conclusion: This study shows that our software provides reliable dosimetric information for patients undergoing computed tomography exams.
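The comparison metric, the percentage dose difference between the two packages, can be written as a one-line calculation; the example values below are placeholders, not the study's measurements.

```python
# Small sketch of the percentage dose difference used for the comparison above.
def percent_difference(dose_a_mgy: float, dose_b_mgy: float) -> float:
    """Relative difference of package A with respect to package B, in percent."""
    return 100.0 * abs(dose_a_mgy - dose_b_mgy) / dose_b_mgy

# e.g. a hypothetical organ dose fully inside the scan range
print(percent_difference(14.2, 12.0))   # ~18.3 %, below the reported 27 % maximum
```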

Keywords: adult and pediatric patients, computed tomography, organ dose calculation, software comparison

Procedia PDF Downloads 134
4776 Explore and Reduce the Performance Gap between Building Modelling Simulations and the Real World: Case Study

Authors: B. Salehi, D. Andrews, I. Chaer, A. Gillich, A. Chalk, D. Bush

Abstract:

With the rapid increase of energy consumption in buildings in recent years, especially with the rise in population and growing economies, the importance of energy savings in buildings becomes more critical. One of the key factors in ensuring that energy consumption is controlled and kept to a minimum is to utilise building energy modelling at the very early stages of design, so building modelling and simulation is a growing discipline. During the design phase of construction, modelling software can be used to estimate a building's projected energy consumption, as well as building performance. The growth in the use of building modelling software packages opens the door for improvements in the design and also in the modelling itself, by introducing novel methods such as building information modelling-based software packages which bring conventional building energy modelling into the digital building design process. To understand the most effective implementation tools, research projects undertaken should include elements of real-world experiments and not just rely on theoretical and simulated approaches. A review of related studies shows that they are mostly based on modelling and simulation, which can be due to various reasons such as the more expensive and time-consuming nature of real-time data-based studies. Taking into account the recent rise of building energy modelling software packages and the increasing number of studies utilising these methods, the accuracy and reliability of these modelling software packages have become even more crucial and critical. The Energy Performance Gap refers to the discrepancy between the predicted energy savings and the realised actual savings, especially after buildings implement energy-efficient technologies. Many different software packages are available, either free or with commercial versions. In this study, IES VE (Integrated Environmental Solutions Virtual Environment) is used, as it is a commonly used building energy modelling and simulation software package in the UK. This paper describes a study that compares real-time results with those of a virtual model to illustrate this gap. The subject of the study is a north-west (345°) facing, naturally ventilated conservatory within a domestic building in London, which is monitored during summer to capture real-time data; these results are then compared with the virtual results from IES VE. In this project, the effect of the wrong position of blinds on overheating is studied, as well as providing new evidence of the performance gap. Furthermore, the challenges of entering solar shading products as input in IES VE are considered.
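A minimal sketch of how the performance gap can be quantified from the monitored and simulated series; the temperature values below are placeholders, not the conservatory measurements or the IES VE output.

```python
# Sketch: quantify the performance gap as the mean difference between measured
# and simulated indoor temperatures over the same hours. Data is illustrative.
measured_c  = [24.1, 26.8, 29.3, 31.0]   # monitored indoor temperatures (deg C)
simulated_c = [23.0, 25.1, 27.0, 28.2]   # model predictions for the same hours

gaps = [m - s for m, s in zip(measured_c, simulated_c)]
mean_gap = sum(gaps) / len(gaps)
print(f"mean performance gap: {mean_gap:.1f} K over {len(gaps)} hours")
```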

Keywords: building energy modelling and simulation, integrated environmental solutions virtual environment, IES VE, performance gap, real time data, solar shading products

Procedia PDF Downloads 111
4775 Multisignature Schemes for Reinforcing Trust in Cloud Software-As-A-Service Services

Authors: Mustapha Hedabou, Ali Azougaghe, Ahmed Bentajer, Hicham Boukhris, Mourad Eddiwani, Zakaria Igarramen

Abstract:

Software-as-a-service (SaaS) is emerging as a dominant approach to delivering software. It encompasses a range of business and technical opportunities, issues, and challenges. Trust in cloud services, regarding the security and privacy of the delivered data, is the most critical issue with the SaaS model. In this paper, we survey the security concerns related to the SaaS model, and we propose the design of a trusted SaaS model that gives users more confidence in SaaS services by leveraging trust in a neutral source-code certifying authority. The proposed design is based on the use of a multisignature mechanism for signing the source code of the application service. In our model, the cloud provider acts as a root of trust by ensuring the integrity of the application service while it is running on its platform. The proposed design prevents insider attacks from tampering with the application service before and after it is launched on the cloud provider's platform.
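The sketch below illustrates the multisignature idea in Python: the application service is trusted only if every required party has signed the same source-code digest. Plain, independent Ed25519 signatures stand in for the paper's multisignature scheme, which may aggregate signatures differently; the party names and code snippet are hypothetical.

```python
# Hedged illustration: accept the service code only if developer, certifying
# authority, and cloud provider all produced a valid signature over its digest.
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

source_code = b"def handler(request): ..."          # placeholder service code
digest = hashlib.sha256(source_code).digest()

parties = {"developer": Ed25519PrivateKey.generate(),
           "certifier": Ed25519PrivateKey.generate(),
           "cloud_provider": Ed25519PrivateKey.generate()}
signatures = {name: key.sign(digest) for name, key in parties.items()}
public_keys = {name: key.public_key() for name, key in parties.items()}

def code_is_trusted(digest, signatures, public_keys):
    """True only if every required party has a valid signature on the digest."""
    for name, pub in public_keys.items():
        try:
            pub.verify(signatures[name], digest)
        except (KeyError, InvalidSignature):
            return False
    return True

print(code_is_trusted(digest, signatures, public_keys))   # True
```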

Keywords: cloud computing, SaaS Platform, TPM, trustiness, code source certification, multi-signature schemes

Procedia PDF Downloads 250
4774 The Challenges of Scaling Agile to Large-Scale Distributed Development: An Overview of the Agile Factory Model

Authors: Bernard Doherty, Andrew Jelfs, Aveek Dasgupta, Patrick Holden

Abstract:

Many companies have moved to agile and hybrid agile methodologies where portions of the Software Design Life-cycle (SDLC) and Software Test Life-cycle (STLC) can be time boxed in order to enhance delivery speed, quality and to increase flexibility to changes in software requirements. Despite widespread proliferation of agile practices, implementation often fails due to lack of adequate project management support, decreased motivation or fear of increased interaction. Consequently, few organizations effectively adopt agile processes with tailoring often required to integrate agile methodology in large scale environments. This paper provides an overview of the challenges in implementing an innovative large-scale tailored realization of the agile methodology termed the Agile Factory Model (AFM), with the aim of comparing and contrasting issues of specific importance to organizations undertaking large scale agile development. The conclusions demonstrate that agile practices can be effectively translated to a globally distributed development environment.

Keywords: agile, agile factory model, globally distributed development, large-scale agile

Procedia PDF Downloads 270
4773 Developing Rice Disease Analysis System on Mobile via iOS Operating System

Authors: Rujijan Vichivanives, Kittiya Poonsilp, Canasanan Wanavijit

Abstract:

This research aims to create mobile tools to analyze rice diseases quickly and easily. The principles of object-oriented software engineering and the Objective-C language were used as the software development methodology, and the decision tree technique was used as the analysis method. Application users can select the features of a rice disease or the colour that appears on the rice leaves, and the recognition and analysis results are shown on the iOS mobile screen. After completing the software development, unit testing and integration testing methods were used to check program validity. In addition, three plant experts and forty farmers assessed the usability and benefit of this system. Overall user satisfaction was found to be at a good level (57%). The plant experts commented that various disease symptoms should be added to the database for more precise analysis results. For further research, it is suggested that an image processing system should be developed as a tool that allows users to search for and analyze rice diseases more conveniently and with greater accuracy.
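A hedged sketch of the decision-tree analysis, using scikit-learn in place of the paper's Objective-C implementation; the two features (colour code, lesion-shape code) and the disease labels are invented for illustration.

```python
# Sketch: classify a rice leaf by a decision tree over user-selected features.
from sklearn.tree import DecisionTreeClassifier

# [colour_code, lesion_shape_code] -> disease label (toy data, not the app's database)
X = [[0, 0], [0, 1], [1, 1], [2, 0], [2, 2], [1, 2]]
y = ["healthy", "blast", "blast", "brown_spot", "brown_spot", "bacterial_blight"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(tree.predict([[1, 1]]))   # classify a newly observed leaf
```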

Keywords: rice disease, data analysis system, mobile application, iOS operating system

Procedia PDF Downloads 267
4772 Agile Software Effort Estimation Using Regression Techniques

Authors: Mikiyas Adugna

Abstract:

Effort estimation is among the activities carried out in software development processes, and an accurate estimation model leads to project success. Agile effort estimation is a complex task because of the dynamic nature of software development, and researchers are still conducting studies on agile effort estimation to enhance prediction accuracy. For these reasons, we investigated and propose a model based on LASSO and Elastic Net regression to enhance estimation accuracy. The proposed model has four major components: preprocessing, train-test split, training with default parameters, and cross-validation. During the preprocessing phase, the entire dataset is normalized. After normalization, a train-test split is performed on the dataset, setting the training set at 80% and the testing set at 20%. Following the train-test split, the two algorithms (Elastic Net and LASSO regression) are trained in two different phases. In the first phase, the two algorithms are trained using their default parameters and evaluated on the testing data. In the second phase, the grid search technique (a grid is used to search for and select the optimum tuning parameters) and 5-fold cross-validation are used to obtain the final trained model. Finally, the final trained model is evaluated using the testing set. The experimental work is applied to an agile story-point dataset of 21 software projects collected from six firms. The results show that both Elastic Net and LASSO regression outperformed the compared models. Of the two proposed algorithms, LASSO regression achieved the better predictive performance, with PRED (8%) and PRED (25%) results of 100.0, MMRE of 0.0491, MMER of 0.0551, MdMRE of 0.0593, MdMER of 0.063, and MSE of 0.0007. The results imply that the model trained with the LASSO regression algorithm is the most acceptable and offers higher estimation performance than that found in the literature.
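A minimal scikit-learn sketch of the pipeline described above (normalisation, 80/20 split, default-parameter training, then grid search with 5-fold cross-validation); the story-point data is synthetic, and the parameter grids are assumptions rather than the paper's settings.

```python
# Sketch of the two-phase LASSO / Elastic Net estimation pipeline on synthetic data.
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.linear_model import Lasso, ElasticNet

rng = np.random.default_rng(0)
X = rng.random((100, 6))                                              # project features (placeholder)
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)   # effort (placeholder)

X = MinMaxScaler().fit_transform(X)                                   # preprocessing: normalise
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for name, model, grid in [
    ("LASSO", Lasso(), {"alpha": [0.001, 0.01, 0.1, 1.0]}),
    ("ElasticNet", ElasticNet(), {"alpha": [0.001, 0.01, 0.1], "l1_ratio": [0.2, 0.5, 0.8]}),
]:
    default_score = model.fit(X_tr, y_tr).score(X_te, y_te)           # phase 1: default parameters
    tuned = GridSearchCV(model, grid, cv=5).fit(X_tr, y_tr)           # phase 2: grid search + 5-fold CV
    print(name, f"default R2={default_score:.3f}", f"tuned R2={tuned.score(X_te, y_te):.3f}")
```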

Keywords: agile software development, effort estimation, elastic net regression, LASSO

Procedia PDF Downloads 31
4771 An Investigation Enhancing E-Voting Application Performance

Authors: Aditya Verma

Abstract:

E-voting using blockchain provides a distributed system in which data is present on each node in the network and is reliable and secure due to the immutability property of the blockchain. This work compares various blockchain consensus algorithms previously used for e-voting applications, based on performance and node scalability, chooses the optimal one, and improves on one such previous implementation by proposing solutions for the loopholes of the optimally performing consensus algorithm in our chosen application, e-voting.

Keywords: blockchain, parallel bft, consensus algorithms, performance

Procedia PDF Downloads 139
4770 A Survey on Requirements and Challenges of Internet Protocol Television Service over Software Defined Networking

Authors: Esmeralda Hysenbelliu

Abstract:

Over the last few years, the demand for high-bandwidth services, such as live (IPTV) and on-demand video streaming, has increased steadily and rapidly. It has been predicted that video traffic (IPTV, VoD, and web TV) will account for more than 90% of the global Internet Protocol traffic crossing the globe in 2016. Consequently, the requirements and challenges that service providers face today in supporting users' requests for entertainment video across the various IPTV services, through virtualization over Software Defined Networks (SDN), deserve the highest level of attention. What is required is to deliver optimized live and on-demand services, such as IPTV, at low cost and with good quality, while strictly fulfilling the essential requirements of clients and ISPs (Internet Service Providers) at the same time. The aim of this study is to present an overview of the important requirements and challenges of the IPTV service, together with two network trends for addressing these challenges through virtualization (SDN and Network Function Virtualization). This paper provides an overview of research published in the last five years.

Keywords: challenges, IPTV service, requirements, software defined networking (SDN)

Procedia PDF Downloads 247
4769 Numerical Static and Seismic Evaluation of Pile Group Settlement: A Case Study

Authors: Seyed Abolhassan Naeini, Hamed Yekehdehghan

Abstract:

Shallow foundations cannot be used when the bedding soil is soft. A suitable method for constructing foundations on soft soil is to employ pile groups to transfer the load to the bottom layers. The present research used results from tests carried out in northern Iran (Langarud) and the FLAC3D software to model a pile group for investigating the effects of various parameters on pile cap settlement under static and seismic conditions. According to the results, changes in the strength parameters of the soil, groundwater level, and the length of and distance between the piles affect settlement differently.

Keywords: FLACD 3D software, pile group, settlement, soil

Procedia PDF Downloads 99
4768 Cyber Security Enhancement via Software Defined Pseudo-Random Private IP Address Hopping

Authors: Andre Slonopas, Zona Kostic, Warren Thompson

Abstract:

Obfuscation is one of the most useful tools to prevent network compromise. Previous research focused on the obfuscation of network communications between external-facing edge devices. This work proposes the use of two edge devices, one external-facing and one internal-facing, which communicate via private IPv4 addresses in a software-defined pseudo-random IP hopping scheme. This methodology does not require additional IP addresses and/or resources to implement. Statistical analyses demonstrate that the hopping surface must be at least 1e3 IP addresses in size with a broad standard deviation to minimize the possibility of coincidence of monitored and communication IPs. Breaking the hopping algorithm requires a collection of at least 1e6 samples, which for large hopping surfaces would take years to collect. The probability of dropped packets is controlled via memory buffers and the frequency of hops and can be reduced to levels acceptable for video streaming. This methodology provides an impenetrable layer of security ideal for information and supervisory control and data acquisition systems.
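A conceptual Python sketch of the hopping mechanism, assuming a pre-shared seed, a 5-second hop interval, and a /22 private range (about 1e3 addresses, matching the minimum hopping surface noted above); the proposed system's actual implementation details are not reproduced here.

```python
# Conceptual sketch: both edge devices derive the same pseudo-random hop
# sequence from a shared seed and synchronised clocks, so no coordination
# traffic or extra public addresses are needed. All parameters are assumptions.
import hashlib
import ipaddress
import time

HOP_SURFACE = list(ipaddress.ip_network("10.20.0.0/22").hosts())   # ~1022 private addresses
SHARED_SEED = b"pre-shared secret"
HOP_INTERVAL_S = 5

def current_hop_address(now=None):
    """Both peers call this to agree on the active private IP for this interval."""
    epoch = int((now if now is not None else time.time()) // HOP_INTERVAL_S)
    digest = hashlib.sha256(SHARED_SEED + epoch.to_bytes(8, "big")).digest()
    index = int.from_bytes(digest[:4], "big") % len(HOP_SURFACE)
    return HOP_SURFACE[index]

print(current_hop_address())   # address both edges use during the current interval
```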

Keywords: moving target defense, cybersecurity, network security, hopping randomization, software defined network, network security theory

Procedia PDF Downloads 163
4767 Numerical Simulation of Multiple Arrays Arrangement of Micro Hydro Power Turbines

Authors: M. A. At-Tasneem, N. T. Rao, T. M. Y. S. Tuan Ya, M. S. Idris, M. Ammar

Abstract:

River flow over micro hydro power (MHP) turbines arranged in multiple arrays is simulated with computational fluid dynamics (CFD) software to obtain the flow characteristics. In this paper, CFD software is used to simulate the water flow over MHP turbines as they are placed in a river. Arranging MHP turbines in multiple arrays allows a large amount of power to be generated. In this study, a river model is created and simulated in CFD software to obtain the water flow characteristics. The process then continues by simulating different types of array arrangements in the river model. An MHP turbine model consists of an outer turbine body with a static propeller blade inside it. Five types of arrangements are used: parallel, series, triangular, square, and rhombus, each with different spacing sizes. The velocity profile of each MHP turbine is identified at the mouth of its turbine body. This study aims to identify the arrangement and spacing sizes that produce the highest power density from the water flow variation.

Keywords: micro hydro power, CFD, arrays arrangement, spacing sizes, velocity profile, power

Procedia PDF Downloads 332
4766 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes

Authors: Ritwik Dutta, Marylin Wolf

Abstract:

This paper describes the trade-offs and the design from scratch of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script makes use of the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and corresponding health data can be fed automatically, via sensor APIs, or manually, as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system is implemented to allow trend analysis of selected health metrics over custom time intervals. Available on GitHub, the project is free to use for academic purposes of learning and experimenting, or for practical purposes by building on it.
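A minimal sketch of the back-end query path, in the spirit of the description above: PyMongo fetches a user's stored readings and compares them with the target goals. The connection string, database name, and document fields are assumptions, not the project's actual schema.

```python
# Sketch: query MongoDB via PyMongo and build a daily snapshot of metrics
# against goals. Collection and field names are hypothetical placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["health_dashboard"]

def daily_snapshot(user_id, day):
    """Return each tracked metric with its latest value and target goal."""
    profile = db.users.find_one({"_id": user_id}) or {}
    snapshot = {}
    for metric, goal in profile.get("goals", {}).items():
        reading = db.readings.find_one(
            {"user_id": user_id, "metric": metric, "date": day},
            sort=[("timestamp", -1)],          # newest reading for that day
        )
        value = reading["value"] if reading else None
        snapshot[metric] = {"value": value, "goal": goal,
                            "met": value is not None and value >= goal}
    return snapshot

print(daily_snapshot("alice", "2024-05-01"))
```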

Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver

Procedia PDF Downloads 365
4765 Petra: Simplified, Scalable Verification Using an Object-Oriented, Compositional Process Calculus

Authors: Aran Hakki, Corina Cirstea, Julian Rathke

Abstract:

Formal methods are yet to be utilized in mainstream software development due to issues of scaling and implementation costs. This work is about developing a scalable, simplified, pragmatic, formal software development method with strong correctness properties and guarantees that are easy to prove. The method aims to be easy to learn, use, and apply without extensive training and experience in formal methods. Petra is proposed as an object-oriented process calculus with composable data types and sequential/parallel processes. Petra has a simple denotational semantics, which includes a definition of Correct by Construction. The aim is for Petra to be a standard that can be implemented to execute on various mainstream programming platforms such as Java. Work towards an implementation of Petra as a Java EDSL (Embedded Domain Specific Language) is also discussed.

Keywords: compositionality, formal method, software verification, Java, denotational semantics, rewriting systems, rewriting semantics, parallel processing, object-oriented programming, OOP, programming language, correct by construction

Procedia PDF Downloads 117
4764 Adding a Few Language-Level Constructs to Improve OOP Verifiability of Semantic Correctness

Authors: Lian Yang

Abstract:

Object-oriented programming (OOP) is the dominant programming paradigm in today's software industry, and it has enabled average software developers to build millions of commercial-strength software applications during the Internet revolution of the past three decades. On the other hand, the lack of a strict mathematical model and of domain-constraint features at the language level has long perplexed the computer science academia and the OOP engineering community. This situation has resulted in inconsistent system qualities and hard-to-understand designs in some OOP projects. The difficulties involved in fixing the current situation are also well known. Although the power of OOP lies in its unbridled flexibility and enormously rich data modeling capability, we argue that the ambiguity and the implicit facade surrounding the conceptual model of a class and an object should be eliminated as much as possible. We list the five major usages of a class and propose to separate them with new language constructs. By using the well-established theories of sets and finite state machines (FSMs), we propose to apply certain simple, generic, and yet effective constraints at the OOP language level in an attempt to find a possible solution to the above-mentioned issues. The goal is to make OOP more theoretically sound as well as to help programmers uncover warning signs of irregularities and domain-specific issues early in the development stage and catch semantic mistakes at runtime, improving the correctness verifiability of software programs. The aim of this paper is, however, more practical than theoretical.

Keywords: new language constructs, set theory, FSM theory, user defined value type, function groups, membership qualification attribute (MQA), check-constraint (CC)

Procedia PDF Downloads 220
4763 The Secrecy Capacity of the Semi-Deterministic Wiretap Channel with Three State Information

Authors: Mustafa El-Halabi

Abstract:

A general model of a wiretap channel with states is considered, where the legitimate receiver's and the wiretapper's observations depend on three states S1, S2 and S3. State S1 is non-causally known to the encoder, S2 is known to the receiver, and S3 remains unknown. A secure coding scheme based on structured binning is proposed, and it is shown to achieve the secrecy capacity when the signal at the legitimate receiver is a deterministic function of the input.
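For orientation, the classical secrecy capacity of the state-free wiretap channel, due to Csiszár and Körner, is the benchmark that this three-state model generalises; the paper's own capacity expression for the semi-deterministic case is not reproduced here.

```latex
% Classical wiretap secrecy capacity (no channel states), stated as a reference point:
C_s \;=\; \max_{P_{U},\, P_{X|U}} \big[\, I(U;Y) - I(U;Z) \,\big],
\qquad U \to X \to (Y, Z).
```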

Keywords: physical layer security, interference, side information, secrecy capacity

Procedia PDF Downloads 363
4762 Collaborative Platform for Learning Basic Programming (Algorinfo)

Authors: Edgar Mauricio Ruiz Osuna, Claudia Yaneth Herrera Bolivar, Sandra Liliana Gomez Vasquez

Abstract:

Industry's need for professionals with software development skills is constantly increasing; therefore, an educational process aligned with the strengthening of these competencies is part of the responsibility of universities with programmes related to the area of informatics and systems. In this sense, it is important to consider that the National Science, Technology and Innovation Plan for the development of the Electronics, Information Technologies and Communications sectors (2013) identifies deficiencies in training and professional education as a weakness in the SWOT analysis of the software and services sector. Accordingly, UNIMINUTO's Computer Technology Program has analysed students' performance in software development, identifying various problems such as dropout in programming subjects, academic averages, and deficiencies in the strategies and competencies developed in the area of programming. As a result of this analysis, it was decided to design a collaborative learning platform for basic programming that uses heat maps as a tool to support didactic feedback. The pilot phase evaluates the ALGORINFO platform as a didactic resource in a programming course, through an interactive and collaborative environment where students can carry out basic programming practice and, in turn, receive feedback based on the analysis of time patterns and of difficulties that occur frequently in certain segments or program cycles, by means of heat maps. As a result, the teacher has tools to reinforce and advise on the critical points highlighted on the map, so that students and graduates improve their skills as software developers.
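A small matplotlib sketch of the heat-map feedback: rows are students, columns are program segments, and cell intensity is the time spent on each segment; the matrix values are illustrative only.

```python
# Sketch: render a time-spent heat map so the teacher can spot "hot" segments.
import numpy as np
import matplotlib.pyplot as plt

time_spent_min = np.array([[2, 5, 14, 3],    # student 1
                           [3, 4, 12, 2],    # student 2
                           [1, 6, 18, 4]])   # student 3; column 3 is a hot spot

fig, ax = plt.subplots()
im = ax.imshow(time_spent_min, cmap="hot")
ax.set_xlabel("program segment")
ax.set_ylabel("student")
fig.colorbar(im, label="minutes spent")
plt.savefig("algorinfo_heatmap.png")   # the teacher reviews the hot columns
```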

Keywords: collaborative platform, learning, feedback, programming, heat maps

Procedia PDF Downloads 136
4761 Partnering with Stakeholders to Secure Digitization of Water

Authors: Sindhu Govardhan, Kenneth G. Crowther

Abstract:

Modernisation of the water sector is leading to increased connectivity and the integration of emerging technologies with traditional ones, creating new security risks. The convergence of Information Technology (IT) with Operational Technology (OT) results in solutions that are spread across larger geographic areas, increasingly consist of interconnected Industrial Internet of Things (IIoT) devices and software, rely on the integration of legacy with modern technologies, and use complex supply chain components, leading to complex architectures and communication paths. The result is that multiple parties collectively own and operate these emergent technologies, threat actors find new paths to exploit, and traditional cybersecurity controls are inadequate. Our approach is to explicitly identify and draw data flows that cross trust boundaries between owners and operators of the various aspects of these emerging and interconnected technologies. On these data flows, we layer potential attack vectors to create a frame of reference for evaluating possible risks against connected technologies. Finally, we identify where existing controls, mitigations, and other remediations exist across industry partners (e.g., suppliers, product vendors, integrators, water utilities, and regulators). From these, we are able to understand potential gaps in security, the roles in the supply chain that are most likely to effectively remediate those security gaps, and test cases to evaluate and strengthen security across these partners. This informs a "shared responsibility" solution that recognises that security is multi-layered and requires collaboration to be successful. This shared-responsibility security framework improves visibility, understanding, and control across the entire supply chain, particularly for those water utilities that are accountable for safe and continuous operations.

Keywords: cyber security, shared responsibility, IIOT, threat modelling

Procedia PDF Downloads 48
4760 Use Cloud-Based Watson Deep Learning Platform to Train Models Faster and More Accurate

Authors: Susan Diamond

Abstract:

Machine learning workloads have traditionally been run in high-performance computing (HPC) environments, where users log in to dedicated machines and utilize the attached GPUs to run training jobs on huge datasets. Training large neural network models is very resource intensive, and even after exploiting parallelism and accelerators such as GPUs, a single training job can still take days. Consequently, the cost of hardware is a barrier to entry. Even when upfront cost is not a concern, the lead time to set up such an HPC environment takes months, from acquiring the hardware to configuring it with the right set of firmware and software. Furthermore, scalability is hard to achieve in a rigid traditional lab environment, which is therefore slow to react to the dynamic changes in the artificial intelligence industry. Watson Deep Learning as a Service is a cloud-based deep learning platform that mitigates the long lead time and high upfront investment in hardware. It enables robust and scalable sharing of resources among the teams in an organization and is designed for on-demand cloud environments. Providing a similar user experience in a multi-tenant cloud environment comes with its own unique challenges regarding fault tolerance, performance, and security. Watson Deep Learning as a Service tackles these challenges and presents a deep learning stack for cloud environments in a secure, scalable, and fault-tolerant manner. It supports a wide range of deep learning frameworks such as TensorFlow, PyTorch, Caffe, Torch, Theano, and MXNet. These frameworks reduce the effort and skill set required to design, train, and use deep learning models. Deep Learning as a Service is used at IBM by AI researchers in areas including machine translation, computer vision, and healthcare.

Keywords: deep learning, machine learning, cognitive computing, model training

Procedia PDF Downloads 185
4759 The Feasibility of Online, Interactive Workshops to Facilitate Anatomy Education during the UK COVID-19 Lockdowns

Authors: Prabhvir Singh Marway, Kai Lok Chan, Maria-Ruxandra Jinga, Rachel Bok Ying Lee, Matthew Bok Kit Lee, Krishan Nandapalan, Sze Yi Beh, Harry Carr, Christopher Kui

Abstract:

We piloted a structured series of online workshops on the 3D segmentation of anatomical structures from CT scans. 33 participants were recruited from four UK universities for two-day workshops between 2020 and 2021. Open-source software (3D-Slicer) was used. We hypothesized that active participation via real-time screen-sharing and voice-communication via Discord would enable improved engagement and learning, despite national lockdowns. Written feedback indicated positive learning experiences, with subjective measures of anatomical understanding and software confidence improving.

Keywords: medical education, workshop, segmentation, anatomy

Procedia PDF Downloads 166
4758 A Case Study on Re-Assessment Study of an Earthfill Dam at Latamber, Pakistan

Authors: Afnan Ahmad, Shahid Ali, Mujahid Khan

Abstract:

This research presents a parametric study of an existing earth-fill dam located at Latamber, Karak city, Pakistan. The study consists of carrying out seepage analysis, slope stability analysis, and earthquake analysis of the dam for the existing dam geometry, and then repeating the same analyses for a modified geometry. Dams are massive as well as expensive hydraulic structures and therefore need proper attention. Additionally, this dam falls in the zone 2B region of Pakistan, an earthquake-prone area where peak ground accelerations range from 0.16g to 0.24g, so it should be dealt with great care, as the failure of any dam can cause irreparable losses. Similarly, seepage as well as slope failure can also cause damage that can lead to failure of the dam. Therefore, keeping in view the importance of dam construction and the associated costs, our main focus is to carry out a parametric study of the newly constructed dam. GeoStudio software is used for the analysis, in which SEEP/W is used for seepage analysis, SLOPE/W for slope stability analysis, and QUAKE/W for earthquake analysis. Based on the geometrical, hydrological, and geotechnical data, seepage and slope stability analyses of different proposed geometries of the dam are carried out along with the seismic analysis. A rigorous 2-D limit equilibrium analysis was carried out using the finite element method. The seismic study began with the static analysis and continued with the dynamic response analysis. The seismic analyses permitted evaluation of the overall patterns of the Latamber dam behaviour in terms of displacement, stress, strain, and acceleration fields. Similarly, the seepage analysis allows evaluation of seepage through the foundation and embankment of the dam, while the slope stability analysis estimates the factor of safety of the upstream and downstream slopes of the dam. The results of the analysis demonstrate that, among multiple geometries, the Latamber dam is secure against seepage piping failure and slope stability (upstream and downstream) failure. Moreover, the dam is safe against dynamic loading, and no liquefaction has been observed while changing its geometry within permissible limits.

Keywords: earth-fill dam, finite element, liquefaction, seepage analysis

Procedia PDF Downloads 137
4757 An Introduction to E-Content Producing Algorithm for Screen-Recorded Videos

Authors: Jamileh Darsareh, Mohammad Nikafrooz

Abstract:

Some teachers and e-content producers, based on their experience, try to produce educational videos using screen recording software. There are many challenges that they may encounter while producing screen-recorded videos. These challenges are both technical and pedagogical, and include designing the roadmap, preparing the screen, setting up the recording software, recording the screen, editing, etc. This descriptive study tries to present procedures for producing acceptable and well-made videos. These procedures are presented in the form of an algorithm for producing screen-recorded video. The algorithm presents the main production phases, including design, pre-production, production, post-production, and distribution. These phases consist of steps that are supported by several technical and pedagogical considerations. Following these phases and steps in the suggested order helps producers create their intended and desired video while saving time and facing fewer technical problems. It is expected that by using this algorithm, e-content producers and teachers will gain better performance in producing educational videos.

Keywords: e-content producing algorithm, screen-recorded videos, screen recording software, technical and pedagogical considerations

Procedia PDF Downloads 179
4756 Model of Application of Blockchain Technology in Public Finances

Authors: M. Vlahovic

Abstract:

This paper presents a model of public finances that combines three concepts: participatory budgeting, crowdfunding, and blockchain technology. Participatory budgeting is defined as a process in which community members decide how to spend a part of the community's budget. Crowdfunding is the practice of funding a project by collecting small monetary contributions from a large number of people via an Internet platform. Blockchain technology is a distributed ledger that enables efficient and reliable transactions that are secure and transparent. In this hypothetical model, the government or the authorities at the local/regional level would set up a platform where they propose public projects to citizens. Citizens would browse through the projects and support or vote for those which they consider justified and necessary. In return, they would be entitled to tax relief in the amount of their monetary contribution. Since blockchain technology enables the tracking of transactions, it can be used to mitigate corruption, money laundering, and the lack of transparency in public finances. Models of its application have already been created for e-voting, health records, and land registries. By presenting a model of the application of blockchain technology in public finances, this paper considers the potential of blockchain technology to disrupt governments and make processes more democratic, secure, transparent, and efficient. The framework for this paper consists of multiple streams of research, including key concepts of direct democracy, public finance (especially the voluntary theory of public finance), information and communication technology (especially blockchain technology), and crowdfunding. The framework defines the rules of the game, the basic conditions for the implementation of the model, its benefits, potential problems, and development perspectives. As an oversimplified map of a new form of public finances, the proposed model identifies the primary factors that influence the possibility of implementing the model and that could be tracked, measured, and controlled in case of experimentation with the model.

Keywords: blockchain technology, distributed ledger, participatory budgeting, crowdfunding, direct democracy, internet platform, e-government, public finance

Procedia PDF Downloads 128
4755 Variability Management of Contextual Feature Model in Multi-Software Product Line

Authors: Muhammad Fezan Afzal, Asad Abbas, Imran Khan, Salma Imtiaz

Abstract:

The Software Product Line (SPL) paradigm is used for the development of a family of software products that share common and variable features. A feature model is a domain-engineering artefact of an SPL that consists of common and variable features with predefined relationships and constraints. Multiple SPLs, such as mobile phones and tablets, consist of a number of similar common and variable features. Reusing common and variable features from the different domains of an SPL is a complex task due to the external relationships and constraints of features in the feature model. To increase the reusability of feature model resources from domain engineering, it is required to manage the commonality of features at the level of SPL application development. In this research, we have proposed an approach that combines multiple SPLs into a single domain and converts them into a common feature model. Extracting the common features from different feature models is more effective and reduces the cost and time to market of application development. For extracting features from multiple SPLs, the proposed framework consists of three steps: 1) find the variation points, 2) find the constraints, and 3) combine the feature models into a single feature model on the basis of the variation points and constraints. By using this approach, the reusability of features from multiple feature models can be increased. The impact of this research is to reduce development cost and time to market and to increase the number of SPL products.
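A hedged sketch of the three-step combination described above, with feature models reduced to plain sets and constraint triples; real feature models carry richer relationships (mandatory/optional, requires/excludes) than shown here, and the feature names are invented.

```python
# Sketch: combine two toy feature models into one by separating common features,
# variation points, and constraints.
phone_spl = {"features": {"call", "sms", "camera", "gps"},
             "constraints": [("camera", "requires", "storage")]}
tablet_spl = {"features": {"call", "camera", "gps", "stylus"},
              "constraints": [("stylus", "requires", "touchscreen")]}

# Step 1: variation points are the features not shared by every SPL.
common = phone_spl["features"] & tablet_spl["features"]
variation_points = (phone_spl["features"] | tablet_spl["features"]) - common

# Step 2: collect the constraints from both domains.
constraints = phone_spl["constraints"] + tablet_spl["constraints"]

# Step 3: combine into a single feature model.
combined = {"common": common, "variable": variation_points, "constraints": constraints}
print(combined)
```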

Keywords: software product line, feature model, variability management, multi-SPLs

Procedia PDF Downloads 45