Search results for: open source software development
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24544

24304 Open Educational Resources' Metadata: Towards the First Star to Quality of Open Educational Resources

Authors: Audrey Romero-Pelaez, Juan Carlos Morocho-Yunga

Abstract:

The increasing amount of open educational resources (OER) published on the web for consumption in teaching and learning environments also generates a growing need to ensure the quality of these resources. The low level of OER discovery is one of the most significant drawbacks when it comes to their reuse, and as a consequence, high-quality educational resources can go unnoticed. Metadata enables the discovery of resources on the web. The purpose of this study is to lay the foundations for open educational resources to achieve their first quality star within the Quality4OER Framework. In this study, we evaluate the quality of OER metadata and establish the main guidelines on metadata quality in this context.
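
To make the metadata-quality idea concrete, the sketch below scores the completeness of an OER record against a small set of Dublin Core style fields. The field list, the scoring rule, and the sample record are illustrative assumptions only, not the Quality4OER criteria.

    # Minimal sketch: completeness scoring of an OER metadata record.
    # The required fields and the sample record are illustrative assumptions,
    # not the actual Quality4OER metadata guidelines.
    REQUIRED_FIELDS = ["title", "description", "creator", "subject",
                       "language", "rights", "format", "identifier"]

    def completeness(record):
        """Fraction of required fields that are present and non-empty."""
        filled = [f for f in REQUIRED_FIELDS if record.get(f)]
        missing = sorted(set(REQUIRED_FIELDS) - set(filled))
        return len(filled) / len(REQUIRED_FIELDS), missing

    if __name__ == "__main__":
        oer_record = {  # hypothetical record harvested from an OER repository
            "title": "Introduction to Linear Algebra",
            "description": "Open course notes with exercises.",
            "creator": "J. Doe",
            "language": "en",
            "rights": "CC BY 4.0",
        }
        score, missing = completeness(oer_record)
        print(f"completeness = {score:.2f}, missing fields = {missing}")
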

Keywords: open educational resources, OER quality, quality metadata

Procedia PDF Downloads 214
24303 A Cost Effective Approach to Develop Mid-Size Enterprise Software Adopting the Waterfall Model

Authors: Mohammad Nehal Hasnine, Md Kamrul Hasan Chayon, Md Mobasswer Rahman

Abstract:

Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under budget constraints and strict deadlines is always challenging for software engineers. Therefore, we introduce an approach to designing mid-size enterprise software in a cost-effective way using the Waterfall model, one of the SDLC (Software Development Life Cycle) models. To fulfill the research objectives, in this study we developed mid-sized enterprise software named “BSK Management System” that assists enterprise software clients with information resource management and with performing complex organizational tasks. Waterfall model phases have been applied to ensure that all functions, user requirements, strategic goals, and objectives are met. In addition, Rich Picture, Structured English, and Data Dictionary have been implemented and investigated properly in an engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system featured simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.

Keywords: end-user application development, enterprise software design, information resource management, usability

Procedia PDF Downloads 416
24302 Development of an Interactive and Robust Image Analysis and Diagnostic Tool in R for Early Detection of Cervical Cancer

Authors: Kumar Dron Shrivastav, Ankan Mukherjee Das, Arti Taneja, Harpreet Singh, Priya Ranjan, Rajiv Janardhanan

Abstract:

Cervical cancer is one of the most common cancers among women worldwide and can be cured if detected early. Manual pathology, which is typically utilized at present, has many limitations. The current gold standard for cervical cancer diagnosis is exhaustive and time-consuming because it relies heavily on the subjective knowledge of oncopathologists, which leads to misdiagnosis and missed diagnoses resulting in false negatives and false positives. To reduce the time and complexities associated with early diagnosis, we require an interactive diagnostic tool for early detection, particularly in developing countries where cervical cancer incidence and related mortality are high. Incorporation of digital pathology in place of manual pathology for cervical cancer screening and diagnosis can increase precision and strongly reduce the chances of error in a time-specific manner. Thus, we propose a robust and interactive cervical cancer image analysis and diagnostic tool, which can categorically process both histopathological and cytopathological images to identify abnormal cells in the least amount of time and in settings with minimal resources. Furthermore, the incorporation of a set of specific parameters that are typically referred to for the identification of abnormal cells, with the help of the open-source software R, is one of the major highlights of the tool. The software has the ability to automatically identify and quantify morphological features, color intensity, sensitivity, and other parameters digitally to differentiate abnormal from normal cells, which may improve and accelerate screening and early diagnosis, ultimately leading to timely treatment of cervical cancer.
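
The tool described above is implemented in R; purely as an illustration of the kind of parameters it quantifies (morphology and colour intensity of segmented cells), the following Python sketch uses scikit-image. The Otsu segmentation and the abnormality flag are placeholder assumptions, not the authors' actual criteria.

    # Minimal sketch of cell segmentation and feature extraction
    # (the paper's tool is written in R; this only illustrates the idea).
    from skimage import io, color, filters, measure

    def extract_cell_features(image_path):
        rgb = io.imread(image_path)
        gray = color.rgb2gray(rgb)
        mask = gray < filters.threshold_otsu(gray)   # nuclei darker than background
        labels = measure.label(mask)
        features = []
        for region in measure.regionprops(labels, intensity_image=gray):
            if region.area < 50:                     # ignore tiny specks
                continue
            features.append({
                "area": region.area,
                "eccentricity": region.eccentricity,
                "mean_intensity": region.mean_intensity,
                # hypothetical flag: unusually large or irregular nuclei
                "suspicious": region.area > 500 or region.eccentricity > 0.95,
            })
        return features

    # features = extract_cell_features("pap_smear_tile.png")  # hypothetical file
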

Keywords: cervical cancer, early detection, digital pathology, screening

Procedia PDF Downloads 154
24301 i2kit: A Tool for Immutable Infrastructure Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservice architectures are increasingly used in distributed cloud applications due to their advantages in software composition, development speed, release cycle frequency, and time to market for the business logic. On the other hand, these architectures also introduce some challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution, and isolation of processes. However, other issues remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing, or data persistency (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos, or Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable infrastructure approach, since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error occurs in the control layer, which would affect running applications, specific expertise is required to perform ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable infrastructure pattern, where the virtual machine is the unit of deployment. The input for i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers. Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is set into other microservices using an environment variable, providing service discovery. The toolkit i2kit reuses the best ideas from container technology to solve problems like reproducible environments, process isolation, and software distribution, and at the same time relies on mature, proven cloud vendor technology for networking, load balancing, and persistency. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open-source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools have more flexibility for resource allocation optimization, we argue that adding a new control layer implies more significant disadvantages. Resource allocation is greatly improved by using linuxkit, which introduces a very small footprint (around 35 MB). Also, the system is more secure, since linuxkit installs the minimum set of dependencies needed to run containers. The toolkit i2kit is currently under development at the IMDEA Software Institute.
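
The abstract does not give the i2kit definition syntax, so the sketch below assumes a hypothetical declarative schema (service name, containers, replicas) and shows how such a definition could be rendered into a CloudFormation-style template fragment whose load balancer endpoint is exported for service discovery.

    # Hypothetical sketch of turning a declarative microservice definition
    # into a CloudFormation-like template fragment; names and schema are
    # assumptions, not the actual i2kit input format.
    import json

    service_definition = {
        "name": "orders",
        "replicas": 2,
        "containers": [{"image": "example/orders:1.0", "port": 8080}],
        "depends_on": {"PAYMENTS_ENDPOINT": "payments"},  # discovery via env var
    }

    def render_template(svc):
        lb_name = f"{svc['name']}-lb"
        return {
            "Resources": {
                f"{svc['name']}Asg": {
                    "Type": "AWS::AutoScaling::AutoScalingGroup",
                    "Properties": {"DesiredCapacity": str(svc["replicas"])},
                    # the AMI itself would be built beforehand with linuxkit
                },
                lb_name: {"Type": "AWS::ElasticLoadBalancingV2::LoadBalancer"},
            },
            "Outputs": {f"{svc['name']}Endpoint": {"Value": {"Ref": lb_name}}},
        }

    print(json.dumps(render_template(service_definition), indent=2))
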

Keywords: container, deployment, immutable infrastructure, microservice

Procedia PDF Downloads 157
24300 Flood Early Warning and Management System

Authors: Yogesh Kumar Singh, T. S. Murugesh Prabhu, Upasana Dutta, Girishchandra Yendargaye, Rahul Yadav, Rohini Gopinath Kale, Binay Kumar, Manoj Khare

Abstract:

The Indian subcontinent is severely affected by floods that cause intense, irreversible devastation to crops and livelihoods. With increased incidences of floods and their related catastrophes, an Early Warning System for Flood Prediction and an efficient Flood Management System for the river basins of India is a must. Accurately modeled hydrological conditions and a web-based early warning system may significantly reduce the economic losses incurred due to floods and enable end users to issue advisories with better lead time. This study describes the design and development of an Early Warning System for Flood Prediction (EWS-FP) using advanced computational tools/methods, viz. High-Performance Computing (HPC), Remote Sensing, GIS technologies, and open-source tools, for the Mahanadi River Basin of India. The flood prediction is based on a robust 2D hydrodynamic model, which solves the shallow water equations using the finite volume method. Considering the complexity of hydrological modeling and the size of the basins in India, it is always a tug of war between better forecast lead time and the optimal resolution at which the simulations are to be run. High-performance computing technology provides a good computational means to overcome this issue for the construction of national-level or basin-level flash flood warning systems with high resolution for local-level warning analysis and better lead time. High-performance computers with capacities of the order of teraflops and petaflops prove useful while running simulations over such large areas at optimum resolutions. In this study, a free and open-source, HPC-based 2D hydrodynamic model, with the capability to simulate rainfall run-off, river routing, and tidal forcing, is used. The model was tested for a part of the Mahanadi River Basin (Mahanadi Delta) with actual and predicted discharge, rainfall, and tide data. The simulation time was reduced from 8 hrs to 3 hrs by increasing CPU nodes from 45 to 135, which shows good scalability and performance enhancement. The simulated flood inundation spread and stage were compared with SAR data and CWC observed gauge data, respectively. The system shows good accuracy and better lead time, suitable for flood forecasting in near-real-time. To disseminate warnings to the end user, a network-enabled solution is developed using open-source software. The system has query-based flood damage assessment modules with outputs in the form of spatial maps and statistical databases. The system effectively facilitates the management of post-disaster activities caused by floods, such as displaying spatial maps of the affected area, inundated roads, etc., and maintains a steady flow of information at all levels, with different access rights depending upon the criticality of the information. It is designed to facilitate users in managing information related to flooding during critical flood seasons and analyzing the extent of the damage.
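
As a toy illustration of the finite-volume shallow-water core described above (the operational model is 2D, rainfall-runoff and tide forced, and runs on HPC), here is a 1D sketch with a Lax-Friedrichs flux; the channel length and dam-break initial condition are arbitrary assumptions.

    # Toy 1D shallow water equations (h, hu) with a Lax-Friedrichs flux.
    # Purely illustrative of the finite volume approach; the operational
    # model in the paper is 2D and far more complete.
    import numpy as np

    g, n, dx = 9.81, 200, 10.0                       # gravity, cells, cell size (m)
    h = np.where(np.arange(n) < n // 2, 2.0, 1.0)    # dam-break initial depth
    hu = np.zeros(n)                                 # initial discharge per unit width

    def flux(h, hu):
        u = hu / h
        return np.array([hu, hu * u + 0.5 * g * h ** 2])

    for _ in range(500):
        c = np.abs(hu / h) + np.sqrt(g * h)
        dt = 0.4 * dx / c.max()                      # CFL-limited time step
        U = np.array([h, hu])
        F = flux(h, hu)
        # Lax-Friedrichs numerical flux at interior cell interfaces
        Fi = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * dx / dt * (U[:, 1:] - U[:, :-1])
        U[:, 1:-1] -= dt / dx * (Fi[:, 1:] - Fi[:, :-1])
        h, hu = U

    print("max depth:", h.max(), "min depth:", h.min())
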

Keywords: flood, modeling, HPC, FOSS

Procedia PDF Downloads 73
24299 Improving Security by Using Secure Servers Communicating via Internet with Standalone Secure Software

Authors: Carlos Gonzalez

Abstract:

This paper describes the use of the Internet as a feature to enhance the security of our software that is going to be distributed/sold to users potentially all over the world. By placing some of the features of the secure software on a secure server, we increase the security of such software. The communication between the protected software and the secure server is done by a double lock algorithm. This paper also includes an analysis of intruders and describes possible responses to detect threats.
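
The double lock algorithm itself is not specified in the abstract. As a generic illustration of how a protected application can defer a critical feature to a secure server and verify the response, the sketch below uses a standard HMAC challenge-response over a shared key; this is a substitute technique shown for illustration, not the authors' scheme.

    # Generic challenge-response sketch (NOT the paper's double lock algorithm):
    # the client asks the server to authorize a protected feature and verifies
    # the server's HMAC over a fresh nonce before enabling it.
    import hmac, hashlib, os

    SHARED_KEY = b"demo-key-distributed-out-of-band"   # illustrative only

    def server_authorize(nonce: bytes, feature: str) -> bytes:
        # runs on the secure server; a real deployment would also check licensing
        return hmac.new(SHARED_KEY, nonce + feature.encode(), hashlib.sha256).digest()

    def client_use_feature(feature: str) -> bool:
        nonce = os.urandom(16)
        response = server_authorize(nonce, feature)    # in reality: an HTTPS call
        expected = hmac.new(SHARED_KEY, nonce + feature.encode(), hashlib.sha256).digest()
        return hmac.compare_digest(response, expected)

    print(client_use_feature("report-export"))         # True when the server key matches
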

Keywords: internet, secure software, threats, cryptography process

Procedia PDF Downloads 304
24298 China's Soft Power and Its Strategy in West Asia

Authors: Iman Shabanzadeh

Abstract:

The economic growth and the special model of development in China have caused sensitivity in world public opinion regarding the nature of this growth and development. In this regard, the Chinese have tried to put an end to such alarming perceptions by using all the tools at their disposal, seeking to present a peaceful and cooperative image of themselves. In this way, one of the most important diplomatic tools that Beijing has used to reduce the concerns caused by the Threat Theory has been the use of soft power resources and their instruments in its development policies. This article begins by analyzing the concept of soft power and examining its foundations in international relations, and continues by examining the components of soft power in its Chinese version. The main purpose of the article is to determine the position of West Asia in China's soft power strategy and the resources China uses to achieve its goals in this region. In response to the main question, the paper's hypothesis is that soft power in its Chinese version has significant differences from Joseph Nye's original idea. In fact, the Chinese have imported the American version of soft power and adjusted, strengthened and, in other words, internalized it with their own abilities, capacities, and political philosophy. Based on this, China's soft power presence in West Asia can be traced in three areas. The first source of China's soft power in this region of West Asia is cultural in nature and is realized through strategies such as "use of educational tools and methods", "media methods" and "the tourism industry". The second source relates to political soft power, which is applied through the policy of "balance of influence" and the policy of "mediation" and by relying on the "ideological foundations of Confucianism". The third source refers to China's economic soft power and is realized through three tools: "energy exchanges", "foreign investments" and the "Belt and Road Initiative". The research method of this article is descriptive-analytical.

Keywords: soft power, cooperative power, China, West Asia

Procedia PDF Downloads 30
24297 Statistical Analysis with Prediction Models of User Satisfaction in Software Project Factors

Authors: Katawut Kaewbanjong

Abstract:

We analyzed a large volume of data and identified software project factors significantly associated with user satisfaction. A statistical significance analysis (logistic regression) and a collinearity analysis determined the significant factors from a group of 71 pre-defined factors across 191 software projects in ISBSG Release 12. The eight prediction models used for testing the prediction potential of these factors were neural network, k-NN, naïve Bayes, random forest, decision tree, gradient boosted tree, linear regression, and logistic regression. Fifteen pre-defined factors were truly significant in predicting user satisfaction, and they provided 82.71% prediction accuracy when used with a neural network prediction model. These factors were client-server, personnel changes, total defects delivered, project inactive time, industry sector, application type, development type, how methodology was acquired, development techniques, decision-making process, intended market, size estimate approach, size estimate method, cost recording method, and effort estimate method. These findings may benefit software development managers considerably.
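
A hedged sketch of the described pipeline (significance screening followed by a neural-network classifier) is given below using scikit-learn on synthetic data; it does not use the ISBSG Release 12 data and will not reproduce the reported 82.71% accuracy.

    # Illustrative pipeline: logistic-regression screening of factors, then an
    # MLP classifier on the retained factors. Synthetic data, not ISBSG R12.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(191, 71))                 # 191 projects x 71 candidate factors
    y = (X[:, :15].sum(axis=1) + rng.normal(scale=2, size=191) > 0).astype(int)

    screen = LogisticRegression(max_iter=1000).fit(X, y)
    keep = np.argsort(np.abs(screen.coef_[0]))[-15:]   # crude stand-in for significance tests
    Xr = X[:, keep]

    X_tr, X_te, y_tr, y_te = train_test_split(Xr, y, test_size=0.3, random_state=0)
    mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", round(mlp.score(X_te, y_te), 3))
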

Keywords: prediction model, statistical analysis, software project, user satisfaction factor

Procedia PDF Downloads 95
24296 Identifying Metabolic Pathways Associated with Neuroprotection Mediated by Tibolone in Human Astrocytes under an Induced Inflammatory Model

Authors: Daniel Osorio, Janneth Gonzalez, Andres Pinzon

Abstract:

In this work, proteins and metabolic pathways associated with the neuroprotective response mediated by the synthetic neurosteroid tibolone under a palmitate-induced inflammatory model were identified by flux balance analysis (FBA). Three different metabolic scenarios (‘healthy’, ‘inflamed’ and ‘medicated’) were modeled over a gene-expression-data-driven, tissue-specific metabolic reconstruction of mature astrocytes. The astrocyte reconstruction was built, validated and constrained using three open source software packages (‘minval’, ‘g2f’ and ‘exp2flux’) released through the Comprehensive R Archive Network repositories during the development of this work. From our analysis, we predict that tibolone exerts its neuroprotective effects through a reduction of the neurotoxicity mediated by L-glutamate in astrocytes, inducing the activation of several metabolic pathways with associated neuroprotective actions, such as taurine metabolism, gluconeogenesis, and the calcium and Peroxisome Proliferator Activated Receptor signaling pathways. Also, we found a tibolone-associated increase in growth rate, probably in concordance with previously reported side effects of steroid compounds in other human cell types.
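
Flux balance analysis solves a linear program of the form maximize c·v subject to S·v = 0 with bounds on v. The toy network below (two metabolites, three reactions) is an illustrative stand-in for the genome-scale astrocyte reconstruction, and the sketch uses SciPy rather than the R packages named above.

    # Toy flux balance analysis: maximize the biomass flux subject to S v = 0.
    # The stoichiometric matrix is a made-up 2-metabolite example, not the
    # genome-scale astrocyte reconstruction from the paper.
    import numpy as np
    from scipy.optimize import linprog

    # reactions: R1 uptake -> A, R2: A -> B, R3 biomass: B ->
    S = np.array([[ 1, -1,  0],    # metabolite A
                  [ 0,  1, -1]])   # metabolite B
    bounds = [(0, 10), (0, 10), (0, 10)]       # flux bounds for R1..R3
    c = np.array([0, 0, -1])                   # linprog minimizes, so -biomass

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("optimal fluxes:", res.x, "biomass flux:", -res.fun)
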

Keywords: astrocytes, flux balance analysis, genome scale metabolic reconstruction, inflammation, neuroprotection, tibolone

Procedia PDF Downloads 204
24295 Hybrid Approach for Software Defect Prediction Using Machine Learning with Optimization Technique

Authors: C. Manjula, Lilly Florence

Abstract:

Software technology is developing rapidly, which leads to the growth of various industries. Nowadays, software-based applications have been adopted widely for business purposes. For any software industry, the development of reliable software is becoming a challenging task because a faulty software module may be harmful to the growth of the industry and business. Hence there is a need to develop techniques that can be used for the early prediction of software defects. Due to the complexities in manual prediction, automated software defect prediction techniques have been introduced. These techniques are based on learning patterns from previous software versions and finding the defects in the current version. They have attracted researchers due to their significant impact on industrial growth by identifying the bugs in software. Based on this, several studies have been carried out, but achieving desirable defect prediction performance is still a challenging task. To address this issue, here we present a machine-learning-based hybrid technique for software defect prediction. First, a Genetic Algorithm (GA) is presented in which an improved fitness function is used for better optimization of features in the data sets. Later, these features are processed through a Decision Tree (DT) classification model. Finally, an experimental study is presented in which results from the proposed GA-DT based hybrid approach are compared with those from the DT classification technique. The results show that the proposed hybrid approach achieves better classification accuracy.
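
A compact sketch of the GA-plus-decision-tree idea follows: a simple genetic algorithm searches binary feature masks, with cross-validated decision-tree accuracy as the fitness. The GA operators and the synthetic data are standard illustrative choices and do not reproduce the improved fitness function proposed in the paper.

    # Sketch: GA-based feature selection with a decision tree as the evaluator.
    # Standard GA operators on synthetic data; the paper's improved fitness
    # function is not reproduced here.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    X, y = make_classification(n_samples=400, n_features=30, n_informative=8, random_state=1)

    def fitness(mask):
        if mask.sum() == 0:
            return 0.0
        clf = DecisionTreeClassifier(random_state=1)
        return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

    pop = rng.integers(0, 2, size=(20, X.shape[1]))            # 20 random feature masks
    for _ in range(15):                                        # generations
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[-10:]]                # keep the best half
        children = parents.copy()
        cut = rng.integers(1, X.shape[1], size=len(children))  # one-point crossover
        for i, c in enumerate(cut):
            children[i, c:] = parents[(i + 1) % len(parents), c:]
        flip = rng.random(children.shape) < 0.02               # mutation
        children[flip] ^= 1
        pop = np.vstack([parents, children])

    best = pop[np.argmax([fitness(m) for m in pop])]
    print("selected features:", np.flatnonzero(best), "accuracy:", round(fitness(best), 3))
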

Keywords: decision tree, genetic algorithm, machine learning, software defect prediction

Procedia PDF Downloads 313
24294 Conceptualization and Strategies of Biogas Technology for Rural Development in Nigeria

Authors: Okorowo Cyril Agochi

Abstract:

The main challenge of the present world is to harness energy sources that are environmentally friendly and ecologically balanced. This need has forced the search for other alternative sources of energy. Unfortunately, new alternative energy sources like solar, hydro, and wind require huge economic investment and technical capacity to operate, which is very difficult for developing countries like Nigeria. At present, biogas energy can be the one and only reliable, easily available, and economically feasible alternative and renewable energy source, which can be managed with locally available resources and simple technology for secondary schools, tertiary institutions, and rural villages. This paper is aimed at boosting energy generation for the development of rural Nigeria through biogas.

Keywords: bio-gas, energy, environment, Nigeria, technology

Procedia PDF Downloads 458
24293 User-Perceived Quality Factors for Certification Model of Web-Based System

Authors: Jamaiah H. Yahaya, Aziz Deraman, Abdul Razak Hamdan, Yusmadi Yah Jusoh

Abstract:

One of the most essential issues in software products is maintaining their relevancy to the dynamics of users' requirements and expectations. Many studies have been carried out on the quality aspects of software products to overcome these problems. Previous software quality assessment models and metrics have been introduced, with strengths and limitations. In order to enhance the assurance of and confidence in software products, certification models have been introduced and developed. From our previous experiences in certification exercises and case studies in collaboration with several agencies in Malaysia, the requirement for a user-based software certification approach was identified and demanded. The emergence of social network applications, new development approaches such as agile methods, and other varieties of software on the market have led to the domination of users over the software. As software becomes more accessible to the public through internet applications, users are becoming more critical of the quality of the services provided by the software. There are several categories of users in web-based systems with different interests and perspectives. The classifications and metrics are identified through a brainstorming approach that includes researchers, users, and experts in this area. The new paradigm in software quality assessment is the main focus of our research. This paper discusses the classifications of users in web-based software system assessment and their associated factors and metrics for quality measurement. The quality model is derived based on the IEEE structure and the FCM model. The developments are beneficial and valuable for overcoming the constraints and improving the application of the software certification model in the future.

Keywords: software certification model, user centric approach, software quality factors, metrics and measurements, web-based system

Procedia PDF Downloads 382
24292 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry is generating large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory, pharmacy, etc. Healthcare data is so big and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data is accumulated, and the different varieties of data, such as its structured, semi-structured, and unstructured nature. Despite the complexity of big data, if the trends and patterns that exist within the big data are uncovered and analyzed, higher quality healthcare can be provided at lower cost. Hadoop is an open source software framework for the distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System, which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. In this paper, an analysis has been done of how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
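
To make the MapReduce idea concrete, here is a minimal Hadoop Streaming style mapper and reducer in Python that counts diagnosis codes in plain-text patient records; the record layout (diagnosis code in the third comma-separated field) is a hypothetical example.

    # Hadoop Streaming sketch (Python): count diagnosis codes in patient records.
    # The CSV layout (diagnosis code in the third field) is a hypothetical example.
    import sys

    def mapper(lines):
        """Emit 'diagnosis_code<TAB>1' for each record (streaming mapper stage)."""
        for line in lines:
            fields = line.rstrip("\n").split(",")
            if len(fields) >= 3:
                yield f"{fields[2]}\t1"

    def reducer(lines):
        """Sum counts per key; Hadoop delivers the mapper output sorted by key."""
        current, total = None, 0
        for line in lines:
            key, value = line.rstrip("\n").split("\t")
            if key != current and current is not None:
                yield f"{current}\t{total}"
                total = 0
            current = key
            total += int(value)
        if current is not None:
            yield f"{current}\t{total}"

    if __name__ == "__main__":
        stage = sys.argv[1] if len(sys.argv) > 1 else "map"   # run as: script.py map|reduce
        out = mapper(sys.stdin) if stage == "map" else reducer(sys.stdin)
        for row in out:
            print(row)

Such a script would typically be launched through the Hadoop Streaming jar with the mapper and reducer stages pointing at the same file; the input paths and cluster configuration are deployment-specific.
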

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 385
24291 Neural Network Based Approach of Software Maintenance Prediction for Laboratory Information System

Authors: Vuk M. Popovic, Dunja D. Popovic

Abstract:

The software maintenance phase starts once a software project has been developed and delivered. After that, any modification to it corresponds to maintenance. Software maintenance involves modifications to keep a software project usable in a changed or changing environment, to correct discovered faults, and to improve performance or maintainability. Software maintenance and the management of software maintenance are recognized as the two most important and most expensive processes in the life of a software product. This research bases the prediction of maintenance on risk and time evaluation, using them as data sets for working with neural networks. The aim of this paper is to provide support to project maintenance managers. They will be able to pass the issues planned for the next software-service-patch to the experts for risk and working-time evaluation, and afterwards feed all the data to neural networks in order to get a software maintenance prediction. This process will lead to a more accurate prediction of the working hours needed for the software-service-patch, which will eventually lead to better planning of the budget for software maintenance projects.
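
A minimal sketch of the prediction step is shown below: expert risk and time evaluations for planned patch issues are fed to a small neural-network regressor that estimates working hours. The feature layout and the data are synthetic assumptions, not the laboratory information system data set.

    # Sketch: neural network regression of maintenance effort (working hours)
    # from expert risk/time evaluations. Synthetic data for illustration only.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    # columns: risk score (1-5), estimated complexity (1-5), number of changed modules
    X = np.column_stack([rng.integers(1, 6, 300),
                         rng.integers(1, 6, 300),
                         rng.integers(1, 20, 300)])
    hours = 4 * X[:, 0] + 3 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0, 3, 300)

    X_tr, X_te, y_tr, y_te = train_test_split(X, hours, test_size=0.25, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
    model.fit(X_tr, y_tr)
    print("R^2 on held-out patches:", round(model.score(X_te, y_te), 3))
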

Keywords: laboratory information system, maintenance engineering, neural networks, software maintenance, software maintenance costs

Procedia PDF Downloads 328
24290 Knowledge Based Automated Software Engineering Platform Used for the Development of Bulgarian E-Customs

Authors: Ivan Stanev, Maria Koleva

Abstract:

Described are challenges to the Bulgarian e-Customs (BeC) related to low level of interoperability and standardization, inefficient use of available infrastructure, lack of centralized identification and authorization, extremely low level of software process automation, and insufficient quality of data stored in official registers. The technical requirements for BeC are prepared with a focus on domain independent common platform, specialized customs and excise components, high scalability, flexibility, and reusability. The Knowledge Based Automated Software Engineering (KBASE) Common Platform for Automated Programming (CPAP) is selected as an instrument covering BeC requirements for standardization, programming automation, knowledge interpretation and cloud computing. BeC stage 3 results are presented and analyzed. BeC.S3 development trends are identified.

Keywords: service oriented architecture, cloud computing, knowledge based automated software engineering, common platform for automated programming, e-customs

Procedia PDF Downloads 352
24289 Review of Currently Adopted Intelligent Programming Tutors

Authors: Rita Garcia

Abstract:

Intelligent Programming Tutors, IPTs, are supplemental educational devices that assist in teaching software development. These systems provide customized learning, allowing the user to select the presentation pace and pedagogical strategy, and to recall previous and additional teaching materials reinforcing learning objectives. In addition, IPTs automatically record an individual's progress, providing feedback to the instructor and student. These tutoring systems have an advantage over traditional tutoring systems because Intelligent Programming Tutors are not limited to one teaching strategy and can adjust when they detect the user struggling with a concept. The Intelligent Programming Tutor is a category of Intelligent Tutoring Systems, ITS. ITS are available for many fields in education, supporting different learning objectives, and integrate into other learning tools, improving the student's learning experience. This study provides a comparison of the IPTs currently adopted by the educational community and focuses on the different teaching methodologies and programming languages. The study also includes the ability to integrate the IPT into other educational technologies, such as massive open online courses, MOOCs. The intention of this evaluation is to determine one system that would best serve in a larger ongoing research project and to provide findings for other institutions looking to adopt an Intelligent Programming Tutor.

Keywords: computer education tools, integrated software development assistance, intelligent programming tutors, tutoring systems

Procedia PDF Downloads 296
24288 Software Quality Measurement System for Telecommunication Industry in Malaysia

Authors: Nor Fazlina Iryani Abdul Hamid, Mohamad Khatim Hasan

Abstract:

The evolution of software quality measurement started when McCall introduced his quality model in 1977. Since then, several software quality models and software quality measurement methods have emerged, but none of them focused on the telecommunication industry. In this paper, a software quality measurement system for the telecommunication industry is implemented, which is necessary to accommodate the rapid growth of the telecommunication industry. The quality value of telecommunication-related software can be calculated using this system by entering the required parameters. The system calculates the quality value of the measured system based on predefined quality metrics, aggregated by referring to the quality model. It classifies the quality level of the software based on the Net Satisfaction Index (NSI). Thus, a software quality measurement system is important to both developers and users in order to produce high-quality software products for the telecommunication industry.
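
The abstract does not give the aggregation formulas, so the sketch below only illustrates the general shape of such a system: metric values are normalized, combined through weights taken from a quality model, and the resulting score is mapped to a quality level. All metrics, weights, and thresholds are hypothetical placeholders.

    # Hypothetical sketch of metric aggregation and NSI-style classification.
    # Weights, metrics, and thresholds are placeholders, not the paper's model.
    QUALITY_MODEL = {              # characteristic -> (weight, metrics)
        "reliability":     (0.4, ["mtbf_norm", "defect_density_norm"]),
        "efficiency":      (0.3, ["response_time_norm"]),
        "maintainability": (0.3, ["change_effort_norm"]),
    }

    def quality_value(measurements):
        score = 0.0
        for weight, metrics in QUALITY_MODEL.values():
            score += weight * sum(measurements[m] for m in metrics) / len(metrics)
        return score                      # 0..1 when metrics are normalized

    def classify(score):
        # illustrative NSI-style bands
        return "excellent" if score >= 0.8 else "satisfactory" if score >= 0.6 else "poor"

    m = {"mtbf_norm": 0.9, "defect_density_norm": 0.7,
         "response_time_norm": 0.8, "change_effort_norm": 0.6}
    print(quality_value(m), classify(quality_value(m)))
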

Keywords: software quality, quality measurement, quality model, quality metric, net satisfaction index

Procedia PDF Downloads 565
24287 Framework Development of Carbon Management Software Tool in Sustainable Supply Chain Management of Indian Industry

Authors: Sarbjit Singh

Abstract:

This framework development explored the status of GSCM in manufacturing SMEs and concluded that there was a significant gap with respect to carbon emissions measurement in supply chain activities. The measurement of carbon emissions within supply chains is an important green initiative toward their reduction. The majority of the SMEs were facing the problem of quantifying the greenhouse gas emissions in their supply chains and of making them low-carbon supply chains, i.e., GSCM. Thus, carbon management initiatives were amalgamated with the supply chain activities in order to measure and reduce the carbon emissions, conforming to the GHG Protocol scopes. Hence, this work covers the development of a carbon management software (CMS) tool to quantify carbon emissions for effective carbon management. This tool is cheap and easy to use for industries for the management of their carbon emissions within the supply chain.
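
In GHG Protocol terms, emissions are usually computed as activity data multiplied by an emission factor and summed per scope; the sketch below shows that calculation. The emission factors used are placeholders for illustration, not published factors.

    # Sketch: scope-wise GHG accounting as activity_data * emission_factor.
    # Emission factors below are placeholders, NOT official published values.
    ACTIVITIES = [
        # (scope, activity, amount, unit, factor in kg CO2e per unit)
        (1, "diesel for company trucks", 1200, "litre", 2.7),
        (2, "purchased electricity",     50000, "kWh", 0.8),
        (3, "inbound road freight",      30000, "tonne-km", 0.1),
    ]

    def emissions_by_scope(activities):
        totals = {1: 0.0, 2: 0.0, 3: 0.0}
        for scope, _name, amount, _unit, factor in activities:
            totals[scope] += amount * factor / 1000.0     # kg -> tonnes CO2e
        return totals

    for scope, tonnes in emissions_by_scope(ACTIVITIES).items():
        print(f"Scope {scope}: {tonnes:.1f} t CO2e")
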

Keywords: carbon emissions, carbon management software, supply chain management, Indian industry

Procedia PDF Downloads 440
24286 Large Eddy Simulation of Particle Clouds Using Open-Source CFD

Authors: Ruo-Qian Wang

Abstract:

Open-source CFD has become increasingly popular and promising. The recent progress in multiphase flow enables new CFD applications, providing an economical and flexible research tool for complex flow problems. Our numerical study using four-way coupled Euler-Lagrangian Large-Eddy Simulations to resolve particle cloud dynamics with OpenFOAM and CFDEM will be introduced: the fractioned Navier-Stokes equations are numerically solved for fluid phase motion, solid phase motion is addressed by Lagrangian tracking for every single particle, and total momentum is conserved by fluid-solid inter-phase coupling. A grid convergence test was performed, which shows that the current resolution of the mesh is appropriate. Then, we validated the code by comparing numerical results with experiments in terms of particle cloud settlement and growth. A good comparison was obtained, showing the reliability of the present numerical schemes. The time and height at phase separations were defined and analyzed for a variety of initial release conditions. Empirical formulas were drawn to fit the results.
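
Full four-way coupled LES is well beyond a short example, but the Lagrangian half of the method can be illustrated: the sketch below advances a small particle cloud settling in still fluid under gravity and Stokes drag (one-way coupling only; fluid properties and particle size are assumed values).

    # One-way coupled Lagrangian particle tracking sketch: settling under
    # gravity with Stokes drag in still fluid. Illustrative only; the paper
    # uses four-way coupled LES with OpenFOAM/CFDEM.
    import numpy as np

    g = np.array([0.0, 0.0, -9.81])              # m/s^2
    rho_p, rho_f, mu = 2650.0, 1000.0, 1.0e-3    # sand in water (assumed values)
    d = 1.0e-4                                   # particle diameter, 100 microns
    tau_p = rho_p * d**2 / (18 * mu)             # Stokes relaxation time

    n, dt = 50, 1.0e-3
    pos = np.random.default_rng(0).uniform(-0.05, 0.05, size=(n, 3))   # initial cloud
    vel = np.zeros((n, 3))
    u_fluid = np.zeros(3)                        # still fluid

    for _ in range(2000):
        drag = (u_fluid - vel) / tau_p           # Stokes drag acceleration
        buoyant_g = g * (1 - rho_f / rho_p)      # gravity minus buoyancy
        vel += dt * (drag + buoyant_g)
        pos += dt * vel

    print("mean settling velocity (m/s):", vel[:, 2].mean())
    # analytic Stokes terminal velocity for comparison:
    print("expected:", tau_p * (1 - rho_f / rho_p) * -9.81)
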

Keywords: four-way coupling, dredging, land reclamation, multiphase flows, oil spill

Procedia PDF Downloads 406
24285 Ecorium: The Ecological Project in Montevideo Uruguay

Authors: Chettou Souhaila, Soufi Omar, Roumia Mohammed Ammar

Abstract:

Protecting the environment means preserving the survival and future of humanity. Indeed, the environment is our source of food and drinking water, the air is our source of oxygen, the climate allows our survival, and biodiversity is a potential drug reservoir. Preserving the environment is, therefore, a matter of survival. The objective of this project is to familiarize the general public with environmental problems, not only with the theme of environmental protection but also with the concept of biodiversity in different ecosystems. To this end, the aim of our project was to create the Ecorium, a place that preserves many plant species from different ecosystems and brings together schools, malls, buildings, offices, ecological transport, gardens, and many family activities that participate in ecosystem development, strategic biodiversity, and sustainable development.

Keywords: ecological system, ecorium, environment, sustainable development

Procedia PDF Downloads 310
24284 LGG Architecture for Brain Tumor Segmentation Using Convolutional Neural Network

Authors: Sajeeha Ansar, Asad Ali Safi, Sheikh Ziauddin, Ahmad R. Shahid, Faraz Ahsan

Abstract:

The most aggressive form of brain tumor is called glioma. Glioma is a kind of tumor that arises from the glial tissue of the brain and occurs quite often. A fully automatic 2D-CNN model for brain tumor segmentation is presented in this paper. We performed pre-processing steps to remove noise and intensity variances using N4ITK and standard intensity correction, respectively. We used the Keras open-source library with Theano as the backend for fast implementation of the CNN model. In addition, we used the BRATS 2015 MRI dataset to evaluate our proposed model. Furthermore, we used the SimpleITK open-source library in our proposed model to analyze images. Moreover, we extracted random 2D patches for the proposed 2D-CNN model for efficient brain segmentation. We extract 2D patches instead of 3D because of the lower dimensional information present in 2D, which helps us reduce computational time. The Dice Similarity Coefficient (DSC) is used as the performance measure for the evaluation of the proposed method. Our method achieved DSC scores of 0.77 for complete, 0.76 for core, and 0.77 for enhanced tumor regions. These results are comparable with those of already implemented 2D CNN architectures.
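
A minimal Keras sketch of two ingredients named above, a small 2D-CNN over patches and the Dice similarity coefficient, is given below; the patch size, layer sizes, and random data are placeholders rather than the architecture trained on BRATS 2015, and the TensorFlow-bundled Keras is used instead of the Theano backend.

    # Minimal 2D-CNN patch classifier and Dice coefficient in Keras.
    # Layer sizes, patch shape, and data are placeholders, not the trained model.
    import numpy as np
    from tensorflow.keras import layers, models

    def dice_coefficient(y_true, y_pred, eps=1e-6):
        y_true = np.asarray(y_true, dtype=float).ravel()
        y_pred = np.asarray(y_pred, dtype=float).ravel()
        intersection = np.sum(y_true * y_pred)
        return (2.0 * intersection + eps) / (np.sum(y_true) + np.sum(y_pred) + eps)

    model = models.Sequential([
        layers.Input(shape=(33, 33, 4)),        # 33x33 patch, 4 MRI modalities (assumed)
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.Flatten(),
        layers.Dense(1, activation="sigmoid"),  # tumor vs. non-tumor centre pixel
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")

    X = np.random.rand(16, 33, 33, 4).astype("float32")     # random stand-in patches
    y = np.random.randint(0, 2, size=(16, 1))
    model.fit(X, y, epochs=1, verbose=0)
    pred = model.predict(X, verbose=0) > 0.5
    print("Dice on dummy masks:", dice_coefficient(y, pred))
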

Keywords: brain tumor segmentation, convolutional neural networks, deep learning, LGG

Procedia PDF Downloads 161
24283 Software Defect Analysis: Eclipse Dataset

Authors: Amrane Meriem, Oukid Salyha

Abstract:

The presence of defects or bugs in software can lead to costly setbacks, operational inefficiencies, and compromised user experiences. Machine Learning (ML) techniques have been integrated to predict and preemptively address software defects. ML represents a proactive strategy aimed at identifying potential anomalies, errors, or vulnerabilities within code before they manifest as operational issues. By analyzing historical data, such as code changes, feature implementations, and defect occurrences, development teams can anticipate and mitigate these issues, thus enhancing software quality, reducing maintenance costs, and ensuring smoother user interactions. In this work, we used a recommendation system to improve the performance of ML models in terms of predicting code severity and estimating effort.

Keywords: software engineering, machine learning, bugs detection, effort estimation

Procedia PDF Downloads 61
24282 A Guide to the Implementation of Ambisonics Super Stereo

Authors: Alessio Mastrorillo, Giuseppe Silvi, Francesco Scagliola

Abstract:

In this work, we introduce an Ambisonics decoder with an implementation of the C-format, also called Super Stereo. This format is an alternative to conventional stereo and binaural decoding. Unlike those, this format conveys audio information from the horizontal plane and works with stereo speakers and headphones. The two C-format channels can also return a reconstructed planar B-format. This work provides an open-source implementation of this format. We implement an all-pass filter for signal quadrature, as required by the decoding equations. This filter works with six biquads in a cascade configuration, with values for control frequency and quality factor discovered experimentally. The phase response of the filter delivers a small error in the 20 Hz to 14 kHz range. The decoder has been tested with audio sources up to a 192 kHz sample rate, returning pristine sound quality and a detailed stereo image. It has been included in the Envelop for Live suite and is available as an open-source repository. This decoder has applications in Virtual Reality and 360° audio productions, music composition, and online streaming.
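
The all-pass network can be sketched with SciPy as a cascade of six second-order all-pass sections; the control frequencies and Q values below are placeholders, since the experimentally tuned values are not given in the abstract.

    # Sketch: cascade of six second-order (biquad) all-pass sections, as used
    # for the quadrature network. Frequencies and Q values are placeholders,
    # not the experimentally tuned values from the paper.
    import numpy as np
    from scipy.signal import sosfilt

    def allpass_sos(f0, q, fs):
        """RBJ-style all-pass biquad section, returned in normalized SOS form."""
        w0 = 2 * np.pi * f0 / fs
        alpha = np.sin(w0) / (2 * q)
        b = np.array([1 - alpha, -2 * np.cos(w0), 1 + alpha])
        a = np.array([1 + alpha, -2 * np.cos(w0), 1 - alpha])
        return np.concatenate([b / a[0], a / a[0]])

    fs = 48000
    sections = [(50, 0.6), (150, 0.6), (500, 0.7), (1500, 0.7), (5000, 0.8), (12000, 0.8)]
    sos = np.vstack([allpass_sos(f0, q, fs) for f0, q in sections])   # shape (6, 6)

    t = np.arange(fs) / fs
    signal = np.sin(2 * np.pi * 440 * t)       # test tone
    shifted = sosfilt(sos, signal)             # phase-shifted copy for quadrature
    print(sos.shape, shifted.shape)
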

Keywords: ambisonics, UHJ, quadrature filter, virtual reality, Gerzon, decoder, stereo, binaural, biquad

Procedia PDF Downloads 72
24281 Conception of Increasing the Efficiency of Excavation Shoring by Prestressing Diaphragm Walls

Authors: Mateusz Frydrych

Abstract:

The construction of diaphragm walls as excavation shoring, as well as part of deep foundations, is widely used in geotechnical engineering. Today's design challenges lie in the optimal dimensioning of the cross-section, which is demanded by technological considerations. Also relevant is the issue of optimization and the sustainable use of construction materials, including the reduction of the carbon footprint, which is currently an important challenge for the construction industry. The author presents the concept of an approach to achieving increased efficiency of diaphragm wall excavation shoring by using structural compression technology. The author proposes to place prestressing tendons in a non-linear manner in the reinforcement cage. As a result, the bending moment is reduced, which translates into a reduction in the amount of steel needed in the section, a reduction in displacements, and a reduction in cracking of the wall, including the achievement of better tightness. This approach is rarely encountered and has not yet been described scientifically in the literature. The author has developed a dynamic numerical model that allows the dimensioning of the cross-section of a prestressed diaphragm wall, as well as the study of wall displacements and cross-sectional forces in any defined computational situation. Numerical software from the Sofistik development environment was used for the study, and the models were validated in Plaxis software. This is an interesting idea that allows for optimizing the execution of construction works and reducing the required resources by using fewer materials and saving time. The author presents the possibilities of a prestressed diaphragm wall using, among others, the example of a diaphragm wall working as a cantilever over the height of two underground floors without additional strutting or stabilization by ground anchors. This makes the execution of the work easier for the contractor and, as a result, cheaper for the investor.
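
As a back-of-the-envelope illustration of why prestressing helps, the sketch below compares the cantilever bending moment from active earth pressure with the counter-moment provided by a prestressing force acting at an eccentricity. The soil parameters, wall height, and tendon force are assumed values; the actual dimensioning in the paper relies on the Sofistik and Plaxis models.

    # Back-of-the-envelope statics: cantilever diaphragm wall moment vs. the
    # counter-moment from a prestressing tendon. All input values are assumed.
    import math

    gamma = 18.0     # soil unit weight, kN/m^3 (assumed)
    phi = 30.0       # friction angle, degrees (assumed)
    H = 6.0          # retained height ~ two basement floors, m (assumed)
    ka = math.tan(math.radians(45 - phi / 2)) ** 2   # Rankine active coefficient

    # triangular active pressure: resultant Ka*gamma*H^2/2 acting at H/3 above the base
    M_earth = ka * gamma * H ** 3 / 6.0              # kNm per metre of wall

    P = 800.0        # prestressing force, kN/m (assumed)
    e = 0.25         # effective tendon eccentricity at the critical section, m (assumed)
    M_prestress = P * e                              # opposing moment, kNm/m

    print(f"Ka = {ka:.2f}")
    print(f"earth-pressure moment    = {M_earth:.0f} kNm/m")
    print(f"prestress counter-moment = {M_prestress:.0f} kNm/m")
    print(f"net design moment        = {M_earth - M_prestress:.0f} kNm/m")
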

Keywords: prestressed diaphragm wall, Plaxis, Sofistik, innovation, FEM, optimisation

Procedia PDF Downloads 54
24280 Open Consent And Artificial Intelligence For Health Research in South Africa

Authors: Amy Gooden

Abstract:

Various modes of consent have been utilized in health research, but open consent has not been explored in South Africa’s AI research context. Open consent entails the sharing of data without assurances of privacy and may be seen as an attempt to marry open science with informed consent. Because all potential uses of data are unknown, it has been questioned whether consent can be informed. Instead of trying to adapt existing modes of consent, why not adopt a new perspective? This is what open consent proposes and what this research will explore in AI health research in South Africa.

Keywords: artificial intelligence, consent, health, law, research, South Africa

Procedia PDF Downloads 131
24279 Four Phase Methodology for Developing Secure Software

Authors: Carlos Gonzalez-Flores, Ernesto Liñan-García

Abstract:

This paper presents a simple and robust approach for developing secure software. The Four Phase methodology consists of developing the non-secure software in phase one and then dedicating one of the next three phases to each of the secure development types (i.e., self-protected software, secure code transformation, and the secure shield). Our methodology first requires the determination and understanding of the security level needed for the software. The methodology proposes the use of several teams to accomplish this task: one Software Engineering Development Team, a Compiler Team, a Specification and Requirements Testing Team, and, for each of the secure software development types, three Secure Software Development teams, three Code Breaker teams, and three Intrusion Analysis teams. These teams will interact with each other and make decisions to provide secure software code protected against the required level of intruder.

Keywords: secure software, four phases methodology, software engineering, code breakers, intrusion analysis

Procedia PDF Downloads 377
24278 Group Learning for the Design of Human Resource Development for Enterprise

Authors: Hao-Hsi Tseng, Hsin-Yun Lee, Yu-Cheng Kuo

Abstract:

In order to understand whether one learning method performs better than another and to improve CAD courses for enterprise design human resource development, this research was applied to the practical learning of computer graphics software. In this study, the Revit building information model was used as the learning content, and two different modes of learning curriculum were designed: functional learning and project learning. A post-test, questionnaires, and student interviews were used to carry out a comparative analysis of the effectiveness of the two different modes of learning. Students participated in a nine-hour course over a period of three weeks and finally took written and hands-on tests. In addition, students filled in a learning questionnaire of fifteen items, with the questions grouped into three directions: basic software operation, software application, and software-based concept features. In addition to the questionnaire, participants from the two different learning methods were invited to interviews to learn more about their views on the two modes. The study found that the ad hoc short-term course led to better learning outcomes. On the other hand, students were more satisfied with the functional style for the whole course, and it was difficult for ad hoc style students to accept the ad hoc style of learning.

Keywords: development, education, human resource, learning

Procedia PDF Downloads 338
24277 A Systematic Snapshot of Software Outsourcing Challenges

Authors: Issam Jebreen, Eman Al-Qbelat

Abstract:

Outsourcing software development projects can be challenging, and there are several common challenges that organizations face. A study was conducted with a sample of 46 papers on outsourcing challenges, and the results identify the challenges most frequently faced by organizations when outsourcing software development projects. Poor outsourcing relationship was identified as the most significant challenge, with 35% of the papers referencing it. Lack of quality was the second most significant challenge, with 33% of the papers referencing it. Language and cultural differences were the third most significant challenge, with 24% of the papers referencing it. Non-competitive price was another challenge faced by organizations, with 21% of the papers referencing it. Poor coordination and communication were also identified as a challenge, with 21% of the papers referencing it. Opportunistic behavior, lack of contract negotiation, inadequate user involvement, and constraints due to time zones were also challenges faced by organizations. Other challenges faced by organizations included poor project management, lack of technical capabilities, high vendor employee turnover, poor requirement specification, IPR issues, poor management of budget, schedule, and delays, geopolitical and country instability, differences in development methodologies, failure to manage end-user expectations, and poor monitoring and control. In conclusion, outsourcing software development projects can be challenging, but organizations can mitigate these challenges by selecting the right outsourcing partner, having a well-defined contract and clear communication, having a clear understanding of the requirements, and implementing effective project management practices.

Keywords: software outsourcing, vendor, outsourcing challenges, quality model, continent, country, global outsourcing, IT workforce outsourcing

Procedia PDF Downloads 68
24276 Preliminary Findings from a Research Survey on Evolution of Software Defined Radio

Authors: M. Srilatha, R. Hemalatha, T. Sri Aditya

Abstract:

Communication in today's world is dominated by wireless technology. This is mainly due to the revolutionary development of new generations of wireless communication systems. The evolving new generations of wireless systems are accommodating the demand through better resource management, including improved transmission technologies with optimized communication devices. To keep up with the evolution of technologies, communication systems must be designed to allow the transparent insertion of newly evolved technologies at virtually all stages of their life cycle. After the insertion of new technologies, the upgraded devices should continue communication without degradation in quality. The concern with improving spectrum access and spectrum efficiency, combined with both the introduction of Software Defined Radios (SDR) and the possibility of applying software to radios, has led to an evolution of wireless radio research. The term software defined radio was coined in the 1970s to overcome the problems in the application of software to wireless radios, which eliminates the requirement for hardware changes. SDR has become the prime theme of research since it eliminates the drawbacks associated with the implementation of conventional wireless communication systems. This paper identifies and discusses key enabling technologies and the possibilities for research and development in SDRs. In addition, transmitter and receiver architectures of SDR are discussed along with their feasibility for reconfigurable radio applications.

Keywords: software defined radios, wireless communication, reconfigurable, reconfigurable transmitter, reconfigurable receivers, FPGA, DSP

Procedia PDF Downloads 296
24275 Characterisation of Human Attitudes in Software Requirements Elicitation

Authors: Mauro Callejas-Cuervo, Andrea C. Alarcon-Aldana

Abstract:

It is evident that there has been progress in the development and innovation of tools, techniques and methods in the development of software. Even so, there are few methodologies that include the human factor from the point of view of motivation, emotions and impact on the work environment; aspects that, when mishandled or not taken into consideration, increase the iterations in the requirements elicitation phase. This generates a large number of changes in the characteristics of the system during its development process and an overinvestment of resources to obtain a final product that, often, does not live up to the expectations and needs of the client. Human factors such as emotions or personality traits are naturally associated with the process of developing software. However, most of these works are oriented towards the analysis of the final users of the software and do not take into consideration the emotions and motivations of the members of the development team. Given that, in the industry, the strategies to select the requirements engineers and/or the analysts do not take said factors into account, it is important to identify and describe the characteristics or personality traits in order to elicit requirements effectively. This research describes the main personality traits associated with requirements elicitation tasks through the analysis of the existing literature on the topic and a compilation of our experiences as software development project managers in the academic and productive sectors, allowing for the characterisation of a suitable profile for this job. Moreover, a psychometric test is used as an information gathering technique, and it is applied to the personnel of some local companies in the software development sector. Such information has become an important asset in order to make a comparative analysis between the degree of effectiveness in the way their software development teams are formed and the proposed profile. The results show that, of the software development companies studied, 53.58% have selected the personnel for the task of requirements elicitation adequately, 37.71% possess some of the characteristics to perform the task, and 10.71% are inadequate. From the previous information, it is possible to conclude that 46.42% of the requirements engineers selected by the companies could perform other roles more adequately; a change which could improve the performance and competitiveness of the work team and, indirectly, the quality of the product developed. Likewise, the research allowed for the validation of the pertinence and usefulness of the psychometric instrument as well as the accuracy of the characteristics for the profile of requirements engineer proposed as a reference.

Keywords: emotions, human attitudes, personality traits, psychometric tests, requirements engineering

Procedia PDF Downloads 245