Search results for: cloud computing privacy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1701


351 Theoretical Analysis of the Solid State and Optical Characteristics of Calcium Sulphide Thin Film

Authors: Emmanuel Ifeanyi Ugwu

Abstract:

Calcium Sulphide, one of the chalcogenide group of thin films, is analyzed in this work using a theoretical approach in which a scalar wave is propagated through a thin-film medium deposited on a glass substrate, under the assumption that the dielectric medium has a homogeneous reference dielectric constant term and a perturbed dielectric function representing the deposited thin film on the surface of the substrate. These were substituted into a defined scalar wave equation, which was solved by first transforming it into a Volterra equation of the second kind and applying separation of variables to the scalar wave; Green's function technique was then introduced to obtain a model equation for the wave propagating through the thin film. This model was used to compute the propagated field for different input wavelengths representing the UV, visible and near-infrared regions, considering the influence of the dielectric constants of the thin film on the propagating field. The results obtained were used in turn to compute the band gaps and the solid state and optical properties of the thin film.
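As a rough illustration of the setup described above (a generic scalar-wave form consistent with the abstract, not the authors' exact derivation), a field ψ propagating through a reference dielectric ε_ref perturbed by the thin film Δε(r) satisfies a Helmholtz-type equation whose Green's-function (Volterra/Lippmann-Schwinger) solution reads:

```latex
\nabla^{2}\psi(\mathbf{r}) + k_{0}^{2}\,\varepsilon_{\mathrm{ref}}\,\psi(\mathbf{r})
  = -\,k_{0}^{2}\,\Delta\varepsilon(\mathbf{r})\,\psi(\mathbf{r}),
\qquad
\psi(\mathbf{r}) = \psi_{0}(\mathbf{r})
  + k_{0}^{2}\!\int G(\mathbf{r},\mathbf{r}')\,\Delta\varepsilon(\mathbf{r}')\,\psi(\mathbf{r}')\,d\mathbf{r}',
```

where ψ₀ is the field in the unperturbed medium, k₀ the free-space wavenumber, and G the Green's function of the homogeneous problem.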

Keywords: scalar wave, dielectric constant, calcium sulphide, solid state, optical properties

Procedia PDF Downloads 78
350 Review of the Legislative and Policy Issues in Promoting Infrastructure Development to Promote Automation in Telecom Industry

Authors: Marvin Ricardo Awarab

Abstract:

There has never been a greater need for telecom services. The Internet of Things (IoT), 5G networking, and edge computing are the driving forces behind this increased demand. The fierce demand offers communications service providers significant income opportunities. The telecom sector is centered on automation, and realizing a digital operation that functions as a real-time business will be crucial for the industry as a whole. Automation in telecom refers to the application of technology to create a more effective, quick, and scalable alternative to the conventional method of operating the telecom industry. With the promotion of 5G and the Internet of Things (IoT), telecom companies will continue to invest extensively in telecom automation technology. Automation offers clear benefits in the telecom industry; however, developing countries such as Namibia may not fully tap into such benefits because of the lack of funds and infrastructural resources to invest in automation. This paper investigates the benefits of automation in the telecom industry. Furthermore, the paper identifies the hiccups that developing countries such as Namibia face in their quest to fully introduce automation in the telecom industry. Additionally, the paper proposes possible avenues through which Namibia, as a developing country, can invest in automation infrastructure with the aim of reaping the full benefits of automation in the telecom industry.

Keywords: automation, development, internet, internet of things, network, telecom, telecommunications policy, 5G

Procedia PDF Downloads 37
349 The Use of Building Energy Simulation Software in Case Studies: A Literature Review

Authors: Arman Ameen, Mathias Cehlin

Abstract:

The use of Building Energy Simulation (BES) software has increased in the last two decades, in parallel with the development of increased computing power and easy-to-use software applications. This type of software is primarily used to simulate the energy use and the indoor environment of a building. The rapid development of these types of software has raised their level of user-friendliness, provided better parameter input options and increased the possibilities for analysis, both for a single building component and for an entire building. This, in turn, has led to many researchers utilizing BES software in their research to various degrees. The aim of this paper is to carry out a literature review concerning the use of the BES software IDA Indoor Climate and Energy (IDA ICE) in the scientific community. The focus of this paper is specifically the use of the software for whole-building energy simulation, the number and types of articles and their publication dates, the area of application, the types of parameters used, the location of the studied building, the type of building, the type of analysis and the solution methodology. Another aspect that is examined, which is of great interest, is the method of validation of the simulation results. The results show that there is an upward trend in the use of IDA ICE and that researchers use the software in their research to various degrees depending on the case and the aim of their research. The level of validation of the simulations carried out in these articles varies depending on the type of article and the type of analysis.

Keywords: building simulation, IDA ICE, literature review, validation

Procedia PDF Downloads 114
348 Continuous Differential Evolution Based Parameter Estimation Framework for Signal Models

Authors: Ammara Mehmood, Aneela Zameer, Muhammad Asif Zahoor Raja, Muhammad Faisal Fateh

Abstract:

In this work, the strength of a bio-inspired computational intelligence technique is exploited for parameter estimation of periodic signals using Continuous Differential Evolution (CDE), by defining an error function in the mean-square sense. The multidimensional and nonlinear nature of the problem arising in sinusoidal signal models, along with noise, makes it a challenging optimization task, which is dealt with through the robustness and effectiveness of CDE to ensure convergence and avoid trapping in local minima. In the proposed scheme of Continuous Differential Evolution based Signal Parameter Estimation (CDESPE), the unknown adjustable weights of the signal system identification model are optimized utilizing the CDE algorithm. The performance of the CDESPE model is validated through various statistics-based performance indices over a sufficiently large number of runs, in terms of estimation error, mean squared error and Theil's inequality coefficient. The efficacy of CDESPE is examined by comparison with the actual parameters of the system, with Genetic Algorithm based outcomes and with various deterministic approaches at different signal-to-noise ratio (SNR) levels.
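As a rough sketch of this kind of scheme (not the authors' CDESPE implementation), the example below fits the amplitude, frequency and phase of a noisy sinusoid by minimizing a mean-square error with SciPy's differential evolution optimizer; the signal parameters, noise level and bounds are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
true_A, true_f, true_phi = 1.5, 4.0, 0.8                  # assumed "unknown" parameters
signal = true_A * np.sin(2 * np.pi * true_f * t + true_phi)
noisy = signal + 0.1 * rng.standard_normal(t.size)        # measurement at some SNR

def mse(params):
    """Mean-square error between the candidate model and the noisy observations."""
    A, f, phi = params
    model = A * np.sin(2 * np.pi * f * t + phi)
    return np.mean((model - noisy) ** 2)

bounds = [(0.1, 5.0), (0.1, 10.0), (-np.pi, np.pi)]       # search space for (A, f, phi)
result = differential_evolution(mse, bounds, seed=0, tol=1e-8)
print("estimated (A, f, phi):", result.x)
```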

Keywords: parameter estimation, bio-inspired computing, continuous differential evolution (CDE), periodic signals

Procedia PDF Downloads 275
347 The Data-Driven Localized Wave Solution of the Fokas-Lenells Equation using PINN

Authors: Gautam Kumar Saharia, Sagardeep Talukdar, Riki Dutta, Sudipta Nandy

Abstract:

The physics-informed neural network (PINN) method opens up an approach for numerically solving nonlinear partial differential equations, leveraging the fast calculation speed and high precision of modern computing systems. We construct the PINN based on the universal approximation theorem, apply the initial-boundary value data and residual collocation points to weakly impose the initial and boundary conditions on the neural network, and choose the optimization algorithms adaptive moment estimation (ADAM) and Limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) to optimize the learnable parameters of the neural network. Next, we improve the PINN with a weighted loss function to obtain both the bright and dark soliton solutions of the Fokas-Lenells equation (FLE). We find that the proposed scheme of adjustable weight coefficients in the PINN has a better convergence rate and generalizability than the basic PINN algorithm. We believe that the PINN approach to solving the partial differential equations appearing in nonlinear optics will be useful for studying various optical phenomena.
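A minimal sketch of the weighted-loss idea is shown below. It uses PyTorch and a simple stand-in residual (viscous Burgers' equation) rather than the Fokas-Lenells equation, and the network size, collocation counts and weights w_data, w_pde are illustrative assumptions, not the paper's settings.

```python
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

class MLP(nn.Module):
    """Small fully connected network u_theta(x, t)."""
    def __init__(self, width=32, depth=4):
        super().__init__()
        layers, in_dim = [], 2
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), nn.Tanh()]
            in_dim = width
        layers += [nn.Linear(in_dim, 1)]
        self.net = nn.Sequential(*layers)
    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=1))

def pde_residual(model, x, t):
    # Stand-in residual (viscous Burgers u_t + u*u_x - 0.01*u_xx), NOT the
    # Fokas-Lenells equation; it only illustrates how autograd builds the physics loss.
    u = model(x, t)
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return u_t + u * u_x - 0.01 * u_xx

model = MLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Initial-condition data (synthetic) and interior collocation points.
x_d = torch.linspace(-1, 1, 50).view(-1, 1); t_d = torch.zeros_like(x_d)
u_d = -torch.sin(math.pi * x_d)                                  # u(x, 0) = -sin(pi x)
x_c = (torch.rand(500, 1) * 2 - 1).requires_grad_(True)
t_c = torch.rand(500, 1).requires_grad_(True)

w_data, w_pde = 10.0, 1.0                                        # adjustable loss weights
for step in range(1000):
    opt.zero_grad()
    loss_data = torch.mean((model(x_d, t_d) - u_d) ** 2)
    loss_pde = torch.mean(pde_residual(model, x_c, t_c) ** 2)
    loss = w_data * loss_data + w_pde * loss_pde
    loss.backward()
    opt.step()
```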

Keywords: deep learning, optical soliton, neural network, partial differential equation

Procedia PDF Downloads 96
346 The Students' Mathematical Competency and Attitude towards Mathematics Using the Trachtenberg Speed Math System

Authors: Marlone D. Severo

Abstract:

A pre- and post-test quasi-experimental design was used to test the effect of the Trachtenberg Speed Math intervention on the mathematical competency of sixty (60) matched-paired students with poor performing grades in Mathematics from one of the biggest public national high schools in the south of Metro Manila. Both the control and experimental groups were administered the Attitude Towards Mathematics Inventory (ATMI) before the pretest was given, and both groups showed a high dislike for Mathematics. The pretest showed 53 percent accuracy for the control group and 51 percent for the experimental group on a 15-item long-multiplication test taken without the aid of any computing device. The experimental group was taught how to use the Trachtenberg number-keys and multiplication techniques from October 2014 to March 2015. The post-test showed an improvement in the experimental group, with 96 percent accuracy in long multiplication, against a dismal 57 percent for the control group. The post-test ATMI was then administered. The control group still showed a great dislike towards Mathematics, while the experimental group showed a positive attitude towards the subject.

Keywords: attitude towards mathematics, mathematical competency, number-keys, trachtenberg speed math

Procedia PDF Downloads 339
345 Future of Nanotechnology in Digital Design

Authors: Pejman Hosseinioun, Abolghasem Ghasempour, Elham Gholami, Hamed Sarbazi

Abstract:

Considering the development of global semiconductor technology, it is anticipated that devices such as resonant tunneling diodes and transistors (RTD/RTT), single-electron transistors (SET) and quantum cellular automata (QCA) will substitute for CMOS (Complementary Metal Oxide Semiconductor) devices in many applications. Unfortunately, these new technologies cannot implement common Boolean logic efficiently and are only appropriate for threshold logic. Therefore, there is no doubt that, with the development of these new devices, it is necessary to find new design technologies that are compatible with them. Resonant tunneling devices (RTD/RTT) and circuit designs with enhanced computing abilities are candidates for meeting nanoscale criteria in the future. Quantum cellular automata (QCA) are also emerging nanotechnological devices for electrical circuits. Advantages of these devices, such as higher speed, smaller dimensions, and lower power consumption, are of great interest. QCA are basic devices for manufacturing gates, fuses and memories. Given the complexity of the underlying nanoscale physics, circuit designers can focus on logical and structural design to decrease complication in the design. Moreover, single-electron technology (SET) is another noteworthy device class considered in nanotechnology. This article is a survey of the future of nanotechnology in digital design.

Keywords: nanotechnology, resonant tunneling transistors, quantum cellular automata, semiconductor

Procedia PDF Downloads 245
344 Service Life Modelling of Concrete Deterioration Due to Biogenic Sulphuric Acid (BSA) Attack-State-of-an-Art-Review

Authors: Ankur Bansal, Shashank Bishnoi

Abstract:

Degradation of sewage pipes, sewage pumping stations and sewage treatment plants (STPs) is of major concern due to the difficulty of their maintenance and the high cost of replacement. Most of these systems undergo degradation due to biogenic sulphuric acid (BSA) attack. Since most wastewater treatment systems are underground, this deterioration remains hidden. This paper presents a literature review outlining the mechanism of the attack, focusing on the critical parameters of BSA attack, along with the available models and software to predict the deterioration it causes. The paper critically examines the various steps and equations in models of BSA degradation; details on the assumptions and workings of different software packages are also highlighted. The paper also focuses on the service life design techniques available through various codes and on methods to integrate service life design with BSA degradation of concrete. Finally, various methods of enhancing the resistance of concrete against biogenic sulphuric acid attack are highlighted. It may be concluded that effective modelling of the degradation phenomena may bring positive economic and environmental impacts. With current computing capabilities, integrated degradation models combining the various durability aspects can bring positive change towards a sustainable society.

Keywords: concrete degradation, modelling, service life, sulphuric acid attack

Procedia PDF Downloads 288
343 Geomechanical Technologies for Assessing Three-Dimensional Stability of Underground Excavations Utilizing Remote-Sensing, Finite Element Analysis, and Scientific Visualization

Authors: Kwang Chun, John Kemeny

Abstract:

Light detection and ranging (LiDAR) has become a prevalent remote-sensing technology applied in the geological fields due to its high precision and ease of use. One of the major applications is to use the detailed geometrical information of underground structures as a basis for the generation of a three-dimensional numerical model that can be used in a geotechnical stability analysis such as FEM or DEM. To date, however, straightforward techniques for reconstructing the numerical model from the scanned data of underground structures have not been well established or tested. In this paper, we propose a comprehensive approach integrating all the various processes, from LiDAR scanning to finite element numerical analysis. The study focuses on converting LiDAR 3D point clouds of geologic structures containing complex surface geometries into a finite element model. This methodology has been applied to Kartchner Caverns in Arizona, where detailed underground and surface point clouds can be used for the analysis of underground stability. Numerical simulations were performed using the finite element code Abaqus and visualized with the 3D scientific visualization tool ParaView. The results are useful in studying the stability of all types of underground excavations, including underground mining and tunneling.
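A rough sketch of the point-cloud-to-surface stage of such a pipeline is shown below, using the open-source Open3D library; the file names, voxel size and Poisson depth are illustrative assumptions, and the downstream solid meshing for Abaqus is not shown.

```python
import open3d as o3d

# Load the LiDAR scan (placeholder file name) and thin it out before meshing.
pcd = o3d.io.read_point_cloud("cavern_scan.ply")
pcd = pcd.voxel_down_sample(voxel_size=0.05)

# Normals are required by Poisson surface reconstruction.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.25, max_nn=30))

# Reconstruct a triangle surface from the point cloud.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)

# Export the surface; a solid finite element mesh would be generated from it separately.
o3d.io.write_triangle_mesh("cavern_surface.stl", mesh)
```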

Keywords: finite element analysis, LiDAR, remote-sensing, scientific visualization, underground stability

Procedia PDF Downloads 142
342 A Method of Representing Knowledge of Toolkits in a Pervasive Toolroom Maintenance System

Authors: A. Mohamed Mydeen, Pallapa Venkataram

Abstract:

The learning process needs to be pervasive in order to impart quality in acquiring knowledge about a subject, making use of advances in the field of information and communication systems. However, the pervasive learning paradigms designed so far are of the system-automation type and lack a factual pervasive realm. Providing a factual pervasive realm requires subtle ways of teaching and learning with system intelligence. Augmenting pervasive learning with intelligence necessitates an efficient way of representing knowledge for the system, in order to give the right learning material to the learner. This paper presents a method of representing knowledge for a Pervasive Toolroom Maintenance System (PTMS), in which a learner acquires sublime knowledge about the various kinds of tools kept in the toolroom and which also helps in the effective maintenance of the toolroom. First, we explicate the generic model of knowledge representation for PTMS. Second, we expound the knowledge representation for specific cases of toolkits in PTMS. We also present the conceptual view of knowledge representation using an ontology for both the generic and specific cases. Third, we devise the relations for pervasive knowledge in PTMS. Finally, events are identified in PTMS and linked with pervasive data of toolkits based on the relations formulated. The experimental environment and case studies show the accuracy and efficiency of the knowledge representation of toolkits in PTMS.

Keywords: knowledge representation, pervasive computing, agent technology, ECA rules

Procedia PDF Downloads 308
341 Algorithms for Computing of Optimization Problems with a Common Minimum-Norm Fixed Point with Applications

Authors: Apirak Sombat, Teerapol Saleewong, Poom Kumam, Parin Chaipunya, Wiyada Kumam, Anantachai Padcharoen, Yeol Je Cho, Thana Sutthibutpong

Abstract:

This research aims to study a two-step iteration process defined over a finite family of σ-asymptotically quasi-nonexpansive nonself-mappings. Strong convergence is guaranteed under the framework of Banach spaces with some additional structural properties, including strict and uniform convexity, reflexivity, and smoothness assumptions. With a projection technique similar to that used for nonself-mappings in Hilbert spaces, we use the generalized projection to construct a point within the corresponding domain. Moreover, we introduce the duality mapping and its inverse to overcome the unavailability of the duality representation that is exploited by Hilbert space theorists. We then apply our results for σ-asymptotically quasi-nonexpansive nonself-mappings to solve for the ideal efficiency of vector optimization problems composed of finitely many objective functions. We also show that the solution obtained from our process is the closest to the origin. Finally, we give an illustrative numerical example to support our results.
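For orientation only (the authors' exact scheme is not reproduced in the abstract), a typical two-step iteration built from the duality mapping J, its inverse, and the generalized projection Π_C onto the domain C takes a form such as:

```latex
y_n = J^{-1}\!\bigl(\alpha_n J x_n + (1-\alpha_n)\, J T x_n\bigr),
\qquad
x_{n+1} = \Pi_{C}\, J^{-1}\!\bigl(\beta_n J x_n + (1-\beta_n)\, J T y_n\bigr),
\qquad n \ge 0,
```

where T stands for one of the mappings in the family (the actual process cycles through the finite family) and {α_n}, {β_n} ⊂ (0, 1) are control sequences.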

Keywords: asymptotically quasi-nonexpansive nonself-mapping, strong convergence, fixed point, uniformly convex and uniformly smooth Banach space

Procedia PDF Downloads 229
340 Effects of Front Porch and Loft on Indoor Ventilation in the Renewal of Beijing Courtyard

Authors: Zhongzhong Zeng, Zichen Liang

Abstract:

In recent years, Beijing courtyards have faced the problem of renewal and renovation, and residents face problems such as small house areas, large household sizes, and old and dangerous houses. Among the many renovation methods, the authors note two common practices: enclosing the front porch to expand the floor area and adding a loft. Residents and architects, however, rarely consider the ventilation performance of the interior before beginning the remodeling, even though ventilation is crucial to the indoor environmental quality of a home. The aim of this article is to explore the positive or negative impacts of both the front porch and loft structures on indoor ventilation in the courtyard. The major method used in this study is comparative analysis: the authors create four alternative house models, with or without a front porch and a loft as the two variables, and examine indoor ventilation using the CFD (Computational Fluid Dynamics) technique. The analysis of the sectional airflow and of the velocity contours at a height of 1.5 m shows that the loft, to a certain extent, disrupts the airflow organization of the building and makes the high windows in the rear wall less effective. Enclosing the front porch as part of the house has no significant effect on ventilation, but enclosing the front porch and adding a loft at the same time should be avoided in building renovation. The findings of this study led to the following recommendations: strive to preserve the courtyard building's original architectural design and adjust only the inappropriate elements or constructions. Ventilation in the loft is inadequate, and since inhabitants typically use the loft as a living area, the building may rely more on air conditioning in the summer, which would raise energy demand. The front porch serves as a transition space as well as a source of shade, weather protection, and indoor ventilation. In conclusion, the examination of interior environments in future studies should concentrate on cross-disciplinary, multi-angle, and multi-level research topics.

Keywords: Beijing courtyard renewal, CFD, indoor environment, ventilation analysis

Procedia PDF Downloads 59
339 Logic Programming and Artificial Neural Networks in Pharmacological Screening of Schinus Essential Oils

Authors: José Neves, M. Rosário Martins, Fátima Candeias, Diana Ferreira, Sílvia Arantes, Júlio Cruz-Morais, Guida Gomes, Joaquim Macedo, António Abelha, Henrique Vicente

Abstract:

Some plants of the genus Schinus have been used in folk medicine as topical antiseptics, digestives, purgatives, diuretics, analgesics or antidepressants, and also for respiratory and urinary infections. The chemical composition of the essential oils of S. molle and S. terebinthifolius has been evaluated and shows high variability according to the part of the plant studied and the geographic and climatic region. The pharmacological properties, namely antimicrobial, anti-tumoural and anti-inflammatory activities, are conditioned by the chemical composition of the essential oils. Taking into account the difficulty of inferring the pharmacological properties of Schinus essential oils without a hard experimental approach, this work focuses on the development of a decision support system, in terms of its knowledge representation and reasoning procedures, under a formal framework based on Logic Programming, complemented with an approach to computing centered on Artificial Neural Networks and the respective Degree-of-Confidence that one has in such an occurrence.

Keywords: artificial neuronal networks, essential oils, knowledge representation and reasoning, logic programming, Schinus molle L., Schinus terebinthifolius Raddi

Procedia PDF Downloads 520
338 Survey of Methods for Solutions of Spatial Covariance Structures and Their Limitations

Authors: Joseph Thomas Eghwerido, Julian I. Mbegbu

Abstract:

In modelling environmental processes, we apply multidisciplinary knowledge to explain, explore and predict the Earth's response to natural and human-induced environmental changes. In the analysis of spatio-temporal ecological and environmental studies, the spatial parameters of interest are always heterogeneous, which often negates the assumption of stationarity. Hence, the dispersion of transported atmospheric pollutants, landscape or topographic effects and weather patterns all depend on a good estimate of the spatial covariance. The generalized linear mixed model, although linear in the expected-value parameters, has a likelihood that varies nonlinearly as a function of the covariance parameters. As a consequence, computing estimates for a linear mixed model requires the iterative solution of a system of simultaneous nonlinear equations. In order to predict the variables at unsampled locations, we need estimates from the sampled locations. The geostatistical methods for solving this spatial problem assume a stationary (locally defined) covariance that is uniform in space, which is apparently not valid because spatial processes often exhibit nonstationary, globally defined covariance. We consider different existing methods for the solution of the spatial covariance of space-time processes at unsampled locations, where the nonstationary covariance changes with location for multiple time sets, together with some asymptotic properties.
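For reference (standard geostatistics, not specific to the methods reviewed here), a stationary covariance model and the corresponding simple kriging predictor at an unsampled location s_0 can be written as:

```latex
C(h) = \sigma^{2}\exp\!\left(-\frac{\lVert h\rVert}{\phi}\right),
\qquad
\hat{Z}(s_0) = \mu + \mathbf{c}_0^{\top}\,\mathbf{C}^{-1}\,(\mathbf{Z}-\mu\mathbf{1}),
```

where C is the n×n covariance matrix among the sampled locations, c_0 is the vector of covariances between s_0 and the samples, and σ², φ are the sill and range parameters; nonstationary approaches relax the assumption that C(·) is the same everywhere.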

Keywords: parametric, nonstationary, Kernel, Kriging

Procedia PDF Downloads 232
337 Understanding Hydrodynamic in Lake Victoria Basin in a Catchment Scale: A Literature Review

Authors: Seema Paul, John Mango Magero, Prosun Bhattacharya, Zahra Kalantari, Steve W. Lyon

Abstract:

The purpose of this review paper is to develop an understanding of lake hydrodynamics and the potential climate impact at the Lake Victoria (LV) catchment scale. The paper briefly discusses the main problems of lake hydrodynamics and their solutions related to quality assessment and climate effects. An empirical methodology in modelling and mapping was considered for understanding lake hydrodynamics and for visualizing the long-term observational daily, monthly, and yearly mean datasets, using geographical information system (GIS) and COMSOL techniques. Data were obtained for the whole lake and five different meteorological stations, and several geoprocessing tools with spatial analysis were used to produce the results. Linear regression analyses were developed to build climate scenarios and a linear trend of lake rainfall over a long period. The potential evapotranspiration rate was described using MODIS data and the Thornthwaite method. The rainfall effect on lake water level was observed through partial differential equations (PDE), and water quality was characterized by a few nutrient parameters. The study revealed that monthly and yearly rainfall vary with the monthly and yearly maximum and minimum temperatures, that rainfall is high during cool years, and that high temperatures are associated with below-average rainfall patterns. Rising temperatures are likely to accelerate evapotranspiration rates, and more evapotranspiration is likely to lead to more rainfall; drought is more strongly correlated with temperature, and cloud cover is more strongly correlated with rainfall. There is a trend in lake rainfall, and long-term rainfall on the lake surface has affected the lake level. Onshore and offshore nutrient concentrations were characterized from initial literature data. The study recommends that further studies consider full lake bathymetry development with flow analysis and water balance, hydro-meteorological processes, solute transport, wind hydrodynamics, pollution and eutrophication, as these are crucial for lake water quality, climate impact assessment, and water sustainability.
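As an aside, a common form of the Thornthwaite monthly potential evapotranspiration estimate referred to above can be sketched as follows (the standard textbook formulation with a simple day-length correction; not necessarily the exact variant used by the authors, and the input values are illustrative, not observed LV data):

```python
import numpy as np

def thornthwaite_pet(monthly_temp_c, monthly_daylight_hours, days_in_month):
    """Monthly PET (mm) from the Thornthwaite (1948) formulation.

    monthly_temp_c: 12 mean monthly air temperatures (deg C)
    monthly_daylight_hours: 12 mean day lengths (hours)
    days_in_month: 12 month lengths (days)
    """
    T = np.clip(np.asarray(monthly_temp_c, dtype=float), 0.0, None)
    I = np.sum((T / 5.0) ** 1.514)                           # annual heat index
    a = 6.75e-7 * I**3 - 7.71e-5 * I**2 + 1.792e-2 * I + 0.49239
    pet_std = 16.0 * (10.0 * T / I) ** a                     # 30-day, 12-h daylight basis
    correction = (np.asarray(monthly_daylight_hours) / 12.0) * (np.asarray(days_in_month) / 30.0)
    return pet_std * correction

# Illustrative equatorial values (assumed).
temps = [23.5, 23.8, 23.6, 23.2, 22.8, 22.2, 21.9, 22.4, 23.0, 23.3, 23.1, 23.2]
daylight = [12.1] * 12
days = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
print(np.round(thornthwaite_pet(temps, daylight, days), 1))
```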

Keywords: climograph, climate scenarios, evapotranspiration, linear trend flow, rainfall event on LV, concentration

Procedia PDF Downloads 67
336 myITLab as an Implementation Instance of Distance Education Technologies

Authors: Leila Goosen

Abstract:

The research problem reported on in this paper relates to improving success in Computer Science and Information Technology subjects where students are learning applications, especially when teaching occurs in a distance education context. An investigation was launched in order to address students’ struggles with applications, and improve their assessment in such subjects. Some of the main arguments presented centre on formulating and situating significant concepts within an appropriate conceptual framework. The paper explores the experiences and perceptions of computing instructors, teaching assistants, students and higher education institutions on how they are empowered by using technologies such as myITLab. They also share how they are working with the available features to successfully teach applications to their students. The data collection methodology used is then described. The paper includes discussions on how myITLab empowers instructors, teaching assistants, students and higher education institutions. Conclusions are presented on the way in which this paper could make an original and significant contribution to the promotion and development of knowledge in fields related to successfully teaching applications for student learning, including in a distance education context. The paper thus provides a forum for practitioners to highlight and discuss insights and successes, as well as identify new technical and organisational challenges, lessons and concerns regarding practical activities related to myITLab as an implementation instance of distance education technologies.

Keywords: distance, education, myITLab, technologies

Procedia PDF Downloads 342
335 A Proposed Model of E-Marketing Service-Oriented Architecture (E-MSOA)

Authors: Hussein Moselhy, Islam Salam

Abstract:

There have been challenges and problems that hinder the implementation of e-marketing systems, such as the high cost of information systems infrastructure and maintenance, as well as their unavailability within the institution. There is also no system that supports all programming languages and different platforms. Another problem is the lack of integration between these systems on the one hand and operating systems and different web browsers on the other. No system for customer relationship management is available that recognizes customers' desires and takes them into consideration while performing e-marketing functions. The service-oriented architecture therefore emerged as one of the most important techniques and methodologies for building systems that integrate with various operating systems, different platforms and other technologies. This technology allows data exchange among different applications. The service-oriented architecture applies distributed computing concepts and has demonstrated its success in achieving the requirements of systems through web services. It also reflects an appropriate design for services, using different web services to support the requirements of business processes and software users. In a service-oriented environment, web services are deployed on the web in the form of independent services to be accessed without knowledge of the nature of the programs and systems within them. This paper presents a proposal for a new model that contributes to the application of the methods and means of e-marketing with the integration of the marketing mix elements to improve marketing efficiency (E-MSOA), and applies it in the educational city of one of the Egyptian sectors.
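To illustrate the idea of an independently deployed service with a purely contract-based interface (a generic sketch, not part of the proposed E-MSOA model; the endpoint and data are made up), a minimal JSON web service in Python could look like this:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class CatalogService(BaseHTTPRequestHandler):
    """Toy 'product catalog' service: consumers see only the JSON contract, not the implementation."""

    def do_GET(self):
        if self.path == "/products":
            body = json.dumps([{"id": 1, "name": "example product", "price": 9.99}]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Any client on any platform can consume this service over plain HTTP/JSON.
    HTTPServer(("localhost", 8000), CatalogService).serve_forever()
```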

Keywords: service-oriented architecture, electronic commerce, virtual retailing, unified modeling language

Procedia PDF Downloads 404
334 Psychological Well-Being and Human Rights of Teenage Mothers Attending One Secondary School in the Eastern Cape, South Africa

Authors: Veliswa Nonfundo Hoho, Jabulani Gilford Kheswa

Abstract:

This paper reports on teenage motherhood and its adverse outcomes for the academic performance, emotional well-being and sexual relationships of adolescent females. Drawing from Ryff's six dimensions of psychological well-being and Bronfenbrenner's ecological model, which underpinned this study, teenage motherhood has been found to be linked with multiple factors such as poverty, negative self-esteem, substance abuse, cohabitation, intimate partner violence and ill-health. Furthermore, research indicates that in schools where educators fail to perform their duties in loco parentis to motivate adolescent female learners who are mothers, absenteeism, poor academic performance and learned helplessness are likely. The aim of this research was two-fold, namely (i) to determine the impact of teenage motherhood on the psychological well-being of teenage mothers and (ii) to investigate the policies that protect the human rights of teenage mothers attending secondary schools. In a qualitative study conducted in one secondary school in Fort Beaufort, Eastern Cape, South Africa, fifteen Xhosa-speaking teenage mothers, aged 15-18 years, were interviewed. The sample was recruited by means of snowball sampling. To safeguard the human dignity of the respondents, informed consent, confidentiality, anonymity and privacy were assured. For trustworthiness, this research ensured that credibility, neutrality, and transferability were met. Following axial and open coding of the responses, five themes were identified: health issues of teenage mothers, lack of support, violation of human rights, impaired sense of purpose in life and intimate partner violence. From these findings, it is clear that teenage mothers lack resilience and are susceptible to contracting sexually transmitted infections and HIV/AIDS because they are submissive and hopeless. Furthermore, owing to the stigma that the teenage mothers experience from family members, they resort to alcohol and drug abuse and feel demotivated to bond with their babies. In conclusion, it is recommended that the Health and Social Development departments collaborate to empower the psychological well-being of teenage mothers. Furthermore, school policies on discrimination should be enacted and consistently implemented.

Keywords: depression, discrimination, self-esteem, teenage mothers

Procedia PDF Downloads 253
333 Experiences of Timing Analysis of Parallel Embedded Software

Authors: Muhammad Waqar Aziz, Syed Abdul Baqi Shah

Abstract:

The execution time analysis is fundamental to the successful design and execution of real-time embedded software. In such analysis, the Worst-Case Execution Time (WCET) of a program is a key measure, on the basis of which system tasks are scheduled. The WCET analysis of embedded software is also needed for system understanding and to guarantee its behavior. WCET analysis can be performed statically (without executing the program) or dynamically (through measurement). Traditionally, research on the WCET analysis assumes sequential code running on single-core platforms. However, as computation is steadily moving towards using a combination of parallel programs and multi-core hardware, new challenges in WCET analysis need to be addressed. In this article, we report our experiences of performing the WCET analysis of Parallel Embedded Software (PES) running on multi-core platform. The primary purpose was to investigate how WCET estimates of PES can be computed statically, and how they can be derived dynamically. Our experiences, as reported in this article, include the challenges we faced, possible suggestions to these challenges and the workarounds that were developed. This article also provides observations on the benefits and drawbacks of deriving the WCET estimates using the said methods and provides useful recommendations for further research in this area.
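As a toy illustration of the measurement-based side of this (a sketch only; real WCET measurement uses hardware timers, controlled inputs and worst-case paths, and a measured maximum generally underestimates the true WCET), one can record a high-water mark over repeated runs:

```python
import time

def observed_execution_times(task, inputs, repetitions=100):
    """Measure execution times (seconds) of `task` over a set of test inputs."""
    samples = []
    for args in inputs:
        for _ in range(repetitions):
            start = time.perf_counter()
            task(*args)
            samples.append(time.perf_counter() - start)
    return samples

def toy_task(n):
    return sum(i * i for i in range(n))   # stand-in for an embedded task body

times = observed_execution_times(toy_task, inputs=[(1_000,), (10_000,), (50_000,)])
print(f"high-water mark: {max(times) * 1e6:.1f} us over {len(times)} runs")
```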

Keywords: embedded software, worst-case execution-time analysis, static flow analysis, measurement-based analysis, parallel computing

Procedia PDF Downloads 300
332 Applicability of Fuzzy Logic for Intrusion Detection in Mobile Adhoc Networks

Authors: Ruchi Makani, B. V. R. Reddy

Abstract:

Mobile Adhoc Networks (MANETs) are gaining popularity due to their potential for providing low-cost mobile connectivity solutions to real-world communication problems. Integrating Intrusion Detection Systems (IDS) in MANETs is a tedious task by reason of their distinctive features, such as dynamic topology, decentralized authority and a highly constrained, resource-limited environment. IDS primarily use automated soft-computing techniques to monitor the inflow/outflow of traffic packets in a given network to detect intrusion. The use of machine learning techniques in IDS enables the system to make decisions on intrusions while continuously learning about its dynamic environment. An appropriate IDS model must be selected to meet these application challenges. Thus, this paper focuses on fuzzy-logic based machine learning IDS techniques for MANETs and presents their applicability for achieving effectiveness in identifying intrusions. Further, the selection of appropriate protocol attributes and the generation of fuzzy rules, which play a significant role in the accuracy of a fuzzy-logic based IDS, are discussed. This paper also presents the critical attributes of a MANET's routing protocol and their applicability in a fuzzy-logic based IDS.
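A very small sketch of the fuzzy-inference idea (hand-rolled triangular memberships and two illustrative rules over made-up traffic attributes; not the rule base discussed in the paper) is shown below:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def intrusion_score(packet_rate, route_error_rate):
    """Mamdani-style evaluation of two toy rules, defuzzified by weighted average."""
    high_rate = tri(packet_rate, 50, 100, 150)        # packets/s considered 'high'
    high_err = tri(route_error_rate, 0.2, 0.5, 0.8)   # fraction of route-error messages
    low_rate = tri(packet_rate, 0, 25, 60)
    # Rule 1: high packet rate AND high route-error rate -> anomalous (0.9)
    # Rule 2: low packet rate -> normal (0.1)
    r1 = min(high_rate, high_err)
    r2 = low_rate
    if r1 + r2 == 0.0:
        return 0.5                                    # no rule fires: undecided
    return (r1 * 0.9 + r2 * 0.1) / (r1 + r2)

print(intrusion_score(packet_rate=120, route_error_rate=0.6))   # close to 0.9
print(intrusion_score(packet_rate=10, route_error_rate=0.1))    # close to 0.1
```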

Keywords: AODV, mobile adhoc networks, intrusion detection, anomaly detection, fuzzy logic, fuzzy membership function, fuzzy inference system

Procedia PDF Downloads 150
331 Adapting Liability in the Era of Automated Decision-Making: A South African Labour Law Perspective

Authors: Aisha Adam

Abstract:

This study critically examines the transformative impact of automated decision-making (ADM) and artificial intelligence (AI) systems on South African labour law. As AI technologies increasingly infiltrate workplaces, existing liability frameworks face challenges in addressing the unique complexities presented by these innovations. This article explores the necessity of redefining liability to accommodate the nuanced landscape of ADM and AI within South African labour law. It emphasises the importance of ensuring responsible deployment and safeguarding the rights of workers amid evolving technological dynamics. This research investigates the central concern of fairness, bias, and discrimination in ADM and AI decision-making. Focusing on algorithmic bias and discriminatory outcomes, the paper advocates for the integration of mechanisms within the South African legal framework, particularly under the Promotion of Equality and Prevention of Unfair Discrimination Act (PEPUDA) and the Employment Equity Act (EEA). The study scrutinises the shifting dynamics of the employment relationship, calling for clear guidelines on the responsibilities and liabilities of employers, employees, and technology providers. Furthermore, the article analyses legal and policy responses to ADM and AI within South African labour law, exploring potential amendments to legislation, guidelines, and codes of practice. It assesses the role of regulatory bodies, specifically the Commission for Conciliation, Mediation, and Arbitration (CCMA), in overseeing and enforcing responsible practices in the workplace. Lastly, the research evaluates the impact of ADM and AI on human and social rights in the South African context. Emphasising the protection of constitutional rights, including fair labour practices, privacy, and equality, the study proposes remedies and safeguards. It advocates for a multidisciplinary approach involving legal, technological, and ethical considerations to redefine liability in South African labour law effectively. The article contends that a shift from accountability to responsibility is crucial for promoting fairness, antidiscrimination, and the protection of human and social rights in the age of automated decision-making. It calls for collaborative efforts among stakeholders to shape responsible practices and redefine liability in this evolving technological landscape.

Keywords: automated decision-making, artificial intelligence, labour law, vicarious liability

Procedia PDF Downloads 49
330 Predicting Daily Patient Hospital Visits Using Machine Learning

Authors: Shreya Goyal

Abstract:

The study aims to build user-friendly software to understand patient arrival patterns and compute the number of potential patients who will visit a particular health facility in a given period by using a machine learning algorithm. The underlying machine learning algorithm used in this study is the Support Vector Machine (SVM). Accurate prediction of patient arrivals allows hospitals to operate more effectively, providing timely and efficient care while optimizing resources and improving patient experience. It allows for better allocation of staff, equipment, and other resources: if there is a projected surge in patients, additional staff or resources can be allocated to handle the influx, preventing bottlenecks or delays in care. Understanding patient arrival patterns can also help streamline processes to minimize waiting times and ensure timely access to care for patients in need. Another big advantage of using this software is adherence to strict data protection regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, as the hospital will not have to share the data with any third party or upload it to the cloud, because the software can read data locally from the machine. The data needs to be arranged in a particular format, and the software will then be able to read the data and provide meaningful output. Using software that operates locally can facilitate compliance with these regulations by minimizing data exposure. Keeping patient data within the hospital's local systems reduces the risk of unauthorized access or breaches associated with transmitting data over networks or storing it on external servers. This can help maintain the confidentiality and integrity of sensitive patient information. Historical patient data is used in this study. The input variables used to train the model include patient age, time of day, day of the week, seasonal variations, and local events. The algorithm uses a supervised learning method to optimize the objective function and find the global minimum: it stores the values of the local minima after each iteration and at the end compares all the local minima to find the global minimum. The strength of this study is the transfer function used to calculate the number of patients. The model has an output accuracy of >95%. The method proposed in this study could be used for better planning of personnel and medical resources.
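A minimal sketch of this kind of model (scikit-learn support vector regression on synthetic arrival counts; the features, data and preprocessing are illustrative assumptions, not the study's dataset):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 365
day_of_week = rng.integers(0, 7, n)
hour = rng.integers(0, 24, n)
month = rng.integers(1, 13, n)
# Synthetic daily visit counts: weekday and seasonal structure plus noise.
visits = (80 + 15 * (day_of_week < 5) + 10 * np.sin(2 * np.pi * month / 12)
          + 5 * np.cos(2 * np.pi * hour / 24) + rng.normal(0, 4, n))

X = np.column_stack([day_of_week, hour, month])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X[:300], visits[:300])

pred = model.predict(X[300:])
print("mean absolute error:", np.mean(np.abs(pred - visits[300:])).round(2))
```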

Keywords: machine learning, SVM, HIPAA, data

Procedia PDF Downloads 47
329 Analyzing the Risk Based Approach in General Data Protection Regulation: Basic Challenges Connected with Adapting the Regulation

Authors: Natalia Kalinowska

Abstract:

The adoption of the General Data Protection Regulation (GDPR) concluded four years of work by the European Commission in this area in the European Union. Considering the far-reaching changes that the GDPR will introduce, the European legislator envisaged a two-year transitional period: member states and companies have to prepare for the new regulation by 25 May 2018. The idea that represents a new attitude to data protection in the European Union is the risk-based approach. So far, as a result of the implementation of Directive 95/46/EC, many European countries (including Poland) have adopted very particular regulations specifying technical and organisational security measures; the Polish implementing rules, for example, even indicate how long a password should be. Under the new approach, from May 2018 controllers and processors will be obliged to apply security measures adequate to the level of risk associated with specific data processing. Risk in the GDPR should be interpreted as the likelihood of a breach of the rights and freedoms of the data subject. According to Recital 76, the likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context and purposes of the processing. The GDPR does not indicate which security measures should be applied; the recitals give only examples, such as anonymisation or encryption. It is the controller's decision what type of security measures are considered sufficient, and the controller will be responsible if these measures are not sufficient or if the identification of the risk level is incorrect. The regulation indicates a few levels of risk: Recital 76 mentions risk and high risk, but some lawyers argue that there is one more category, low risk/no risk. Low-risk/no-risk data processing is a situation in which the processing is unlikely to result in a risk to the rights and freedoms of natural persons. The GDPR also mentions types of processing for which a controller does not have to evaluate the level of risk because they have been classified as "high risk" processing, e.g. processing of special categories of data on a large scale, or processing using new technologies. The methodology includes an analysis of legal regulations, e.g. the GDPR and the Polish Act on the Protection of Personal Data, as well as ICO guidelines and articles concerning the risk-based approach in the GDPR. The main conclusion is that an appropriate risk assessment is the key to keeping data safe and avoiding financial penalties. On the one hand, this approach seems more equitable, not only for controllers and processors but also for data subjects; on the other hand, it increases controllers' uncertainty in the assessment, which could have a direct impact on incorrect data protection and potential responsibility for infringement of the regulation.

Keywords: general data protection regulation, personal data protection, privacy protection, risk based approach

Procedia PDF Downloads 227
328 Unlocking Health Insights: Studying Data for Better Care

Authors: Valentina Marutyan

Abstract:

Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of and approach to providing healthcare. Healthcare data mining is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. The field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories, using advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining: it enables doctors to intervene early to prevent problems or improve outcomes, and it assists in early disease detection and customized treatment planning for every person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Besides helping patients, it improves the efficiency of hospitals, helping them determine the number of beds or doctors they require given the number of patients they expect. In this project, models such as logistic regression, random forests, and neural networks are used for predicting diseases and analyzing medical images, patients are grouped by algorithms such as k-means, connections between treatments and patient responses are identified by association rule mining, and time series techniques help in resource management by predicting patient admissions. These methods improve healthcare decision-making and personalized treatment. Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and work more efficiently; it comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
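For illustration (a generic scikit-learn sketch on synthetic patient records, not the project's data), grouping patients with k-means might look like this:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n = 300
# Synthetic patient features: age, number of prior admissions, average length of stay.
age = rng.normal(55, 15, n).clip(18, 95)
admissions = rng.poisson(2, n)
length_of_stay = rng.gamma(2.0, 2.0, n)

X = StandardScaler().fit_transform(np.column_stack([age, admissions, length_of_stay]))
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

for label in range(3):
    members = kmeans.labels_ == label
    print(f"cluster {label}: {members.sum()} patients, "
          f"mean age {age[members].mean():.1f}, "
          f"mean admissions {admissions[members].mean():.1f}")
```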

Keywords: data mining, healthcare, big data, large amounts of data

Procedia PDF Downloads 38
333 The Ethics of Documentary Filmmaking: Ethical Considerations and Responsibilities of Documentary Filmmakers When Portraying Real-Life Events and Subjects

Authors: Batatunde Kolawole

Abstract:

Documentary filmmaking stands as a distinctive medium within the cinematic realm, carrying a unique responsibility: the portrayal of real-life events and subjects. This research delves into the profound ethical considerations and responsibilities that documentary filmmakers shoulder as they work to unveil truth and weave compelling narratives. The study undertakes a comprehensive review of ethical frameworks and real-world case studies, illuminating the intricate web of challenges that documentarians confront. These challenges encompass an array of ethical intricacies, from securing informed consent to safeguarding privacy, maintaining objectivity, and sidestepping the snares of narrative manipulation when crafting stories from reality. Furthermore, the study dissects the contemporary ethical terrain, acknowledging the emergence of novel dilemmas in the digital age, such as deepfakes and digital alterations. Through a meticulous analysis of ethical quandaries faced by distinguished documentary filmmakers and their strategies for ethical navigation, this study offers valuable insights into the evolving role of documentaries in molding public discourse. It underscores the indispensable significance of transparency, integrity, and an unwavering commitment to encapsulating the intricacies of reality within ethical documentary filmmaking. In a world increasingly reliant on visual narratives, an understanding of the subtle ethical dimensions of documentary filmmaking holds relevance not only for those behind the camera but also for the diverse audiences who engage with and interpret the realities unveiled on screen. This research stands as a rigorous examination of the moral compass that steers this potent form of cinematic expression. It emphasizes the capacity of ethical documentary filmmaking to enlighten, challenge, and inspire, all while upholding the core principles of truthfulness and respect for the human subjects under scrutiny. Ethical documentary filmmaking, as exemplified by "Rape" and countless other powerful narratives, serves as a testament to the enduring potential of cinema to inform, challenge, and drive meaningful societal discourse.

Keywords: filmmaking, documentary, human right, film

Procedia PDF Downloads 39
326 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination

Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan

Abstract:

The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute a Likelihood Ratio (LR) for quantifying the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, wherein a set of assumptions and methods for a given dataset is made. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluated the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR are presented so as not to obtain an incorrect estimate that would be used to deliver a wrong judgment in a court of law. The estimate of the LR is fundamentally a Bayesian concept, and we used two LR estimators in this paper, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE). The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Though LoR performed better than KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination.
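A minimal sketch of a KDE-based LR computation (synthetic similarity scores standing in for handwriting comparison scores; not the paper's data or exact estimator):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic comparison scores: same-writer pairs score higher on average.
same_writer_scores = rng.normal(0.75, 0.10, 500)        # prosecution hypothesis Hp
diff_writer_scores = rng.normal(0.45, 0.12, 500)        # defense hypothesis Hd

f_p = gaussian_kde(same_writer_scores)                   # score density under Hp
f_d = gaussian_kde(diff_writer_scores)                   # score density under Hd

def likelihood_ratio(score):
    """LR = f(score | Hp) / f(score | Hd)."""
    return float(f_p(score) / f_d(score))

for s in (0.40, 0.60, 0.80):
    print(f"score {s:.2f}: LR = {likelihood_ratio(s):.2f}")
```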

Keywords: confidence interval, handwriting, kernel density estimator, KDE, logistic regression LoR, repeatability, reproducibility

Procedia PDF Downloads 97
325 Culture Dimensions of Information Systems Security in Saudi Arabia National Health Services

Authors: Saleh Alumaran, Giampaolo Bella, Feng Chen

Abstract:

The study of organisations' information security cultures has attracted scholars as well as the healthcare services industry to research the topic and find appropriate tools and approaches to develop a positive culture. The vast majority of studies on the Saudi national health services concern the use of technology to protect and secure health services information. On the other hand, there is a lack of research on the role and impact of an organisation's cultural dimensions on information security. This research investigated and analysed the role and impact of cultural dimensions on information security in the Saudi Arabian health service. Hypotheses were tested and two surveys were carried out in order to collect data and information from three major hospitals in Saudi Arabia (SA). The first survey identified the main cultural-dimension problems in SA health services and developed an initial information security culture framework model. The second survey evaluated and tested the developed framework model to assess its usefulness, reliability and applicability. The model is based on human behaviour theory, where the individual's attitude is the key element of the individual's intention to behave as well as of his or her actual behaviour. The research identified six cultural dimensions: Saudi national culture, Saudi health service leadership, employees' trust, technology, multicultural interactions and employees' job roles. The research also identified a set of cultural sub-dimensions. These include working values and norms, tribe values and norms, attitudes towards women, power sharing, vision, social interaction, respect and understanding, the hospital intranet, the language(s) used by hospital employees, multi-national culture, the communication system, employees' job satisfaction and job security. The research found that (a) human behaviour towards medical information in SA is one of the main threats to information security and one of the main challenges for the SA health authority, (b) the current state of SA hospitals' information security cultures falls short in protecting medical information due to the current values and norms towards information security, and (c) Saudi national culture and employees' job role are the main dimensions playing major roles in employees' attitudes, while technology is the least important dimension playing a role in employees' attitudes.

Keywords: cultural dimension, electronic health record, information security, privacy

Procedia PDF Downloads 332
324 Treating Voxels as Words: Word-to-Vector Methods for fMRI Meta-Analyses

Authors: Matthew Baucum

Abstract:

With the increasing popularity of fMRI as an experimental method, psychology and neuroscience can greatly benefit from advanced techniques for summarizing and synthesizing large amounts of data from brain imaging studies. One promising avenue is automated meta-analyses, in which natural language processing methods are used to identify the brain regions consistently associated with certain semantic concepts (e.g. “social”, “reward’) across large corpora of studies. This study builds on this approach by demonstrating how, in fMRI meta-analyses, individual voxels can be treated as vectors in a semantic space and evaluated for their “proximity” to terms of interest. In this technique, a low-dimensional semantic space is built from brain imaging study texts, allowing words in each text to be represented as vectors (where words that frequently appear together are near each other in the semantic space). Consequently, each voxel in a brain mask can be represented as a normalized vector sum of all of the words in the studies that showed activation in that voxel. The entire brain mask can then be visualized in terms of each voxel’s proximity to a given term of interest (e.g., “vision”, “decision making”) or collection of terms (e.g., “theory of mind”, “social”, “agent”), as measured by the cosine similarity between the voxel’s vector and the term vector (or the average of multiple term vectors). Analysis can also proceed in the opposite direction, allowing word cloud visualizations of the nearest semantic neighbors for a given brain region. This approach allows for continuous, fine-grained metrics of voxel-term associations, and relies on state-of-the-art “open vocabulary” methods that go beyond mere word-counts. An analysis of over 11,000 neuroimaging studies from an existing meta-analytic fMRI database demonstrates that this technique can be used to recover known neural bases for multiple psychological functions, suggesting this method’s utility for efficient, high-level meta-analyses of localized brain function. While automated text analytic methods are no replacement for deliberate, manual meta-analyses, they seem to show promise for the efficient aggregation of large bodies of scientific knowledge, at least on a relatively general level.
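A toy sketch of the voxel-as-vector idea (tiny made-up word embeddings and study word lists; a real analysis would learn the semantic space from the corpus of study texts):

```python
import numpy as np

# Toy word embeddings (in practice, learned from the corpus of study texts).
emb = {
    "reward":   np.array([0.9, 0.1, 0.0]),
    "social":   np.array([0.1, 0.9, 0.2]),
    "vision":   np.array([0.0, 0.1, 0.9]),
    "decision": np.array([0.7, 0.3, 0.1]),
}

def normalize(v):
    return v / np.linalg.norm(v)

def voxel_vector(words_in_activating_studies):
    """Normalized sum of word vectors from all studies that activated this voxel."""
    return normalize(sum(emb[w] for w in words_in_activating_studies))

def cosine(u, v):
    return float(np.dot(normalize(u), normalize(v)))

voxel = voxel_vector(["reward", "decision", "reward"])    # e.g. a toy reward-related voxel
print("proximity to 'reward':", round(cosine(voxel, emb["reward"]), 3))
print("proximity to 'vision':", round(cosine(voxel, emb["vision"]), 3))
```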

Keywords: FMRI, machine learning, meta-analysis, text analysis

Procedia PDF Downloads 424
323 Detection of Powdery Mildew Disease in Strawberry Using Image Texture and Supervised Classifiers

Authors: Sultan Mahmud, Qamar Zaman, Travis Esau, Young Chang

Abstract:

Strawberry powdery mildew (PM) is a serious disease that has a significant impact on strawberry production. Field scouting is still the major way to find PM disease, which is not only labor intensive but also makes it almost impossible to monitor disease severity. To reduce the loss caused by PM disease and achieve faster automatic detection of the disease, this paper proposes an approach for detection of the disease based on image texture, classified with support vector machines (SVMs) and k-nearest neighbors (kNNs). The methodology of the proposed study is based on image processing, which is composed of five main steps: image acquisition, pre-processing, segmentation, feature extraction and classification. Two strawberry fields were used in this study. Images of healthy leaves and of leaves infected with PM (Sphaerotheca macularis) were acquired under artificial cloud lighting conditions. Colour thresholding was utilized to segment all images before textural analysis. The colour co-occurrence matrix (CCM) was introduced for the extraction of textural features. Forty textural features, related to physiological parameters of the leaves, were extracted from the CCMs of the National Television System Committee (NTSC) luminance and the hue, saturation and intensity (HSI) images. The normalized feature data were utilized for training and validation with the developed classifiers. The classifiers were evaluated using internal, external and cross-validation. The best classifier was selected based on performance and accuracy. Experimental results suggested that the SVM classifier showed 98.33%, 85.33%, 87.33%, 93.33% and 95.0% accuracy on internal, external-I, external-II, 4-fold cross and 5-fold cross-validation, respectively, whereas the kNN results showed 90.0%, 72.00%, 74.66%, 89.33% and 90.3% classification accuracy, respectively. The outcome of this study demonstrated that the SVM classified PM disease with the highest overall accuracy of 91.86% and 1.1211 seconds of processing time. Therefore, the overall results indicate that the proposed approach can significantly support accurate and automatic identification and recognition of strawberry PM disease with the SVM classifier.
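As a simplified sketch of the texture-plus-classifier stage (gray-level co-occurrence features from scikit-image and an SVM from scikit-learn on synthetic patches, rather than the paper's colour co-occurrence matrices on field images; function names assume a recent scikit-image release):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # 'greycomatrix' in older scikit-image
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def texture_features(patch):
    """Contrast/homogeneity/energy/correlation from a grey-level co-occurrence matrix."""
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=64, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel()
                      for p in ("contrast", "homogeneity", "energy", "correlation")])

def synthetic_patch(infected):
    """Toy 32x32 patches: 'infected' patches get a brighter, blotchier texture."""
    base = rng.integers(10, 40, (32, 32))
    if infected:
        base = base + rng.integers(0, 24, (32, 32))      # powdery, high-variance overlay
    return np.clip(base, 0, 63).astype(np.uint8)

X = np.array([texture_features(synthetic_patch(i % 2)) for i in range(200)])
y = np.array([i % 2 for i in range(200)])                 # 0 = healthy, 1 = infected

scores = cross_val_score(SVC(kernel="rbf", C=10.0, gamma="scale"), X, y, cv=5)
print("5-fold accuracy:", scores.round(3))
```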

Keywords: powdery mildew, image processing, textural analysis, color co-occurrence matrix, support vector machines, k-nearest neighbors

Procedia PDF Downloads 101
322 An Approximate Formula for Calculating the Fundamental Mode Period of Vibration of Practical Building

Authors: Abdul Hakim Chikho

Abstract:

Most international codes allow the use of an equivalent lateral load method for designing practical buildings to withstand earthquake actions. This method requires calculating an approximation to the fundamental mode period of vibration of these buildings. Several empirical equations have been suggested to calculate approximations to the fundamental periods of different types of structures. Most of these equations are known to provide only a crude approximation to the required fundamental periods, and repeating the calculation utilizing a more accurate formula is usually required. In this paper, a new formula to calculate a satisfactory approximation of the fundamental period of a practical building is proposed. This formula takes into account the mass and the stiffness of the building and is therefore more logical than the conventional empirical equations. In order to verify the accuracy of the proposed formula, several examples have been solved, in which the fundamental mode periods of several framed buildings were calculated using the proposed formula and the conventional empirical equations. Comparing the obtained results with those obtained from a dynamic computer analysis has shown that the proposed formula provides a more accurate estimation of the fundamental periods of practical buildings. Since the proposed method is still simple to use and requires only a minimum of computing effort, it is believed to be ideally suited for design purposes.
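For context (this is the classical Rayleigh-type approximation that likewise combines mass and stiffness, not the formula proposed in the paper), the fundamental period can be estimated from storey weights w_i, equivalent lateral forces f_i and the resulting elastic displacements δ_i as:

```latex
T \approx 2\pi \sqrt{\dfrac{\sum_{i=1}^{n} w_i\,\delta_i^{2}}{g\sum_{i=1}^{n} f_i\,\delta_i}} ,
```

where g is the acceleration due to gravity and n the number of storeys.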

Keywords: earthquake, fundamental mode period, design, building

Procedia PDF Downloads 259