Search results for: vector information
10660 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton
Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani
Abstract:
Extreme data in an observation can occur due to unusual circumstances during the observation. Such data can provide important information that cannot be provided by other data, so their existence needs to be investigated further. One method for obtaining extreme data is the block maxima method. The distribution of extreme data sets taken with the block maxima method is called the extreme value distribution; here it is the Gumbel distribution with two parameters. The maximum likelihood (ML) estimates of the Gumbel parameters cannot be determined exactly, so a numerical approximation is necessary. The purpose of this study was to determine the parameter estimates of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has the form of a double exponential function. The quasi-Newton BFGS method is a development of the Newton method. The Newton method uses the second derivative to calculate the change in parameter values at each iteration. Newton's method is then modified with the addition of a step length to guarantee convergence when the second derivative requires complex calculations. In the quasi-Newton BFGS method, Newton's method is further modified by updating an approximation of the second derivative at each iteration. The parameter estimation of the Gumbel distribution by a numerical approach using the quasi-Newton BFGS method is done by calculating the parameter values that maximize the likelihood function. This method requires the gradient vector and the Hessian matrix. This research combines theory and application, drawing on several journals and textbooks. The results of this study are the quasi-Newton BFGS algorithm and the estimates of the Gumbel distribution parameters. The estimation method is then applied to daily rainfall data in Purworejo District to estimate the distribution parameters. The estimates indicate that the high rainfall that occurred in Purworejo District decreased in intensity and that the range of rainfall that occurred decreased.
Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton
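A minimal Python sketch of the estimation step described above, assuming SciPy/NumPy and a synthetic block-maxima sample in place of the Purworejo rainfall data: the Gumbel negative log-likelihood and its gradient are supplied to a BFGS optimizer, which maintains its own approximation of the Hessian. The scale parameter is optimized on the log scale so it stays positive.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Synthetic block-maxima sample standing in for annual maximum daily rainfall (mm).
x = rng.gumbel(loc=80.0, scale=15.0, size=50)

def neg_loglik(theta):
    """Negative Gumbel log-likelihood; beta parameterized on the log scale."""
    mu, log_beta = theta
    beta = np.exp(log_beta)
    z = (x - mu) / beta
    return np.sum(np.log(beta) + z + np.exp(-z))

def neg_grad(theta):
    """Analytical gradient of the negative log-likelihood w.r.t. mu and log(beta)."""
    mu, log_beta = theta
    beta = np.exp(log_beta)
    z = (x - mu) / beta
    d_mu = np.sum(-1.0 / beta + np.exp(-z) / beta)
    d_logbeta = np.sum(1.0 - z + z * np.exp(-z))
    return np.array([d_mu, d_logbeta])

res = minimize(neg_loglik, x0=np.array([x.mean(), np.log(x.std())]),
               jac=neg_grad, method="BFGS")
mu_hat, beta_hat = res.x[0], np.exp(res.x[1])
print(f"mu = {mu_hat:.2f}, beta = {beta_hat:.2f}")
```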
Procedia PDF Downloads 326
10659 Comparison of the Logistic and the Gompertz Growth Functions Considering a Periodic Perturbation in the Model Parameters
Authors: Avan Al-Saffar, Eun-Jin Kim
Abstract:
Both the logistic growth model and the Gompertz growth model are used to describe growth processes. Both models, driven by perturbations in different cases, are investigated using information theory as a useful measure of sustainability and variability. Specifically, we study the effect of different oscillatory modulations in the system's parameters on the evolution of the system and its Probability Density Function (PDF). We show that the initial conditions are maintained for a long time. We offer a Fisher information analysis in positive and/or negative feedback and explain its implications for the sustainability of population dynamics. We also display a finite-amplitude solution due to the purely fluctuating growth rate, whereas the periodic fluctuations in negative feedback can lead to a breakdown of the system's self-regulation with an exponentially growing solution. In the cases tested, the Gompertz and logistic systems show similar behaviour in terms of information and sustainability, although they develop differently in time.
Keywords: dynamical systems, Fisher information, probability density function (PDF), sustainability
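As a hedged illustration (the parameter values and the sinusoidal modulation form are assumptions, not the paper's settings), the following sketch integrates the logistic and Gompertz models with a periodically perturbed growth rate using SciPy; the resulting trajectories could then feed a PDF or Fisher-information analysis.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical values; the growth rate is modulated as r(t) = r0 * (1 + eps * sin(w * t)).
r0, K, eps, w = 1.0, 1.0, 0.5, 2 * np.pi

def logistic(t, x):
    r = r0 * (1 + eps * np.sin(w * t))
    return r * x * (1 - x / K)

def gompertz(t, x):
    r = r0 * (1 + eps * np.sin(w * t))
    return r * x * np.log(K / x)

t_span, x0 = (0.0, 20.0), [0.01]
t_eval = np.linspace(*t_span, 2000)
sol_log = solve_ivp(logistic, t_span, x0, t_eval=t_eval)
sol_gom = solve_ivp(gompertz, t_span, x0, t_eval=t_eval)
# Late-time population levels under the periodic perturbation.
print(sol_log.y[0, -1], sol_gom.y[0, -1])
```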
Procedia PDF Downloads 433
10658 Econometric Analysis of West African Countries’ Container Terminal Throughput and Gross Domestic Products
Authors: Kehinde Peter Oyeduntan, Kayode Oshinubi
Abstract:
West African ports have been experiencing a large inflow and outflow of containerized cargo in the last decades, and this has created a quest amongst the countries to attain the status of hub port for the sub-region. This study analyzed the relationship between the container throughput and Gross Domestic Products (GDP) of nine West African countries, using Simple Linear Regression (SLR), a Polynomial Regression Model (PRM) and Support Vector Machines (SVM) with a time series of 20 years. The results showed that there exists a high correlation between GDP and container throughput. The models were also used to predict the container throughput in West Africa for the next 20 years. The findings and recommendations presented in this research will guide policy makers and help improve the management of container ports and terminals in West Africa, thereby boosting the economy.
Keywords: container, ports, terminals, throughput
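A small scikit-learn sketch of the three regression techniques named above, using made-up GDP and throughput figures rather than the study's data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR

# Hypothetical annual GDP (billion USD) and container throughput (thousand TEU) for one country.
gdp = np.array([[10], [12], [13], [15], [18], [21], [25], [28], [30], [34]], dtype=float)
teu = np.array([120, 150, 160, 200, 240, 280, 350, 380, 410, 470], dtype=float)

slr = LinearRegression().fit(gdp, teu)                                              # simple linear regression
prm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(gdp, teu) # polynomial regression
svm = SVR(kernel="rbf", C=1000, gamma=0.01).fit(gdp, teu)                           # support vector regression

future_gdp = np.array([[40.0]])
for name, model in [("SLR", slr), ("PRM", prm), ("SVM", svm)]:
    # Forecast for a hypothetical future GDP, plus in-sample R^2.
    print(name, model.predict(future_gdp)[0], model.score(gdp, teu))
```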
Procedia PDF Downloads 215
10657 An Optimal Control Model for the Dynamics of Visceral Leishmaniasis
Authors: Ibrahim M. Elmojtaba, Rayan M. Altayeb
Abstract:
Visceral leishmaniasis (VL) is a vector-borne disease caused by a protozoan parasite of the genus Leishmania. The transmission of the parasite to humans and animals occurs via the bite of adult female sandflies previously infected by biting and sucking the blood of infectious humans or animals. In this paper we use a previously proposed model and apply two optimal controls, namely treatment and vaccination, to that model to investigate optimal strategies for controlling the spread of the disease using treatment and vaccination as the system control variables. The possible impact of using combinations of the two controls, either one at a time or both at a time, on the spread of the disease is also examined. Our results provide a framework for vaccination and treatment strategies to reduce susceptible and infected individuals of VL within five years.
Keywords: visceral leishmaniasis, treatment, vaccination, optimal control, numerical simulation
Procedia PDF Downloads 404
10656 Road Vehicle Recognition Using Magnetic Sensing Feature Extraction and Classification
Authors: Xiao Chen, Xiaoying Kong, Min Xu
Abstract:
This paper presents a road vehicle detection approach for the intelligent transportation system. The approach mainly uses a low-cost magnetic sensor and an associated data collection system to collect magnetic signals. This system can measure changes in the magnetic field, and it can also detect and count vehicles. We extend Mel Frequency Cepstral Coefficients to analyze vehicle magnetic signals. Vehicle type features are extracted using the cepstrum representation, frame energy, and gap cepstrum of the magnetic signals. We design a 2-dimensional map algorithm using Vector Quantization to classify vehicle magnetic features into four typical types of vehicles in Australian suburbs: sedan, van, truck, and bus. Experimental results show that our approach achieves a high level of accuracy for vehicle detection and classification.
Keywords: vehicle classification, signal processing, road traffic model, magnetic sensing
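A rough sketch of the cepstral-feature and Vector Quantization pipeline, with synthetic signals standing in for real magnetic signatures; the frame length, codebook size, and the two example classes are assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans

def cepstral_features(signal, frame_len=64, n_ceps=12):
    """Frame the signal, then compute real cepstrum coefficients and log frame energy per frame."""
    feats = []
    for i in range(0, len(signal) - frame_len + 1, frame_len):
        frame = signal[i:i + frame_len]
        spectrum = np.abs(np.fft.rfft(frame)) + 1e-12
        cepstrum = np.fft.irfft(np.log(spectrum))
        energy = np.log(np.sum(frame ** 2) + 1e-12)
        feats.append(np.concatenate([cepstrum[:n_ceps], [energy]]))
    return np.array(feats)

def train_codebooks(signals_by_class, codebook_size=8):
    """One k-means codebook (vector quantizer) per vehicle class."""
    return {label: KMeans(n_clusters=codebook_size, n_init=10).fit(cepstral_features(np.concatenate(sigs)))
            for label, sigs in signals_by_class.items()}

def classify(signal, codebooks):
    """Assign the class whose codebook gives the lowest average quantization distortion."""
    feats = cepstral_features(signal)
    distortions = {label: -km.score(feats) / len(feats) for label, km in codebooks.items()}
    return min(distortions, key=distortions.get)

# Hypothetical usage with synthetic magnetic signatures for two classes.
rng = np.random.default_rng(1)
train = {"sedan": [rng.normal(0, 1, 4096)], "truck": [rng.normal(0, 3, 4096)]}
books = train_codebooks(train)
print(classify(rng.normal(0, 3, 1024), books))
```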
Procedia PDF Downloads 320
10655 A Qualitative Exploration of How Brazilian Immigrant Mothers Living in the United States Obtain Information about Physical Activity and Screen-Viewing for Their Young Children
Authors: Ana Cristina Lindsay, Mary L. Greaney
Abstract:
Background: Racial/ethnic minority children of low-income immigrant families remain at increased risk of obesity. Consistent with the high rates of childhood obesity among racial/ethnic minority children are high rates of physical inactivity and increased levels of sedentary behaviors (e.g., TV and other screen viewing). Brazilians comprise a fast-growing immigrant population group in the US, yet little research has focused on the health issues affecting Brazilian immigrant children. The purpose of this qualitative study was to explore how Brazilian-born immigrant mothers living in the United States obtain information about physical activity and screen-time for their young children. Methods: Qualitative research including focus groups with Brazilian immigrant mothers of preschool-age children living in the U.S. Results: Results revealed that Brazilian immigrant mothers obtain information on young children's physical activity and screen-time from a variety of sources including interpersonal communication, television and magazines, government health care programs (WIC program) and professionals (e.g., nurses and pediatricians). A noteworthy finding is the significant role of foreign information sources (Brazilian TV shows and magazines) in mothers' access to information about these early behaviors. Future research is needed to quantify and better understand Brazilian parents' access to accurate and sound information related to young children's physical activity and screen-viewing behaviors. Conclusions: To our knowledge, no existing research has examined how Brazilian immigrant mothers living in the United States obtain information about these behaviors. This information is crucial for the design of culturally appropriate early childhood obesity prevention interventions tailored to the specific needs of this ethnic group.
Keywords: physical activity, screen-time, information, immigrant, mothers, Brazilian, United States
Procedia PDF Downloads 275
10654 The Impact of Malicious Attacks on the Performance of Routing Protocols in Mobile Ad-Hoc Networks
Authors: Habib Gorine, Rabia Saleh
Abstract:
Mobile Ad-Hoc Networks are a special type of wireless networks that share common security requirements with other networks, such as confidentiality, integrity, authentication, and availability, which need to be addressed in order to secure data transfer through the network. Their routing protocols are vulnerable to various malicious attacks which could have a devastating consequence on data security. In this paper, three types of attacks, namely selfish, gray hole, and black hole attacks, are applied to the two most important routing protocols in MANETs, Dynamic Source Routing and Ad-hoc On-demand Distance Vector, in order to analyse and compare the impact of these attacks on network performance in terms of throughput, average delay, packet loss, and energy consumption using the NS2 simulator.
Keywords: MANET, wireless networks, routing protocols, malicious attacks, wireless networks simulation
Procedia PDF Downloads 321
10653 Information Theoretic Approach for Beamforming in Wireless Communications
Authors: Syed Khurram Mahmud, Athar Naveed, Shoaib Arif
Abstract:
Beamforming is a signal processing technique extensively utilized in wireless communications and radars for desired signal intensification and interference signal minimization through spatial selectivity. In this paper, we present a method for calculating optimal weight vectors for a smart antenna array, to achieve a directive pattern during transmission and selective reception in an interference-prone environment. In the proposed scheme, Mutual Information (MI) extrema are evaluated through an energy-constrained objective function, which is based on a-priori information of the interference source and the desired array factor. Signal to Interference plus Noise Ratio (SINR) performance is evaluated for both transmission and reception. In our scheme, MI is presented as an index to identify the trade-off between information gain, SINR, illumination time and spatial selectivity in an energy-constrained optimization problem. The employed method yields lower computational complexity, which is demonstrated through a comparative analysis with conventional methods in use. MI-based beamforming offers enhancement of signal integrity in degraded environments while reducing computational intricacy and correlating key performance indicators.
Keywords: beamforming, interference, mutual information, wireless communications
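The MI-based objective itself is not reproduced here; the following NumPy sketch only illustrates the conventional building blocks such a scheme is compared against: a uniform-linear-array steering vector, an interference-plus-noise covariance built from a-priori interference information, a minimum-variance weight vector, and the resulting output SINR (all array geometry and power values are assumptions).

```python
import numpy as np

def steering(theta_deg, n_elem=8, d=0.5):
    """ULA steering vector for element spacing d in wavelengths."""
    n = np.arange(n_elem)
    return np.exp(-2j * np.pi * d * n * np.sin(np.deg2rad(theta_deg)))

n_elem = 8
a_s = steering(10.0, n_elem)           # desired signal direction (assumed)
a_i = steering(-40.0, n_elem)          # interference direction (a-priori information)
sigma_s2, sigma_i2, sigma_n2 = 1.0, 10.0, 0.1

# Interference-plus-noise covariance matrix.
R_in = sigma_i2 * np.outer(a_i, a_i.conj()) + sigma_n2 * np.eye(n_elem)

# Minimum-variance weights steered at the desired direction (unit-gain constraint).
R_inv = np.linalg.inv(R_in)
w = R_inv @ a_s / (a_s.conj() @ R_inv @ a_s)

sinr = sigma_s2 * np.abs(w.conj() @ a_s) ** 2 / np.real(w.conj() @ R_in @ w)
print(f"output SINR: {10 * np.log10(sinr):.1f} dB")
```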
Procedia PDF Downloads 281
10652 Post-Earthquake Road Damage Detection by SVM Classification from Quickbird Satellite Images
Authors: Moein Izadi, Ali Mohammadzadeh
Abstract:
Detection of damaged parts of roads after an earthquake is essential for coordinating rescuers. In this study, an approach is presented for the semi-automatic detection of damaged roads in a city using pre-event vector maps and both pre- and post-earthquake QuickBird satellite images. Damage is defined in this study as the debris of damaged buildings adjacent to the roads. Some spectral and texture features are considered in an SVM classification step to detect damage. Finally, the proposed method is tested on QuickBird pan-sharpened images from the Bam City earthquake, and the results show that an overall accuracy of 81% and a kappa coefficient of 0.71 are achieved for the damage detection. The obtained results indicate the efficiency and accuracy of the proposed approach.
Keywords: SVM classifier, disaster management, road damage detection, QuickBird images
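A compact sketch of the classification and evaluation step, assuming scikit-learn and synthetic spectral/texture features in place of the QuickBird-derived ones; it reports the same two metrics quoted above (overall accuracy and the kappa coefficient):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Hypothetical feature matrix: spectral/texture features per road segment; 1 = damaged, 0 = intact.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 6))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.8, size=400) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=10, gamma="scale").fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("overall accuracy:", accuracy_score(y_te, pred))
print("kappa coefficient:", cohen_kappa_score(y_te, pred))
```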
Procedia PDF Downloads 623
10651 A Conceptual Framework for Assessing the Development of Health Information Systems Enterprise Architecture Interoperability
Authors: Prosper Tafadzwa Denhere, Ephias Ruhode, Munyaradzi Zhou
Abstract:
Health Information Systems (HISs) interoperability is emerging as the future of modern healthcare systems Enterprise Architecture (EA), in which healthcare entities are seamlessly interconnected to share healthcare data. The reality that the healthcare industry has been characterised by an influx of fragmented stand-alone e-Health systems, which present challenges for healthcare information sharing across platforms, demands attention to systems integration efforts. The lack of an EA conceptual framework consequently creates the need to investigate an ideal solution for assessing the development of Health Information Systems interoperability. The study takes a qualitative exploratory approach within a design science research context. The research aims to study, through a literature review, the various themes drawn from the literature that can help in the assessment of interoperable HIS development. Themes derived from the study include HIS needs, HIS readiness, HIS constraints, and HIS technology integration elements and standards tied to the EA development architectural layers of The Open Group Architecture Framework (TOGAF) as an EA development methodology. Eventually, the themes were conceptualised into a framework reviewed by two experts. The essence of the study was to provide a framework within which interoperable EA of HISs should be developed.
Keywords: enterprise architecture, eHealth, health information systems, interoperability
Procedia PDF Downloads 108
10650 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups
Authors: Naushad Mamode Khan
Abstract:
The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs) that handles equi-, over- and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify and thus restricts likelihood-based estimation methodology. The joint generalized quasi-likelihood approach (GQL-I) was instead considered but is rather computationally intensive and may not even estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through some simulation experiments and is shown to yield estimates as efficient as GQL-I while being far more computationally stable.
Keywords: longitudinal, Com-Poisson, ill-conditioned, INAR(1), GLMs, GQL
Procedia PDF Downloads 355
10649 IoT-Based Interactive Patient Identification and Safety Management System
Authors: Jonghoon Chun, Insung Kim, Jonghyun Lim, Gun Ro
Abstract:
We believe that it is possible to provide a solution to reduce patient safety accidents by displaying correct medical records and prescription information through interactive patient identification. Our system is based on the use of smart bands worn by patients; these bands communicate with hybrid gateways that understand both BLE and Wi-Fi communication protocols. Through the convergence of Bluetooth Low Energy (BLE), one of the short-range wireless communication technologies, and hybrid gateway technology, we implement an 'Intelligent Patient Identification and Location Tracking System' to prevent the medical errors that frequently occur in medical institutions. Based on big data and IoT technology using MongoDB, smart bands (with BLE and NFC functions) and hybrid gateways, we develop a system that enables two-way communication between medical staff and hospitalized patients as well as storage of the patients' locational information in minutes. Based on the precise information provided by the big data systems, such as location tracking and movement of in-hospital patients wearing smart bands, our findings include the fact that a patient-specific location tracking algorithm can operate HIS (Hospital Information System) and other related systems more efficiently. Through the system, we can always correctly identify patients using identification tags. In addition, the system automatically determines whether the patient is scheduled for a medical service in the system in use at the medical institution, and presents the appropriateness of the medical treatment and the medical information (medical record and prescription information) on screen and by voice. This work was supported in part by the Korea Technology and Information Promotion Agency for SMEs (TIPA) grant funded by the Korean Small and Medium Business Administration (No. S2410390).
Keywords: BLE, hybrid gateway, patient identification, IoT, safety management, smart band
Procedia PDF Downloads 311
10648 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection
Authors: Mahshid Arabi
Abstract:
With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but at the same time, vulnerabilities and security threats have increased significantly. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. This study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews with 30 information security experts were conducted to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine learning-based intrusion detection systems, and network security protocols. AES and RSA encryption algorithms were used for data protection, and machine learning models such as Random Forest and Neural Networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, T-test and ANOVA statistical tests were employed, and results were measured using the accuracy, sensitivity, and specificity of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings of this study indicate that the proposed comprehensive model reduced cyber-attacks by an average of 85%. Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on the obtained results, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as an effective tool in reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and continuous training of human resources.
Keywords: data protection, digital technologies, information security, modern management
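A hedged sketch of two elements of the proposed model, assuming the Python cryptography and scikit-learn packages and synthetic traffic features: Fernet provides AES-based symmetric encryption, and a Random Forest stands in for the intrusion detection component.

```python
import numpy as np
from cryptography.fernet import Fernet
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Symmetric encryption of a sensitive record (Fernet uses AES under the hood).
key = Fernet.generate_key()
token = Fernet(key).encrypt(b"customer_id=123; record=confidential")
print(Fernet(key).decrypt(token))

# Machine-learning-based intrusion detection on hypothetical network-flow features.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8))                  # e.g. packet rate, bytes, duration, flag counts
y = (X[:, 1] - X[:, 4] + rng.normal(scale=0.5, size=1000) > 1).astype(int)  # 1 = attack

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)
ids = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_tr, y_tr)
print("detection accuracy:", accuracy_score(y_te, ids.predict(X_te)))
```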
Procedia PDF Downloads 33
10647 A Machine Learning Approach for Assessment of Tremor: A Neurological Movement Disorder
Authors: Rajesh Ranjan, Marimuthu Palaniswami, A. A. Hashmi
Abstract:
With the changing lifestyle and environment around us, the prevalence of critical and incurable diseases has proliferated. One such condition is neurological disorder, which is rampant among the old-age population and is increasing at an unstoppable rate. Most neurological disorder patients suffer from some movement disorder affecting the movement of their body parts. Tremor is the most common movement disorder, prevalent in such patients, that affects the upper or lower limbs or both extremities. Tremor symptoms are commonly visible in Parkinson's disease patients, and tremor can also occur on its own (essential tremor). Patients suffering from tremor face enormous trouble in performing daily activities, and they always need a caretaker for assistance. In clinics, the assessment of tremor is done through a manual clinical rating task such as the Unified Parkinson's Disease Rating Scale, which is time-consuming and cumbersome. Neurologists have also affirmed a challenge in differentiating a Parkinsonian tremor from a pure tremor, which is essential for providing an accurate diagnosis. Therefore, there is a need to develop a monitoring and assistive tool for tremor patients that keeps checking their health condition by coordinating with clinicians and caretakers for early diagnosis and assistance in performing daily activities. In our research, we focus on developing a system for automatic classification of tremor which can accurately differentiate pure tremor from Parkinsonian tremor using a wearable accelerometer-based device, so that an adequate diagnosis can be provided to the correct patient. In this research, a study was conducted in a neuro-clinic to assess the upper wrist movement of patients suffering from pure (essential) tremor and Parkinsonian tremor using a wearable accelerometer-based device. Four tasks were designed in accordance with the Unified Parkinson's Disease Rating Scale, which is used to assess rest, postural, intentional and action tremor in such patients. Various features such as time-frequency domain, wavelet-based and fast-Fourier-transform-based cross-correlation features were extracted from the tri-axial signal and used as the input feature vector space for different supervised and unsupervised learning tools for quantification of the severity of tremor. A minimum covariance maximum correlation energy comparison index was also developed and used as the input feature for various classification tools for distinguishing the PT and ET tremor types. An automatic system for efficient classification of tremor was developed using these feature extraction methods, and superior performance was achieved using K-nearest neighbors and Support Vector Machine classifiers, respectively.
Keywords: machine learning approach for neurological disorder assessment, automatic classification of tremor types, feature extraction method for tremor classification, neurological movement disorder, Parkinsonian tremor, essential tremor
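An illustrative sketch of accelerometer feature extraction and KNN/SVM classification; the feature set, sampling rate, and the tremor-frequency ranges used to generate the synthetic recordings are coarse assumptions, not the study's protocol:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def tremor_features(acc, fs=100):
    """Simple per-recording features from a tri-axial accelerometer window (n_samples x 3)."""
    mag = np.linalg.norm(acc, axis=1)
    spectrum = np.abs(np.fft.rfft(mag - mag.mean()))
    freqs = np.fft.rfftfreq(len(mag), 1 / fs)
    dom_freq = freqs[np.argmax(spectrum)]          # dominant tremor frequency
    return np.array([mag.std(), dom_freq, spectrum.max(), np.mean(np.abs(np.diff(mag)))])

rng = np.random.default_rng(3)
def synth(freq):
    """Synthetic tri-axial recording dominated by a single tremor frequency."""
    t = np.arange(0, 10, 1 / 100)
    base = np.sin(2 * np.pi * freq * t)
    return np.stack([base + rng.normal(0, 0.2, t.size) for _ in range(3)], axis=1)

# Assumed (coarse) frequency bands: ~6-8 Hz for essential, ~4-6 Hz for Parkinsonian tremor.
X = np.array([tremor_features(synth(rng.uniform(6, 8))) for _ in range(30)] +
             [tremor_features(synth(rng.uniform(4, 6))) for _ in range(30)])
y = np.array([0] * 30 + [1] * 30)                  # 0 = essential, 1 = Parkinsonian

for clf in (KNeighborsClassifier(n_neighbors=5), SVC(kernel="rbf", gamma="scale")):
    print(type(clf).__name__, cross_val_score(clf, X, y, cv=5).mean())
```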
Procedia PDF Downloads 154
10646 The Impact of Bitcoin on Stock Market Performance
Authors: Oliver Takawira, Thembi Hope
Abstract:
This study will analyse the relationship between Bitcoin price movements and the Johannesburg Stock Exchange (JSE). The aim is to determine whether Bitcoin price movements affect stock market performance. As cryptocurrencies continue to gain prominence as a safe asset during periods of economic distress, this raises the question of whether Bitcoin's prosperity could affect investment in the stock market. To identify the existence of a short-run and long-run linear relationship, the study will apply the Autoregressive Distributed Lag (ARDL) bounds test and a Vector Error Correction Model (VECM) after testing the data for unit roots and cointegration using the Augmented Dickey-Fuller (ADF) and Phillips-Perron (PP) tests. The Nonlinear Autoregressive Distributed Lag (NARDL) model will then be used to check whether there is a non-linear relationship between Bitcoin prices and stock market prices.
Keywords: bitcoin, stock market, interest rates, ARDL
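A brief statsmodels sketch of the preliminary unit-root and cointegration checks that precede the ARDL/VECM modelling, run here on simulated series rather than actual Bitcoin and JSE data (statsmodels also provides a VECM class for the error-correction step):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, coint

# Hypothetical daily closing series; in the study these would be Bitcoin prices and a JSE index.
rng = np.random.default_rng(7)
n = 500
btc = pd.Series(np.cumsum(rng.normal(0, 1, n)) + 100, name="btc")
jse = pd.Series(0.4 * btc + np.cumsum(rng.normal(0, 0.5, n)) + 50, name="jse")

# Unit-root tests (Augmented Dickey-Fuller) on the level series.
for s in (btc, jse):
    stat, pvalue, *_ = adfuller(s)
    print(s.name, "ADF p-value:", round(pvalue, 3))

# Engle-Granger cointegration test between the two level series.
t_stat, p_value, _ = coint(btc, jse)
print("cointegration p-value:", round(p_value, 3))
```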
Procedia PDF Downloads 107
10645 A Study of Evolving Cloud Computing Data Security: A Machine Learning Perspective
Authors: Shinoy Vengaramkode Bhaskaran
Abstract:
The advancement of cloud computing has led to a variety of security issues for both consumers and industries. Machine learning (ML) is one approach to securing cloud-based systems, and various methods have been employed to prevent or detect attacks and security vulnerabilities in the cloud using ML techniques. In this paper, we present an ML perspective on the methodologies and techniques of cloud security. Initially, an investigative study on cloud computing is conducted, with a primary emphasis on the gaps that are impeding the adoption of cloud technology and the challenges associated with threat solutions, framed as two research questions. Next, some ideas are generated based on machine learning methods to mitigate certain types of frequently discussed attacks through the application of ML techniques. Finally, we review different machine learning algorithms and their adoption in cloud computing.
Keywords: artificial intelligence, machine learning, cloud computing infrastructure as a service, support vector machine, platform as a service
Procedia PDF Downloads 10
10644 Hardware Implementation for the Contact Force Reconstruction in Tactile Sensor Arrays
Authors: María-Luisa Pinto-Salamanca, Wilson-Javier Pérez-Holguín
Abstract:
Reconstruction of contact forces is a fundamental technique for analyzing the properties of a touched object and is essential for regulating the grip force in slip control loops. It is based on processing the distribution, intensity, and direction of the forces captured by the sensors. Currently, efficient hardware alternatives are being used more frequently in different fields of application, allowing the implementation of computationally complex algorithms, as is the case with tactile signal processing. The use of hardware for smart tactile sensing systems is a research area that promises to improve the processing time and portability requirements of applications such as artificial skin and robotics, among others. The literature review shows that hardware implementations are present today in almost all stages of smart tactile detection systems except in the force reconstruction process, a stage in which they have been less applied. This work presents a hardware implementation of a model-driven approach reported in the literature for the contact force reconstruction of flat and rigid tactile sensor arrays from normal stress data. Starting from the analysis of a software implementation of such a model, this implementation proposes the parallelization of tasks that facilitate the execution of matrix operations and a two-dimensional optimization function to obtain a force vector for each taxel in the array. This work seeks to take advantage of the parallel hardware characteristics of Field Programmable Gate Arrays (FPGAs) and the possibility of applying appropriate techniques for algorithm parallelization, using as a guide the rules of generalization, efficiency, and scalability in the tactile decoding process, and considering low latency, low power consumption, and real-time execution as the main design parameters. The results show a maximum estimation error of 32% in the tangential forces and 22% in the normal forces with respect to simulation by the Finite Element Modeling (FEM) technique of Hertzian and non-Hertzian contact events, over sensor arrays of 10×10 taxels of different sizes. The hardware implementation was carried out on an MPSoC XCZU9EG-2FFVB1156 platform from Xilinx® that allows the reconstruction of force vectors following a scalable approach, from the information captured by tactile sensor arrays composed of up to 48 × 48 taxels that use various transduction technologies. The proposed implementation demonstrates a reduction in estimation time by a factor of 180 compared to software implementations. Despite the relatively high values of the estimation errors, the information provided by this implementation on the tangential and normal tractions and the triaxial reconstruction of forces allows the tactile properties of the touched object to be adequately reconstructed, and these are similar to those obtained in the software implementation and in the two FEM simulations taken as reference. Although the errors could be reduced, the proposed implementation is useful for decoding contact forces in portable tactile sensing systems, thus helping to expand electronic skin applications in robotic and biomedical contexts.
Keywords: contact forces reconstruction, forces estimation, tactile sensor array, hardware implementation
Procedia PDF Downloads 196
10643 Solid Waste and Its Impact on the Human Health
Authors: Waseem Akram, Hafiz Azhar Ali Khan
Abstract:
Unplanned urbanization, together with the change from a simple lifestyle to a more technologically advanced one and the flow of rural masses to urban areas, has played a vital role in piling up loads of solid waste in our environment. Cities and towns have expanded beyond their boundaries, and uncontrolled population expansion has added to the overall environmental burden. Public indifference, reflected in the non-responsive behavior of people, has thus become one of the biggest contributors to this trash problem. Every day a huge amount of solid waste is thrown in the streets, on the roads, in parks, and in all those places that are frequently visited by human beings. This behavior-based response in many countries of the world has led to serious health concerns and environmental issues. Over 80% of the products sold in the market are packed in plastic bags. None of the bags are later recycled; they simply become a permanent environmental concern that flies around, chokes drainage lines, or is burnt, releasing toxic gases into the environment, or forms heaps of dumps. Lack of classification of the daily waste generated from houses and other places leads to severe clogging of sewerage lines and the formation of ponding areas, which ultimately favor vector-borne diseases and sometimes become a cause of transmission of polio virus. Solid waste heaps were checked at different places in the cities. All of the waste was classified on visual assessment into plastic bags, papers, broken plastic pots, clay pots, steel boxes, wrappers, etc. All solid waste dumping sites in the cities, and waste thrown outside of the trash containers, usually contained wrappers, plastic bags, and unconsumed food products. Insect populations seen at these sites included house flies, bugs, cockroaches, and mosquito larvae breeding in water-filled wrappers, containers, or plastic bags. The populations of mosquitoes, cockroaches, and houseflies were relatively very high at dumping sites close to human populations. These populations have been associated with cases of dengue, malaria, dysentery, gastroenteritis, and skin allergies during the monsoon and summer seasons. Thus, dumping huge amounts of solid waste in and near residential areas results in serious environmental concerns, circulation of bad smells, and health-related issues. In some places, the same waste is burnt to get rid of mosquitoes through smoke, which ultimately releases toxic material into the atmosphere. Therefore, a proper environmental strategy is needed to minimize the environmental burden and promote the concept of recycled products, thereby reducing the disease burden.
Keywords: solid waste accumulation, disease burden, mosquitoes, vector borne diseases
Procedia PDF Downloads 280
10642 A Framework for Enhancing Mobile Development Software for Rangsit University, Thailand
Authors: Thossaporn Thossansin
Abstract:
This paper presents the development of a mobile application for students studying in the Faculty of Information Technology, Rangsit University (RSU), Thailand. RSU enhanced the enrollment process by leveraging its information systems, which allow students to download the RSU APP. This helps students access RSU information that is important to them. The reason for having a mobile application is to support students' ability to access the system anytime and anywhere. The objective of this paper was to develop an application on the iOS platform for students studying in the Faculty of Information Technology, Rangsit University, Thailand, and to study and learn students' perceptions of the new mobile app. This paper targeted a group of students studying in years 1-4 in the Faculty of Information Technology, Rangsit University. The new application has been developed by the Department of Information Technology, Rangsit University, and is generally called RSU APP. It is a new mobile application developed for RSU, with useful features and functionalities that support students. The core modules consist of RSU's announcements, calendar, events, activities, and e-books. The mobile app was developed on the iOS platform in line with RSU's policy of giving free tablets to first-year students. User satisfaction was analyzed from interview data comprising 81 interviews, and a Google application (Google Forms) was used to collect a further 122 responses. Generally, users were satisfied with the application, with overall satisfaction at a level of 4.67 (SD = 0.52). The highest satisfaction was for how quickly users can learn and use the app, at 4.82 (SD = 0.71), and the lowest satisfaction rating was for the modern form of the app lists, at 4.01 (SD = 0.45).
Keywords: mobile application, development of mobile application, framework of mobile development, software development for mobile devices
Procedia PDF Downloads 326
10641 The Fiscal-Monetary Policy and Economic Growth in Algeria: VECM Approach
Authors: K. Bokreta, D. Benanaya
Abstract:
The objective of this study is to examine the relative effectiveness of monetary and fiscal policy in Algeria using the econometric modelling techniques of cointegration and vector error correction modelling to analyse and draw policy inferences. The chosen variables of fiscal policy are government expenditure and net taxes on products, while the effect of monetary policy is represented by the inflation rate and the official exchange rate. From the results, we find that in the long run, the impact of government expenditure on growth is positive, while the effect of taxes is negative. Additionally, we find that the inflation rate has little effect on GDP per capita and that the impact of the exchange rate is insignificant. We conclude that fiscal policy is more powerful than monetary policy in promoting economic growth in Algeria.
Keywords: economic growth, monetary policy, fiscal policy, VECM
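A hedged statsmodels sketch of the VECM step, fitted to simulated stand-ins for the Algerian series (the variable names, lag order, and deterministic terms are assumptions, not the study's specification):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

# Hypothetical annual macro series standing in for GDP per capita, government spending,
# net taxes, inflation and the exchange rate.
rng = np.random.default_rng(11)
n = 60
data = pd.DataFrame({
    "gdp": np.cumsum(rng.normal(0.02, 0.05, n)),
    "gov_exp": np.cumsum(rng.normal(0.03, 0.06, n)),
    "taxes": np.cumsum(rng.normal(0.01, 0.05, n)),
    "inflation": rng.normal(0.04, 0.02, n),
    "exch_rate": np.cumsum(rng.normal(0.0, 0.03, n)),
})

rank = select_coint_rank(data, det_order=0, k_ar_diff=1).rank   # Johansen-type rank selection
model = VECM(data, k_ar_diff=1, coint_rank=max(rank, 1), deterministic="ci")
res = model.fit()
print(res.alpha)   # loading (adjustment) coefficients
print(res.beta)    # long-run cointegrating vectors
```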
Procedia PDF Downloads 311
10640 Information Disclosure And Financial Sentiment Index Using a Machine Learning Approach
Authors: Alev Atak
Abstract:
In this paper, we aim to create a financial sentiment index by investigating companies' voluntary information disclosures. We retrieve structured content from BIST 100 companies' financial reports for the period 1998-2018 and extract relevant financial information for sentiment analysis through Natural Language Processing. We measure strategy-related disclosures and their cross-sectional variation and classify report content into generic sections using synonym lists divided into four main categories according to their liquidity risk profile, risk positions, intra-annual information, and exposure to risk. We use Word Error Rate and Cosine Similarity for comparing and measuring text similarity and deviation across sets of texts. In addition to performing text extraction, we provide a range of text analysis options, such as readability metrics, word counts using pre-determined lists (e.g., forward-looking, uncertainty, tone, etc.), and comparison with a reference corpus (at the word, part-of-speech and semantic levels). We thereby create an adequate analytical tool and a financial dictionary that depict the importance of granular financial disclosure for investors in correctly identifying risk-taking behavior and hence making the aggregated effects traceable.
Keywords: financial sentiment, machine learning, information disclosure, risk
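A small sketch of two of the text-analysis building blocks mentioned above, cosine similarity on a TF-IDF representation and dictionary-based word counts; the excerpts and word lists are illustrative only, not the study's dictionary:

```python
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical disclosure excerpts; in the study these come from BIST 100 annual reports.
reports = [
    "The company expects liquidity risk to remain limited and plans further expansion next year.",
    "Exposure to currency risk increased; management is uncertain about future funding conditions.",
]

# Cosine similarity between report sections on a TF-IDF representation.
tfidf = TfidfVectorizer(stop_words="english").fit_transform(reports)
print(cosine_similarity(tfidf))

# Simple counts using pre-determined word lists (illustrative lists only).
forward_looking = {"expects", "plans", "future", "next"}
uncertainty = {"uncertain", "risk", "may", "might"}
for text in reports:
    tokens = re.findall(r"[a-z]+", text.lower())
    fl = sum(t in forward_looking for t in tokens)
    un = sum(t in uncertainty for t in tokens)
    print({"forward_looking": fl, "uncertainty": un, "n_words": len(tokens)})
```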
Procedia PDF Downloads 94
10639 An Approach Based on Statistics and Multi-Resolution Representation to Classify Mammograms
Authors: Nebi Gedik
Abstract:
Breast cancer is one of the significant and persistent public health problems in the world. Early detection is very important to fight the disease, and mammography has been one of the most common and reliable methods to detect the disease in its early stages. However, it is a difficult task, and computer-aided diagnosis (CAD) systems are needed to assist radiologists in providing both accurate and uniform evaluation of masses in mammograms. In this study, a multiresolution statistical method for classifying digitized mammograms as normal or abnormal is used to construct a CAD system. The mammogram images are represented by the wave atom transform, and this representation is formed by certain groups of coefficients, treated independently. The CAD system is designed by calculating some statistical features from each group of coefficients. The classification is performed using a support vector machine (SVM).
Keywords: wave atom transform, statistical features, multi-resolution representation, mammogram
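The wave atom transform is not available in common Python libraries, so the following sketch substitutes a wavelet decomposition (PyWavelets) to illustrate the same pipeline: statistics computed independently per coefficient group, then SVM classification, here on synthetic regions of interest rather than real mammograms.

```python
import numpy as np
import pywt
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def multires_features(image, wavelet="db2", level=3):
    """Mean, std, skewness and kurtosis computed independently per coefficient group."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    groups = [coeffs[0]] + [band for detail in coeffs[1:] for band in detail]
    feats = []
    for g in groups:
        g = np.ravel(g)
        feats.extend([g.mean(), g.std(), skew(g), kurtosis(g)])
    return np.array(feats)

# Hypothetical 64x64 regions of interest; label 1 mimics a brighter mass-like region.
rng = np.random.default_rng(5)
normal = [rng.normal(0.3, 0.1, (64, 64)) for _ in range(40)]
abnormal = [rng.normal(0.3, 0.1, (64, 64))
            + 0.4 * np.exp(-((np.indices((64, 64)) - 32) ** 2).sum(0) / 200)
            for _ in range(40)]
X = np.array([multires_features(im) for im in normal + abnormal])
y = np.array([0] * 40 + [1] * 40)
print(cross_val_score(SVC(kernel="rbf", gamma="scale"), X, y, cv=5).mean())
```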
Procedia PDF Downloads 223
10638 Nursing Documentation of Patients' Information at Selected Primary Health Care Facilities in Limpopo Province, South Africa: Implications for Professional Practice
Authors: Maria Sonto Maputle, Rhulani C. Shihundla, Rachel T. Lebese
Abstract:
Background: Patients' information must be complete and accurately documented in order to foster quality and continuity of care. The multidisciplinary health care team members use patients' documentation to communicate about health status, preventive health services, treatment, planning and delivery of care. The purpose of this study was to determine the practice of nursing documentation of patients' information at selected Primary Health Care (PHC) facilities in Vhembe District, Limpopo Province, South Africa. Methods: The research approach adopted was qualitative, and an exploratory and descriptive design was used. The study was conducted at selected PHC facilities. The population included twelve professional nurses. A non-probability purposive sampling method was used to sample professional nurses who were willing to participate in the study. The criteria included participants whose daily work and activities involved creating, keeping and updating nursing documentation of patients' information. Qualitative data were collected through unstructured in-depth interviews until no new information emerged. Data were analysed through open coding using Tesch's eight-step method. Results: Following data analysis, it was found that professional nurses had a knowledge deficit related to insufficient training on updates, and that rendering multiple services daily had a negative impact on accurate documentation of patients' information. Conclusion: The study recommended standardization of the registers, books and forms used at PHC facilities, and reorganization of PHC services into an open-day system.
Keywords: documentation, knowledge, patient care, patient's information, training
Procedia PDF Downloads 190
10637 Critical Success Factors of Information Technology Projects
Authors: Athar Imtiaz, Abduljalil S. Al-Mudhary, Taha Mirhashemi, Roslina Ibrahim
Abstract:
Information Technology (IT) is being used by almost all organizations throughout the world. However, its success at supporting and improving business is debatable. There is always the risk of IT project failure, and studies have shown that a large number of IT projects indeed do fail. There are many components that further the success of IT projects; these have been examined in previous studies. Studies have identified the components most necessary for success in software development projects, executive information systems, etc. In this study, the previous literature on these success-promoting factors has been critically reviewed and analyzed. Fifteen Critical Success Factors (CSFs) of IT projects were listed and examined. These factors can be applied to all IT projects and are not specific to a particular type of IT/IS project. A hypothesis was also generated after the evaluation of the factors.
Keywords: critical success factors, CSF, IT projects, IS projects, software development projects
Procedia PDF Downloads 400
10636 Understanding the Basics of Information Security: An Act of Defense
Authors: Sharon Q. Yang, Robert J. Congleton
Abstract:
Information security is a broad concept that covers any issues and concerns about the proper access and use of information on the Internet, including measures and procedures to protect intellectual property and private data from illegal access and online theft; the act of hacking; and any defensive technologies that contest such cybercrimes. As more research and commercial activities are conducted online, cybercrimes have increased significantly, putting sensitive information at risk. Information security has become critically important for organizations and private citizens alike. Hackers scan for network vulnerabilities on the Internet and steal data whenever they can. Cybercrimes disrupt our daily life, cause financial losses, and instigate fear in the public. Since the start of the pandemic, most data-related cybercrime targets have been either financial or health information from companies and organizations. Libraries also should have a high interest in understanding and adopting information security methods to protect their patron data and copyrighted materials. But according to information security professionals, higher education and cultural organizations, including their libraries, are the least prepared entities for cyberattacks. One recent example is that of Stevens Institute of Technology in New Jersey in the US, which had its network hacked in 2020, with the hackers demanding a ransom. As a result, the network of the college was down for two months, causing serious financial loss. There are other cases where libraries, colleges, and universities have been targeted for data breaches. In order to build an effective defense, we need to understand the most common types of cybercrimes, including phishing, whaling, social engineering, distributed denial of service (DDoS) attacks, malware and ransomware, as well as hacker profiles. Our research focuses on each hacking technique and its related defense measures, and on the social background and motivations of hackers and hacking. Our research shows that hacking techniques will continue to evolve as new applications housing information and data on the Internet continue to be developed. Some cybercrimes can be stopped with effective measures, while others present challenges. It is vital that people understand what they face and the consequences of not being prepared.
Keywords: cybercrimes, hacking technologies, higher education, information security, libraries
Procedia PDF Downloads 135
10635 Development of a Technology Assessment Model by Patents and Customers' Review Data
Authors: Kisik Song, Sungjoo Lee
Abstract:
Recent years have seen an increasing number of patent disputes due to excessive competition in the global market and a reduced technology life-cycle; this has increased the risk of investment in technology development. While many global companies have started developing methodologies to identify promising technologies and assess them for decision-making, the existing methodologies still have some limitations. Post hoc assessments of new technologies are not being performed, especially to determine whether the suggested technologies turned out to be promising. For example, in existing quantitative patent analysis, a patent's citation information has served as an important metric for quality assessment, but this analysis cannot be applied to recently registered patents because such information accumulates over time. Therefore, we propose a new technology assessment model that can replace citation information and positively affect technological development, based on post hoc analysis of the patents for promising technologies. Additionally, we collect customer reviews on a target technology to extract keywords that reflect the customers' needs, and we determine how many of these keywords are covered by the new technology. Finally, we construct a portfolio (based on a technology assessment from patent information) and a customer-based marketability assessment (based on review data), and we use them to visualize the characteristics of the new technologies.
Keywords: technology assessment, patents, citation information, opinion mining
Procedia PDF Downloads 466
10634 Social Accountability: Persuasion and Debate to Contain Corruption
Authors: A. Lambert-Mogiliansky
Abstract:
In this paper, we investigate the properties of simple reappointment rules aimed at holding a public official accountable and monitoring his activity. The public official allocates budget resources to various activities which result in the delivery of public services to citizens. He has discretion over the use of resources, so he can divert some of them for private ends. Because of a liability constraint, zero diversion can never be secured in all states. The optimal reappointment mechanism under complete information is shown to exhibit some leniency, thus departing from the zero-tolerance principle. Under asymmetric information (about the state), a rule with random verification in a pre-announced subset is shown to be optimal within a class of common rules. Surprisingly, those common rules make little use of hard information about service delivery when it is available. Similarly, the public official's claims about his record are of no value in improving the performance of the examined rules. In contrast, requesting that the public official defend his record publicly can be very useful if the service users are given the chance to refute false claims with cheap-talk complaints: the first-best complete-information outcome can be approached even in the absence of any observation by the manager of the accountability mechanism.
Keywords: accountability, corruption, persuasion, debate
Procedia PDF Downloads 382
10633 Multiresolution Mesh Blending for Surface Detail Reconstruction
Authors: Honorio Salmeron Valdivieso, Andy Keane, David Toal
Abstract:
In the area of mechanical reverse engineering, processes often encounter difficulties capturing small, highly localized surface information. This could be the case if a physical turbine were 3D scanned for lifecycle management or robust design purposes, with interest in eroded areas or scratched coatings. The limitation is partly due to insufficient automated frameworks for handling localized surface information during the reverse engineering pipeline. We have developed a tool for blending surface patches with arbitrary irregularities into a base body (e.g. a CAD solid). The approach aims to transfer small surface features while preserving their shape and relative placement by using a multi-resolution scheme and rigid deformations. Automating this process enables the inclusion of outsourced surface information in CAD models, including samples prepared in mesh handling software, or raw scan information that would otherwise be discarded in the early stages of reverse engineering reconstruction.
Keywords: application lifecycle management, multiresolution deformation, reverse engineering, robust design, surface blending
Procedia PDF Downloads 140
10632 Process for Analyzing Information Security Risks Associated with the Incorporation of Online Dispute Resolution Systems in the Context of Conciliation in Colombia
Authors: Jefferson Camacho Mejia, Jenny Paola Forero Pachon, Luis Carlos Gomez Florez
Abstract:
The innumerable possibilities offered by the use of Information Technology (IT) in the development of different socio-economic activities have brought about a change in the social paradigm and the emergence of the so-called information and knowledge society. The Colombian government, aware of this reality, has been promoting the use of IT as part of the E-government strategy adopted in the country. However, it is well known that the use of IT implies the existence of certain threats that put the security of information in the digital environment at risk. One of the priorities of the Colombian government is to improve access to alternative justice through IT, in particular access to Alternative Dispute Resolution (ADR): conciliation, arbitration and friendly composition, by means of which citizens are encouraged to resolve their differences directly. To this end, a trend has been identified towards the use of Online Dispute Resolution (ODR) systems, which extend the benefits of ADR to the digital environment through the use of IT. This article presents a process for the analysis of information security risks associated with the incorporation of ODR systems in the context of conciliation in Colombia, based on four fundamental stages identified in the literature: (I) identification of assets, (II) identification of threats and vulnerabilities, (III) estimation of the impact, and (IV) estimation of risk levels. The methodological design adopted for this research was grounded theory, since it involves interactions that are applied to a specific context and from the perspective of diverse participants. As a result of this investigation, the activities to be followed to carry out an analysis of information security risks, in the context of conciliation in Colombia supported by ODR systems, are defined, thus contributing to the estimation of risks and making their subsequent treatment possible.
Keywords: alternative dispute resolution, conciliation, information security, online dispute resolution systems, process, risk analysis
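A toy sketch of stages (III) and (IV), combining assumed qualitative impact and likelihood scores into risk levels; the asset names, threat names, and thresholds are illustrative only:

```python
from itertools import product

# Illustrative 1-5 scales for asset impact and threat likelihood (assumed values).
assets = {"case files": 5, "user credentials": 4, "hearing schedule": 2}            # impact
threats = {"data interception": 4, "unauthorized access": 3, "service outage": 2}   # likelihood

def risk_level(score):
    """Map an impact x likelihood score onto a qualitative risk level (assumed thresholds)."""
    return "high" if score >= 15 else "medium" if score >= 8 else "low"

# Estimate impact x likelihood for each asset-threat pair and map it to a risk level.
for (asset, impact), (threat, likelihood) in product(assets.items(), threats.items()):
    score = impact * likelihood
    print(f"{asset:18s} x {threat:20s} -> score {score:2d} ({risk_level(score)})")
```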
Procedia PDF Downloads 241
10631 Development of the Academic Model to Predict Student Success at VUT-FSASEC Using Decision Trees
Authors: Langa Hendrick Musawenkosi, Twala Bhekisipho
Abstract:
The success or failure of students is a concern for every academic institution, college, university, government, and for students themselves. Several approaches have been researched to address this concern. In this paper, the view is held that when a student enters a university, college or other academic institution, he or she enters an academic environment. The academic environment is a unique concept used to develop the solution for making predictions effectively. This paper presents a model to determine the propensity of a student to succeed or fail at the French South African Schneider Electric Education Center (FSASEC) at the Vaal University of Technology (VUT). The Decision Tree algorithm is used to implement the model at FSASEC.
Keywords: FSASEC, academic environment model, decision trees, k-nearest neighbor, machine learning, popularity index, support vector machine
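A minimal scikit-learn sketch of a decision-tree model over hypothetical academic-environment attributes (the features, labels, and thresholds are assumptions, not FSASEC data):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical attributes per student: entry mark, attendance rate, assignment average,
# hours of study per week; label 1 = succeeded, 0 = failed.
rng = np.random.default_rng(8)
X = np.column_stack([rng.uniform(50, 90, 300), rng.uniform(0.4, 1.0, 300),
                     rng.uniform(30, 95, 300), rng.uniform(0, 25, 300)])
y = (0.02 * X[:, 0] + 2 * X[:, 1] + 0.02 * X[:, 2] + rng.normal(0, 0.4, 300) > 3.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=8)
tree = DecisionTreeClassifier(max_depth=3, random_state=8).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, tree.predict(X_te)))
print(export_text(tree, feature_names=["entry_mark", "attendance", "assignment_avg", "study_hours"]))
```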
Procedia PDF Downloads 200