Search results for: automatic reporting
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1614

1254 Assessment of the Level of Awareness and Adoption of International Public Sector Accounting Standards (IPSAS) in the Curriculum of Accounting Education in Selected Tertiary Institutions in Ondo and Ekiti States, Nigeria

Authors: Olurankinse Felix, Fatukasi Bayo

Abstract:

Over the years, government financial statements have been prepared on the cash basis of accounting. This basis is characterised by several shortcomings: non-disclosure of detailed, high-quality information on government financial transactions; ill-informed assessment of government resource allocation; weak internal control systems that inhibit accountability and transparency; and non-standardised reporting practices that hinder comparability. The emergence of International Public Sector Accounting Standards (IPSAS) therefore offers leverage, as it aims to improve the quality of general purpose financial reporting by public sector entities, thereby increasing transparency and accountability. IPSAS is a new concept that all institutions must fully adopt. The crux of this paper is to determine the extent of awareness and adoption of IPSAS among students and lecturers in terms of teaching, learning, and inclusion in the curriculum of accounting education. The methodology involved well-designed questionnaires administered in selected institutions, and the analysis was carried out with maximum likelihood ordered probit regression. The results show that despite a high level of sensitisation/awareness of IPSAS, the degree of adoption is still low owing to low desirability among students and lecturers. The paper recommends that the government enact an enabling law to back the adoption and, more importantly, institute appropriate sanctions to ensure full compliance.
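The abstract above reports an ordered probit fitted by maximum likelihood. As an illustrative sketch only (not the authors' code, and with all variable names assumed), the log-likelihood such a fit maximises can be written directly: each ordered response category corresponds to an interval between consecutive cutpoints on a latent normal scale.

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ordered_probit_loglik(beta, cutpoints, X, y):
    """Log-likelihood of an ordered probit model.

    y holds integer categories 0..K-1; `cutpoints` are the K-1 ordered
    thresholds; X is a list of feature vectors.
    P(y = k | x) = Phi(c_k - x.b) - Phi(c_{k-1} - x.b),
    with c_{-1} = -inf and c_{K-1} = +inf.
    """
    cuts = [-math.inf] + list(cutpoints) + [math.inf]
    ll = 0.0
    for xi, yi in zip(X, y):
        xb = sum(b * v for b, v in zip(beta, xi))
        p = norm_cdf(cuts[yi + 1] - xb) - norm_cdf(cuts[yi] - xb)
        ll += math.log(max(p, 1e-300))  # guard against log(0)
    return ll
```

In practice a numerical optimiser maximises this over `beta` and `cutpoints`; statistical packages wrap exactly this computation.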

Keywords: assessment, awareness, adoption, IPSAS, cash basis

Procedia PDF Downloads 451
1253 Effectiveness and Efficiency of Unified Philippines Accident Reporting and Database System in Optimizing Road Crash Data Usage with Various Stakeholders

Authors: Farhad Arian Far, Anjanette Q. Eleazar, Francis Aldrine A. Uy, Mary Joyce Anne V. Uy

Abstract:

The Unified Philippine Accident Reporting and Database System (UPARDS) is a system newly developed by Dr. Francis Aldrine Uy of the Mapua Institute of Technology. Its main purpose is to provide an advanced road accident investigation tool and a record-keeping and analysis system for stakeholders such as the Philippine National Police (PNP), the Metro Manila Development Authority (MMDA), the Department of Public Works and Highways (DPWH), the Department of Health (DOH), and insurance companies. The system is composed of two components: a mobile application for road accident investigators, which takes advantage of available technology to improve data gathering, and a web application that integrates all accident data for the use of all stakeholders. The researchers, with the cooperation of the PNP's Vehicle Traffic Investigation Sector of the City of Manila, field-tested the application on fifteen (15) accident cases. Simultaneously, the researchers distributed surveys to the PNP, Manila Doctors Hospital, and Charter Ping An Insurance Company to gather their insights on the web application. The survey was designed around the Technology Acceptance Model, an information systems theory. The survey results revealed that respondents were highly satisfied with the visualization and functions of the applications, which proved effective and far more efficient than the conventional pen-and-paper method. In conclusion, the pilot study addressed the need for improvement of the current system.

Keywords: accident, database, investigation, mobile application, pilot testing

Procedia PDF Downloads 411
1252 Video Object Segmentation for Automatic Image Annotation of Ethernet Connectors with Environment Mapping and 3D Projection

Authors: Marrone Silverio Melo Dantas, Pedro Henrique Dreyer, Gabriel Fonseca Reis de Souza, Daniel Bezerra, Ricardo Souza, Silvia Lins, Judith Kelner, Djamel Fawzi Hadj Sadok

Abstract:

The creation of a dataset is time-consuming and often discourages researchers from pursuing their goals. To overcome this problem, we present and discuss two solutions adopted to automate this process. Both optimize valuable user time and resources and support video object segmentation with object tracking and 3D projection. In our scenario, we acquire images from a moving robotic arm and, with each approach, generate a distinct annotated dataset. We evaluated the precision of the annotations by comparing them with a manually annotated dataset, as well as their efficiency in the context of detection and classification problems. For detection support, we used YOLO and obtained, on the projection dataset, F1-score, accuracy, and mAP values of 0.846, 0.924, and 0.875, respectively. On the tracking dataset, we achieved an F1-score of 0.861 and an accuracy of 0.932, whereas mAP reached 0.894. To evaluate the quality of the annotated images for classification problems, we employed deep learning architectures, adopting accuracy and F1-score as metrics for VGG, DenseNet, MobileNet, Inception, and ResNet. The VGG architecture outperformed the others on both the projection and tracking datasets, reaching an accuracy and F1-score of 0.997 and 0.993, respectively, on the projection dataset; on the tracking dataset, it achieved an accuracy of 0.991 and an F1-score of 0.981.
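The F1-scores quoted above combine precision and recall. As a minimal illustration (not the authors' evaluation code), the three detection metrics can be computed from raw true-positive, false-positive, and false-negative counts:

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and F1 from raw detection counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

mAP additionally averages precision over recall levels and classes, which requires per-detection confidence scores and is omitted here.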

Keywords: RJ45, automatic annotation, object tracking, 3D projection

Procedia PDF Downloads 136
1251 Deep Learning Application for Object Image Recognition and Robot Automatic Grasping

Authors: Shiuh-Jer Huang, Chen-Zon Yan, C. K. Huang, Chun-Chien Ting

Abstract:

Since vision systems for autonomous operation in industrial environments are in intense demand, image recognition has become an important research topic. Here, a deep learning algorithm is employed in an image system to recognize industrial objects, integrated with a 7A6 Series Manipulator for automatic gripping tasks. A PC and a Graphics Processing Unit (GPU) were chosen to construct the 3D vision recognition system. A depth camera (Intel RealSense SR300) is employed to extract images for object recognition and coordinate derivation. The YOLOv2 scheme is adopted as the convolutional neural network (CNN) structure for object classification and center point prediction. Additionally, an image processing strategy is used to find the object contour for calculating the object orientation angle. The specified object location and orientation information are then sent to the robotic controller. Finally, a six-axis manipulator can grasp the specified object in a random environment based on the user command and the extracted image information. The experimental results show that YOLOv2 successfully detects the object location and category with confidence near 0.9 and a 3D position error of less than 0.4 mm. This is useful for future intelligent robotic applications in Industry 4.0 environments.
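The abstract derives an orientation angle from the object contour but does not say how. One common technique (an assumption here, not necessarily the authors' method) is to take the principal axis of the contour points via their covariance matrix:

```python
import math

def contour_orientation(points):
    """Orientation (radians) of the principal axis of 2D contour
    points, from the dominant eigenvector of their covariance."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Closed-form angle of the dominant eigenvector of
    # [[sxx, sxy], [sxy, syy]]
    return 0.5 * math.atan2(2 * sxy, sxx - syy)
```

Image-processing libraries offer equivalent primitives (e.g. minimum-area bounding rectangles), but the covariance form above shows the underlying geometry.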

Keywords: deep learning, image processing, convolutional neural network, YOLOv2, 7A6 series manipulator

Procedia PDF Downloads 215
1250 Development of Energy Benchmarks Using Mandatory Energy and Emissions Reporting Data: Ontario Post-Secondary Residences

Authors: C. Xavier Mendieta, J. J. McArthur

Abstract:

Governments are playing an increasingly active role in reducing carbon emissions, and a key strategy has been the introduction of mandatory energy disclosure policies. These policies have produced a significant amount of publicly available data, giving researchers a unique opportunity to develop location-specific energy and carbon emission benchmarks, which can in turn inform building archetypes and urban energy models. This study presents the development of such a benchmark from public reporting data. Data from Ontario's Ministry of Energy for post-secondary educational institutions are used to develop a series of building-archetype dynamic building loads and energy benchmarks to fill a gap in the currently available building database. This paper presents the development of a benchmark for college and university residences within ASHRAE climate zone 6 areas of Ontario using the mandatory disclosure energy and greenhouse gas emissions data. The methodology includes data cleaning, statistical analysis, and benchmark development; lessons learned from this investigation are presented and discussed to inform the development of future energy benchmarks from this larger data set. The key findings of this initial benchmarking study are: (1) careful data screening and outlier identification are essential to developing a valid dataset; (2) the key features used to model the data are building age, size, and occupancy schedules, and these can be used to estimate energy consumption; and (3) policy changes affecting primary energy generation significantly affected greenhouse gas emissions, and consideration of these factors was critical to evaluating the validity of the reported data.
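Finding (1) above hinges on outlier identification in self-reported energy data. A standard screen (an illustrative assumption; the paper does not specify its rule) flags values outside 1.5 interquartile ranges of the quartiles:

```python
def iqr_outliers(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR]."""
    xs = sorted(values)

    def quantile(q):
        # Linear interpolation between order statistics
        pos = q * (len(xs) - 1)
        lo, hi = int(pos), min(int(pos) + 1, len(xs) - 1)
        return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    lo_b, hi_b = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo_b or v > hi_b]
```

Applied to, say, energy-use intensities in kWh/m², such a screen would catch reporting errors like a misplaced decimal point before benchmarks are computed.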

Keywords: building archetypes, data analysis, energy benchmarks, GHG emissions

Procedia PDF Downloads 274
1249 In-Context Meta Learning for Automatic Designing Pretext Tasks for Self-Supervised Image Analysis

Authors: Toktam Khatibi

Abstract:

Self-supervised learning (SSL) covers machine learning models trained on one aspect and/or part of the input to learn other aspects and/or parts of it. SSL models fall into two categories: pretext-task-based models and contrastive learning models. Pretext tasks are auxiliary tasks that learn from pseudo-labels, and the trained models are subsequently fine-tuned for downstream tasks. An important disadvantage of SSL via pretext-task solving, however, is defining an appropriate pretext task for each image dataset across a variety of image modalities. It is therefore necessary to design an appropriate pretext task automatically for each dataset and each downstream task. To the best of our knowledge, the automatic design of pretext tasks for image analysis has not yet been considered. In this paper, we present a framework based on in-context learning that describes each task in terms of its input and output data using a pre-trained image transformer. Our proposed method combines the input image and its learned description to optimize the pretext task design and its hyper-parameters using meta-learning models. The representations learned from the pretext tasks are fine-tuned for solving the downstream tasks. We demonstrate that our proposed framework outperforms the compared ones on unseen tasks and image modalities, in addition to its superior performance on previously known tasks and datasets.

Keywords: in-context learning (ICL), meta learning, self-supervised learning (SSL), vision-language domain, transformers

Procedia PDF Downloads 48
1248 Tool for Maxillary Sinus Quantification in Computed Tomography Exams

Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina

Abstract:

The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs act as olfaction accessories, heat or humidify inspired air, aid thermoregulation, and impart resonance to the voice, among other roles; the real function of the MS is thus still uncertain. Furthermore, MS anatomy is complex and varies from person to person, and many diseases may affect the development of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could help in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of outcomes. Computed tomography (CT) allows a more exact assessment of this structure, enabling quantitative analysis. However, this is not always possible in the clinical routine, and when it is, it demands considerable effort and/or time. A convenient, robust, and practical tool correlated with MS volume is therefore needed for clinical applicability. Currently available methods for MS segmentation are manual or semi-automatic, and manual methods exhibit inter- and intra-individual variability. Thus, the aim of this study was to develop an automatic tool to quantify MS volume in CT scans of the paranasal sinuses. The study was conducted with ethical approval from the authors' institutions and national review panels and involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) based on features such as pixel value, spatial distribution, and shape. The detected pixels are used as seed points for region growing (RG) segmentation, and morphological operators are then applied to reduce false-positive pixels, improving segmentation accuracy. These steps are applied to all slices of the CT exam to obtain the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist, using Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. The linear regression showed a strong association and low dispersion between the two methods, the Bland-Altman analyses showed no significant differences between them, and the Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved robust, fast, and efficient compared with manual segmentation, and it avoids the intra- and inter-observer variation of manual and semi-automatic methods. As future work, the tool will be applied in clinical practice, where it may be useful in the diagnosis and treatment of MS diseases.
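The pipeline above grows a region outward from SVM-detected seed pixels. A minimal region-growing sketch on a 2D intensity grid (the tolerance criterion is an assumption; the paper does not give its exact homogeneity rule) looks like:

```python
from collections import deque

def region_grow(image, seed, tol):
    """Grow a region from `seed` over 4-connected pixels whose
    intensity differs from the seed's by at most `tol`.

    `image` is a list of rows of numeric intensities; `seed` is a
    (row, col) tuple.  Returns the set of (row, col) pixels grown.
    """
    rows, cols = len(image), len(image[0])
    base = image[seed[0]][seed[1]]
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(image[nr][nc] - base) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region
```

In the described tool this step runs per CT slice, and the segmented pixel counts across slices, scaled by voxel size, yield the MS volume.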

Keywords: maxillary sinus, support vector machine, region growing, volume quantification

Procedia PDF Downloads 482
1247 HPTLC Fingerprint Profiling of Protorhus longifolia Methanolic Leaf Extract and Qualitative Analysis of Common Biomarkers

Authors: P. S. Seboletswe, Z. Mkhize, L. M. Katata-Seru

Abstract:

Protorhus longifolia is a medicinal plant traditionally used to treat various ailments such as hemiplegic paralysis, blood-clotting-related diseases, diarrhoea, and heartburn. This study reports a High-Performance Thin Layer Chromatography (HPTLC) fingerprint profile of a Protorhus longifolia methanolic extract and qualitative analysis of gallic acid, rutin, and quercetin. HPTLC analysis was performed on a CAMAG HPTLC system equipped with a CAMAG Automatic TLC Sampler 4, a CAMAG Automatic Developing Chamber 2 (ADC2), a CAMAG Visualizer 2, a CAMAG Thin Layer Chromatography (TLC) scanner, and visionCATS CAMAG HPTLC software. A mobile phase of toluene, ethyl acetate, and formic acid (21:15:3) was used for the qualitative analysis of gallic acid and revealed eight peaks, while a mobile phase of ethyl acetate, water, glacial acetic acid, and formic acid (100:26:11:11) used for the qualitative analysis of rutin and quercetin revealed six peaks. HPTLC silica gel 60 F254 glass plates (10 × 10) served as the stationary phase. Gallic acid was detected at Rf = 0.35, while rutin and quercetin were not evident in the extract. Further studies will quantify gallic acid in Protorhus longifolia leaves and identify other biomarkers.
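The Rf value reported above is the retention factor: the distance migrated by the compound divided by the distance migrated by the solvent front. A trivial sketch (the 28 mm / 80 mm figures below are invented for illustration, not measurements from this study):

```python
def retention_factor(compound_distance_mm, solvent_front_mm):
    """Rf = distance migrated by the compound / distance migrated
    by the solvent front, both measured from the origin spot."""
    return compound_distance_mm / solvent_front_mm

# Hypothetical example: a spot at 28 mm with the solvent front at
# 80 mm gives Rf = 0.35, matching the value reported for gallic acid.
```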

Keywords: biomarkers, fingerprint profiling, gallic acid, HPTLC, Protorhus longifolia

Procedia PDF Downloads 116
1246 Impact of Human Resources Accounting on Employees' Performance in Organization

Authors: Hamid Saremi, Shida Hanafi

Abstract:

In an age of technology and economics, human capital plays an important and central role in organizations, and human resource accounting offers a broad view of the organization's key resource: its people. Human resource accounting is a young branch of accounting that deals with a range of policies and measures relating to various aspects of human resources. It rests on the premise that an organization's most important asset is its human resources and that human resource management is the key to organizational success; acting on this premise requires that human resource data be reviewed and evaluated with accounting knowledge, based on empirical studies and on methods for measuring and reporting human resource accounting information. Human resource management cannot proceed or make decisions without information, and human resource accounting is a practical way of informing decision makers who are committed to harnessing human resources. It applies accounting principles within the organization and conducts basic research on the extent to which human resource accounting information affects employees' individual performance. Human resource accounting analyzes, and provides criteria for, the valuation of the cost of manpower, treating personnel as the main resource of each institution. Protecting human resources is, from this perspective, a process undertaken for the organization's profitability; this type of accounting serves as a principal means of measuring trends in the costs and valuation of human resources in each institution. What is the economic value of such assets? What expenditures on the education and training of professionals should be carried in an asset account? What portion of funds spent should be treated as a lost opportunity cost? In this paper, drawing on the human resource accounting literature, we examine human resources and their objectives, the importance of human resource valuation for employee performance review, and methods of reporting human resources according to different models.

Keywords: human resources accounting, human capital, human resource management, valuation and cost of human resources, employees, performance, organization

Procedia PDF Downloads 521
1245 An Empirical Study of the International Financial Reporting Standards Education in the United States

Authors: Angela McCaskill

Abstract:

Accounting graduates at most United States universities are not being adequately taught International Financial Reporting Standards (IFRS); as such, they are not prepared with the knowledge and skills necessary to remain competitive in international business. One reason for this inadequate preparation is the lack of specific international accounting instruction available in the U.S. This paper explores the importance of IFRS education through the lens of graduate accounting majors. Specifically, it examines graduate accounting majors' preparedness in IFRS following their recent completion of a Master in Accountancy degree into which IFRS had been integrated. Data were collected via face-to-face and telephone/Skype interviews and questionnaires. After the interview, participants also agreed to answer two supplementary questions, determining the amounts that should be reported on the balance sheet under (1) IFRS and (2) U.S. GAAP; these questions were intended to test their knowledge of both sets of standards. The sample consisted of online and brick-and-mortar university students enrolled in their graduate programs from the spring 2016 semester through the summer 2016 semester. This study shows that a separate course should be devoted to teaching IFRS and convergence-related issues. There is a direct correlation between having taken an IFRS course and successful completion of the supplementary questions, compared with students who only had IFRS instruction mixed into their U.S. GAAP-based instruction. Students who took an international accounting course were better prepared for the IFRS conversion than those who did not. Academically, universities need to look more deeply into the needs of their students and do better at incorporating international standards into their curricula.

Keywords: accounting education, global accounting standards, international accounting, IFRS and U.S. GAAP convergence, IFRS, U.S. GAAP

Procedia PDF Downloads 229
1244 Wolof Voice Response Recognition System: A Deep Learning Model for Wolof Audio Classification

Authors: Krishna Mohan Bathula, Fatou Bintou Loucoubar, FNU Kaleemunnisa, Christelle Scharff, Mark Anthony De Castro

Abstract:

Voice recognition algorithms such as automatic speech recognition and text-to-speech systems for African languages can play an important role in bridging the digital divide in Artificial Intelligence in Africa, contributing to the establishment of a fully inclusive information society. This paper proposes a deep learning model that classifies user responses as inputs for an interactive voice response system. A dataset of the Wolof words for 'yes' and 'no' was collected as audio recordings. A two-stage data augmentation approach was adopted to enlarge the dataset to the size required by the deep neural network. Data preprocessing and feature engineering with Mel-Frequency Cepstral Coefficients were implemented. Convolutional Neural Networks (CNNs) have proven very powerful in image classification and are promising for audio processing when sounds are transformed into spectra. To perform voice response classification, the recordings were transformed into sound-frequency feature spectra, and an image classification methodology was applied using a deep CNN model. The inference model of this trained and reusable Wolof voice response recognition system can be integrated into many applications on both web and mobile platforms.
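The abstract mentions two-stage data augmentation before MFCC extraction without naming the stages. Two common waveform-level augmentations, shown here purely as an illustrative sketch (they may differ from the authors' actual stages), are noise injection and time shifting:

```python
import random

def add_noise(signal, noise_level=0.005, rng=None):
    """Inject small Gaussian noise into a waveform (list of floats)."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    return [s + rng.gauss(0.0, noise_level) for s in signal]

def time_shift(signal, shift):
    """Circularly shift a waveform by `shift` samples."""
    shift %= len(signal)
    return signal[-shift:] + signal[:-shift]
```

Each augmented waveform would then be converted to an MFCC spectrogram (e.g. with an audio library) and fed to the CNN as an image.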

Keywords: automatic speech recognition, interactive voice response, voice response recognition, Wolof word classification

Procedia PDF Downloads 88
1243 Impacts of Applying Automated Vehicle Location Systems to Public Bus Transport Management

Authors: Vani Chintapally

Abstract:

The proliferation of inexpensive, compact Global Positioning System (GPS) receivers has led most Automatic Vehicle Location (AVL) systems today to rely solely on satellite-based positioning, GPS being the most mature implementation. This paper presents the characteristics of a proposed system for tracking and analyzing public transport in a typical medium-sized city and contrasts the qualities of such a system with those of general-purpose AVL systems. Specific properties of the routes analyzed by the AVL system used in our study of public transport include cyclic vehicle routes, the need for specialized performance reports, and so on. The paper deals in particular with vehicle movement prediction and the estimation of station arrival times, together with automatically generated reports on timetable conformance and other performance measures. Another side of the problem is the efficient transfer of data from the vehicles to the control center. The prevalence of GSM packet data transfer technologies, combined with falling data transfer costs, has led today's AVL systems to depend predominantly on packet data services from mobile operators as the communications channel between vehicles and the control center. This approach raises numerous security issues in this potentially sensitive application field.
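Station arrival-time estimation in an AVL system starts from distances between GPS fixes. As an illustrative building block only (the system's actual prediction algorithm is not described in the abstract), the haversine great-circle distance plus a naive speed-based ETA:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two GPS fixes."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def eta_minutes(distance_km, speed_kmh):
    """Naive arrival estimate from remaining distance and speed."""
    return 60.0 * distance_km / speed_kmh
```

Real systems refine this with map matching to the route geometry and with historical travel times per segment rather than a single current speed.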

Keywords: automatic vehicle location (AVL), prediction of arrival times, AVL security, data services, intelligent transport systems (ITS), map matching

Procedia PDF Downloads 359
1242 International Financial Reporting Standards and the Quality of Banks Financial Statement Information: Evidence from an Emerging Market-Nigeria

Authors: Ugbede Onalo, Mohd Lizam, Ahmad Kaseri, Otache Innocent

Abstract:

Given the paucity of studies on IFRS adoption and the accounting quality of banks, particularly in emerging economies, this study investigates whether Nigeria's decision to adopt IFRS from 1 January 2012 is associated with higher-quality accounting measures. Consistent with prior literature, we measure the quality of financial statement information using earnings measurement, timeliness of loss recognition, and value relevance. A total of twenty Nigerian banks were examined over six years (2008-2013), divided equally into a pre-adoption period (2008, 2009, 2010) and a post-adoption period (2011, 2012, 2013). Following prior studies, eight models in all were employed to investigate earnings management, timeliness of loss recognition, and value relevance of the banks' accounting quality under the different reporting regimes. The results suggest that IFRS adoption is associated with minimal earnings management, timely recognition of losses, and high value relevance of accounting information. In sum, IFRS adoption yields higher-quality financial statement information for banks than local GAAP. Hence, this study recommends the global adoption of IFRS and that Nigerian banks embrace good corporate governance practices.
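The keywords below name the Jones model, under which earnings management is proxied by discretionary accruals: the residuals left after regressing (scaled) total accruals on firm fundamentals. A deliberately simplified one-regressor sketch of that residual idea (the full Jones model uses several regressors, and all variable names here are assumptions):

```python
def discretionary_accruals(x, y):
    """Residuals of a one-regressor least-squares fit y ~ a + b*x.

    In Jones-type models, y would be scaled total accruals and x a
    fundamentals term (e.g. change in revenue); the residuals proxy
    discretionary accruals.  This is a one-regressor simplification.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]
```

Large absolute residuals for a bank-year suggest accruals not explained by fundamentals, the usual earnings-management signal in this literature.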

Keywords: IFRS, SAS, quality of accounting information, earnings measurement, discretionary accruals, non-discretionary accruals, total accruals, Jones model, timeliness of loss recognition, value relevance

Procedia PDF Downloads 439
1241 The Automatisation of Dictionary-Based Annotation in a Parallel Corpus of Old English

Authors: Ana Elvira Ojanguren Lopez, Javier Martin Arista

Abstract:

The aims of this paper are to present the automatisation procedure adopted in the implementation of a parallel corpus of Old English and to assess the progress of automatisation with respect to tagging, annotation, and lemmatisation. The corpus consists of an aligned parallel text with word-for-word Old English-English comparison that provides the Old English segment with inflectional form tagging (gloss, lemma, category, and inflection) and lemma annotation (spelling, meaning, inflectional class, paradigm, word-formation, and secondary sources). This parallel corpus is intended to fill a gap in the field of Old English, in which no parallel and/or lemmatised corpora are available and the average amount of corpus annotation is low. Against this background, this presentation has two main parts. The first part, which focuses on tagging and annotation, selects the layouts and fields of lexical databases that are relevant for these tasks. Most information used for the annotation of the corpus can be retrieved from the lexical and morphological database Nerthus and the database of secondary sources Freya. These are the sources of linguistic and metalinguistic information used for annotating the lemmas of the corpus, covering morphological and semantic aspects as well as references to the secondary sources that deal with the lemmas in question. Although substantially adapted and re-interpreted, the lemmatised part of these databases draws on the standard dictionaries of Old English, including The Student's Dictionary of Anglo-Saxon, An Anglo-Saxon Dictionary, and A Concise Anglo-Saxon Dictionary. The second part of the paper deals with lemmatisation. It presents the lemmatiser Norna, implemented in Filemaker software and based on a concordance of, and an index to, the Dictionary of Old English Corpus, which comprises around three thousand texts and three million words. In its present state, the lemmatiser Norna can assign a lemma to around 80% of textual forms automatically, by searching the index and the concordance for prefixes, stems, and inflectional endings. The conclusions of this presentation stress the limits of automating dictionary-based annotation in a parallel corpus. While tagging and annotation are largely automatic even at the present stage, the automatisation of alignment remains for future research. Lemmatisation and morphological tagging are expected to become fully automatic in the near future, once the database of secondary sources Freya and the lemmatiser Norna have been completed.
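Norna's matching of stems against inflectional endings can be illustrated with a deliberately toy lookup (the stem/ending data below are simplified examples, not entries from the actual databases, and Norna's real matching is far richer):

```python
def assign_lemma(form, stems, endings):
    """Return (lemma, ending) if `form` splits into a known stem plus
    a known inflectional ending, else None.  Longest stem wins."""
    for i in range(len(form), 0, -1):
        stem, rest = form[:i], form[i:]
        if stem in stems and (rest == "" or rest in endings):
            return stems[stem], rest
    return None

# Simplified toy data mapping stem allomorphs to a citation form
STEMS = {"lufod": "lufian", "lufia": "lufian"}
ENDINGS = {"e", "on", "n", "st"}
```

A form with no recognised stem-plus-ending split falls into the roughly 20% of textual forms that, per the abstract, still need manual lemmatisation.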

Keywords: corpus linguistics, historical linguistics, Old English, parallel corpus

Procedia PDF Downloads 176
1240 Effect of Automatic Self Transcending Meditation on Perceived Stress and Sleep Quality in Adults

Authors: Divya Kanchibhotla, Shashank Kulkarni, Shweta Singh

Abstract:

Chronic stress and poor sleep quality reduce mental health and increase the risk of developing depression and anxiety. There is increasing evidence for the utility of meditation as an adjunct clinical intervention for conditions like depression and anxiety. The present study is an attempt to explore the impact of Sahaj Samadhi Meditation (SSM), a category of Automatic Self Transcending Meditation (ASTM), on perceived stress and sleep quality in adults. The study design was a single group pre-post assessment. The Perceived Stress Scale (PSS) and the Pittsburgh Sleep Quality Index (PSQI) were used in this study. Fifty-two participants filled the PSS, and 60 participants filled the PSQI at the beginning of the program (day 0), after two weeks (day 16) and at two months (day 60). Significant pre-post differences in the perceived stress level on Day 0 - Day 16 (p < 0.01; Cohen's d = 0.46) and Day 0 - Day 60 (p < 0.01; Cohen's d = 0.76) clearly demonstrated that by practicing SSM, participants experienced a reduction in perceived stress. The effect size of the intervention observed on the 16th day of assessment was small to medium, but on the 60th day, a medium to large effect size of the intervention was observed. In addition to this, significant pre-post differences in sleep quality on Day 0 - Day 16 and Day 0 - Day 60 (p < 0.05) clearly demonstrated that by practicing SSM, participants experienced an improvement in sleep quality. Compared with the Day 0 assessment, participants demonstrated significant improvement in the quality of sleep on Day 16 and Day 60. The effect size of the intervention observed on the 16th day of assessment was small, but on the 60th day, a small to medium effect size of the intervention was observed.
In the current study, we found that after practicing SSM for two months, participants reported a reduction in perceived stress: they felt more confident about their ability to handle personal problems, were able to cope with all the things they had to do, felt that they were on top of things, and felt less anger. Participants also reported that their overall sleep quality improved; they took less time to fall asleep, had fewer disturbances in sleep, and experienced less daytime dysfunction due to sleep deprivation. The present study provides clear evidence of the efficacy and safety of non-pharmacological interventions such as SSM in reducing stress and improving sleep quality. Thus, ASTM may be considered a useful intervention to reduce psychological distress in healthy, non-clinical populations, and it can be an alternative remedy for treating poor sleep among individuals and decreasing the use of harmful sedatives.
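The Cohen's d effect sizes quoted above can be reproduced from raw pre- and post-scores. A minimal sketch, using invented PSS scores rather than the study's data:

```python
import math
import statistics

def cohens_d(pre, post):
    """Cohen's d for a pre-post design: mean change over the pooled SD."""
    mean_diff = statistics.mean(pre) - statistics.mean(post)
    pooled_sd = math.sqrt(
        (statistics.stdev(pre) ** 2 + statistics.stdev(post) ** 2) / 2
    )
    return mean_diff / pooled_sd

# Made-up PSS scores for illustration (higher = more perceived stress).
day0 = [24, 28, 22, 30, 26, 25]
day60 = [18, 22, 17, 24, 20, 19]
d = cohens_d(day0, day60)
print(round(d, 2))
```

By convention, d around 0.2 is a small effect, around 0.5 medium, and 0.8 or above large, which is how the paper's 0.46 and 0.76 map to "small to medium" and "medium to large".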

Keywords: automatic self transcending meditation, Sahaj Samadhi meditation, sleep, stress

Procedia PDF Downloads 112
1239 Nuclear Near Misses and Their Learning for Healthcare

Authors: Nick Woodier, Iain Moppett

Abstract:

Background: It is estimated that one in ten patients admitted to hospital will suffer an adverse event in their care. While the majority of these will result in low harm, patients are being significantly harmed by the processes meant to help them. Healthcare, therefore, seeks to make improvements in patient safety by taking learning from other industries that are perceived to be more mature in their management of safety events. Of particular interest to healthcare are ‘near misses,’ those events that almost happened but for an intervention. Healthcare does not have any guidance as to how best to manage and learn from near misses to reduce the chances of harm to patients. The authors, as part of a larger study of near-miss management in healthcare, sought to learn from the UK nuclear sector to develop principles for how healthcare can identify, report, and learn from near misses to improve patient safety. The nuclear sector was chosen as an exemplar due to its status as an ultra-safe industry. Methods: A Grounded Theory (GT) methodology, augmented by a scoping review, was used. Data collection included interviews, scenario discussion, field notes, and the literature. The review protocol is accessible online. The GT aimed to develop theories about how nuclear manages near misses, with a focus on defining them and clarifying how best to support reporting and analysis to extract learning. Near misses related to radiation release or exposure were focused on. Results: Eight nuclear interviews contributed to the GT, across nuclear power, decommissioning, weapons, and propulsion. The scoping review identified 83 articles across a range of safety-critical industries, with only six focused on nuclear. The GT identified that nuclear has a particular focus on precursors and low-level events, with regulation supporting their management.
Exploration of definitions led to the recognition of the importance of several interventions in the sequence of events, and of not relying solely on humans, as these cannot be assumed to be robust barriers. Regarding reporting and analysis, no consistent methods were identified, but for learning, the role of operating experience learning groups was identified as an exemplar. The safety culture across nuclear, however, was reported to vary, which undermined reporting of near misses and other safety events. Some parts of the industry described that their focus on near misses is new and that, despite potential risks existing, progress to mitigate hazards is slow. Conclusions: Healthcare often sees ‘nuclear,’ as well as other ultra-safe industries such as ‘aviation,’ as homogeneous. However, the findings here suggest significant differences in safety culture and maturity across various parts of the nuclear sector. Healthcare can take learning from some aspects of the management of near misses in nuclear, such as how they are defined and how learning is shared through operating experience networks. However, healthcare also needs to recognise that variability exists across industries, and comparably, it may be more mature in some areas of safety.

Keywords: culture, definitions, near miss, nuclear safety, patient safety

Procedia PDF Downloads 82
1238 Nursing Students' Experience of Using Electronic Health Record System in Clinical Placements

Authors: Nurten Tasdemir, Busra Baloglu, Zeynep Cingoz, Can Demirel, Zeki Gezer, Barıs Efe

Abstract:

Student nurses are increasingly exposed to technology in the workplace after graduation, with the growing numbers of electronic health records (EHRs), handheld computers, barcode scanner medication dispensing systems, and automatic capture of patient data such as vital signs. Internationally, EHR systems are being implemented and evaluated. Students will inevitably encounter EHRs in the clinical learning environment and their professional practice, so nursing students must develop competency in the use of EHRs. Aim: The study aimed to examine nursing students’ experiences of learning to use electronic health records (EHR) in clinical placements. Method: This study adopted a descriptive approach. The study population consisted of second and third-year nursing students at the Zonguldak School of Health in the West Black Sea Region of Turkey; the study was conducted during the 2015–2016 academic year. The sample consisted of 315 (74.1% of 425 students) nursing students who volunteered to participate. The students, who were involved in clinical practice, were invited to participate in the study. Data were collected by a questionnaire designed by the researchers based on the relevant literature. Data were analyzed descriptively using the Statistical Package for Social Sciences (SPSS) for Windows version 16.0. The data are presented as means, standard deviations, and percentages. Approval for the study was obtained from the Ethical Committee of the University (Reg. Number: 29/03/2016/112) and the director of the Nursing Department. Findings: A total of 315 students enrolled in this study, for a response rate of 74.1%. The mean age of the sample was 22.24 ± 1.37 (min: 19, max: 32) years, and most participants (79.7%) were female. Most of the nursing students (82.3%) stated that they use information technologies in clinical practice. Nearly half of the students (42.5%) reported that they had not accessed an EHR system.
In addition, 61.6% of the students reported that insufficient computers were available in their clinical placements. Of the students, 84.7% reported that they prefer to obtain patient information from an EHR system, and 63.8% found it more effective in preparing for clinical reporting. Conclusion: This survey indicated that nursing students learn about EHR systems in clinical placements. For a more effective learning environment, nursing education should prepare nursing students for EHR systems during their education.

Keywords: electronic health record, clinical placement, nursing student, nursing education

Procedia PDF Downloads 259
1237 A Review of the Handling and Disposal of Botulinum Toxin in a Maxillofacial Unit

Authors: Ashana Gupta

Abstract:

Aim: In the UK, Botulinum Toxin (botox) is authorised for treating chronic myofascial pain secondary to masseter muscle hypertrophy (Fedorowicz et al. 2013). This audit aimed to ensure the Maxillofacial Unit is meeting the trust guidelines for the safe storage and disposal of botox. Method: The trust upholds a strict policy for botox handling. The audit was designed to assess several elements, including staff awareness of regulations around botox handling: a questionnaire was designed to test knowledge of advised storage temperatures, reporting of adverse events, disposal procedures, and regulatory authorities. It also assessed the steps taken to safely deliver toxin and eliminate unused toxin: a checklist was completed, with marks for storage temperature, identification checks, disposal of sharps, deactivation of toxin, and disposal. Results: All staff correctly stated the storage requirements for toxin. 75% of staff (n=8) were unsure about reporting and regulations. Whilst all staff knew how to dispose of vials, 0% of staff showed awareness of the crucial step of deactivating toxin. All checklists (n=20) scored 100% for adequate storage, ID checks, and toxin disposal. However, no steps were taken to deactivate toxin in any case. Staff training took place, with revisions to clinical protocols. In line with Trust guidelines, an additional clinical step has been introduced: the use of 0.5% sodium hypochlorite to deactivate botox. Conclusion: Deactivation is crucial to ensure residual toxin is not misused; there are cases of stolen botox within South-Tees Hospital (Woodcock, 2014). This audit was successful in increasing compliance with the safe handling and disposal of botox by 100% and ensured our hospital meets Trust guidance.

Keywords: botulinum toxin, aesthetics, handling, disposal

Procedia PDF Downloads 167
1236 Implementing the Quality of Care Partnership to Reduce the Cost of Screenings for Sexually Transmitted Infections on a Southeastern College Campus

Authors: Amy Guidera, Steven Busby, Christian Williams, David Phillippi

Abstract:

College students are a priority preventative healthcare population that can engage in high-risk behaviors which may concurrently increase the potential for unsafe sexual practices, including contracting sexually transmitted infections (STIs). Early education, screening, treatment, and partner notification are important interventions for breaking the chain of transmission and recurrence, preventing poor health outcomes, and mitigating college dropout rates. The aim of this quality improvement project was to determine whether a reduction in STI screening costs for college students (aged 18-30 years old) would increase the number of STI screenings conducted at a university health center over the course of an academic semester, while evaluating our ability to achieve an improved quality of care at a reduced cost, along with improved STI reporting and documentation. This study was conducted through retrospective chart reviews of STI-related visits and utilized the RADAR matrix to provide a guiding, iterative mechanism to continuously reassess goals and outcomes defined in a memorandum of agreement (MOA) between a university health center and the state department of health (DOH) laboratory. The project failed to increase the number of STI screenings, most likely due to the emergence of COVID-19, but resulted in improved quality of care for students, improved STI-related visit documentation and reporting, and significantly reduced costs of STI screening for collegiate students at a southeastern private university campus.

Keywords: college health, college students, preventive health, reproductive health, sexually transmitted infections, young adults

Procedia PDF Downloads 114
1235 The Role of Situational Attribution Training in Reducing Automatic In-Group Stereotyping in Females

Authors: Olga Mironiuk, Małgorzata Kossowska

Abstract:

The aim of the present study was to investigate the influence of Situational Attribution Training on reducing automatic in-group stereotyping in females. The experiment was conducted with control of age and level of prejudice. 90 female participants were randomly assigned to two conditions: an experimental and a control group (each group was also divided into a younger- and an older-aged condition). Participants in the experimental condition were subjected to more extensive training. In the first part of the experiment, the experimental group took part in the first session of Situational Attribution Training while the control group participated in the Grammatical Training Control. In the second part of the research, both groups took part in the Situational Attribution Training (which was the second training session for the experimental group and the first one for the control condition). The training procedure was based on descriptions of ambiguous situations which could be explained using situational or dispositional attributions. The participants’ task was to choose the situational explanation from two alternatives, of which the second presented an explanation based on traits that were neutral or stereotypically associated with women. Moreover, the experimental group took part in a third training session after a two-day delay, in order to check the persistence of the training effect. The main hypothesis stated that among participants taking part in the more extensive training, automatic in-group stereotyping would be less frequent after the training sessions. The effectiveness of the training was tested by measuring response time and the correctness of answers: a longer response time for the examples where one of the two possible answers was based on a stereotypical trait, together with higher correctness of answers, was considered proof of the training's effectiveness.
As the participants’ level of prejudice was controlled (using the Ambivalent Sexism Inventory), it was also assumed that the training effect would be weaker for participants revealing a higher level of prejudice. The obtained results did not confirm the hypothesis based on response time: participants from the experimental group responded faster in the case of situations where one of the possible explanations was based on a stereotypical trait. However, an interesting observation was made during the analysis of the answers’ correctness: regardless of condition and age group affiliation, participants made more mistakes while choosing the situational explanations when the alternative was based on a stereotypical trait associated with the dimension of warmth. What is more, the correctness of answers was higher in the third training session for the experimental group when the alternative situational explanation was based on a stereotypical trait associated with the dimension of competence. The obtained results partially confirm the effectiveness of the training.

Keywords: female, in-group stereotyping, prejudice, situational attribution training

Procedia PDF Downloads 158
1234 A First Step towards Automatic Evolutionary for Gas Lifts Allocation Optimization

Authors: Younis Elhaddad, Alfonso Ortega

Abstract:

Oil production by means of gas lift is a standard technique in the oil production industry. Optimizing the total amount of oil produced in terms of the amount of gas injected is a key question in this domain. Different methods have been tested to propose a general methodology. Many of them apply well-known numerical methods. Some of them have taken into account the power of evolutionary approaches. Our goal is to provide the experts of the domain with a powerful automatic searching engine into which they can introduce their knowledge in a format close to the one used in their domain, and get solutions comprehensible in the same terms, as well. These proposals introduced into the genetic engine the most expressive formal models to represent the solutions to the problem. These algorithms have proven to be as effective as other genetic systems but more flexible and comfortable for the researcher, although they usually require huge search spaces to justify their use due to the computational resources involved in the formal models. The first step in evaluating the viability of applying our approaches to this realm is to fully understand the domain and to select an instance of the problem (gas lift optimization) in which applying genetic approaches seems promising. After analyzing the state of the art of this topic, we have decided to choose a previous work from the literature that faces the problem by means of numerical methods. This contribution includes details enough to be reproduced and complete data to be carefully analyzed. We have designed a classical, simple genetic algorithm just to try to get the same results and to understand the problem in depth. We could easily incorporate the well data used by the authors and translate their mathematical model, to be numerically optimized, into a proper fitness function.
We have analyzed the 100 well curves they use in their experiment and observed similar results; in addition, our system automatically inferred an optimum total amount of injected gas for the field compatible with the sum of the optimum gas injected in each well that they report. We have identified several constraints that could be interesting to incorporate into the optimization process but that could be difficult to express numerically. It could also be interesting to automatically propose other mathematical models to fit both individual well curves and the behaviour of the complete field. All these facts and conclusions justify continuing to explore the viability of applying the more sophisticated approaches previously proposed by our research group.
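A classical, simple genetic algorithm of the kind described can be sketched as follows. The well-performance coefficients, the gas budget, and the GA settings below are illustrative only, not the authors' field data or their actual system: each well's oil rate is modelled as a concave function of injected gas, and the chromosome is an allocation of the gas budget across wells.

```python
import random

random.seed(1)

# Toy gas-lift performance curves: oil rate as a concave function of the
# gas injected into each well (coefficients are invented, not field data).
WELLS = [(2.0, 0.10), (1.5, 0.05), (1.0, 0.02)]
TOTAL_GAS = 30.0  # total gas available for injection

def oil_rate(gas, a, b):
    return a * gas - b * gas ** 2  # diminishing returns per well

def fitness(alloc):
    return sum(oil_rate(g, a, b) for g, (a, b) in zip(alloc, WELLS))

def normalise(alloc):
    s = sum(alloc)
    return [g * TOTAL_GAS / s for g in alloc]  # enforce the gas budget

def evolve(pop_size=40, gens=200):
    pop = [normalise([random.random() for _ in WELLS]) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            p, q = random.sample(parents, 2)
            # averaging crossover plus Gaussian mutation
            child = [(x + y) / 2 + random.gauss(0, 0.5) for x, y in zip(p, q)]
            children.append(normalise([max(x, 0.01) for x in child]))
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print([round(g, 1) for g in best], round(fitness(best), 2))
```

For these toy curves, the constrained optimum can be checked analytically via Lagrange multipliers (allocation 7.5/10/12.5, total oil 28.75), so the GA's result can be validated, which mirrors the paper's strategy of reproducing a numerically solved instance before moving to richer formal models.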

Keywords: evolutionary automatic programming, gas lift, genetic algorithms, oil production

Procedia PDF Downloads 142
1233 Analysis of Urban Rail Transit Station's Accessibility Reliability: A Case Study of Hangzhou Metro, China

Authors: Jin-Qu Chen, Jie Liu, Yong Yin, Zi-Qi Ju, Yu-Yao Wu

Abstract:

Increases in travel fares and station failures have a huge impact on passengers’ travel. The accessibility reliability of Urban Rail Transit (URT) stations under increasing travel fares and station failures is analyzed in this paper. Firstly, passengers’ travel paths are reconstructed based on stochastic user equilibrium and Automatic Fare Collection (AFC) data. Secondly, each station’s importance is calculated by combining the LeaderRank algorithm and the Ratio of Station-Affected Passenger Volume (RSAPV), and station accessibility evaluation indicators are proposed based on an analysis of passengers’ travel characteristics. Thirdly, station accessibility under different scenarios is measured, and the rate of accessibility change is proposed as the indicator of a station’s accessibility reliability. Finally, the accessibility of Hangzhou metro stations is analyzed with the formulated models. The results show that Jinjiang station and Liangzhu station are the most important and the most convenient stations in the Hangzhou metro, respectively. Station failure has a huge impact on station accessibility, whereas an increase in travel fare does not. Stations on Hangzhou metro Line 1 have relatively poor accessibility reliability, and Fengqi Road station’s accessibility reliability is the weakest. For the Hangzhou metro operational department, constructing new metro lines around Line 1 and preferentially protecting Line 1’s stations can effectively improve the accessibility reliability of the Hangzhou metro.
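The LeaderRank step can be illustrated on a toy network. LeaderRank extends PageRank by adding a "ground" node linked to every station, which removes the need for a tunable damping parameter; the five-station adjacency below is invented, whereas the paper applies the algorithm to the Hangzhou metro and combines the score with RSAPV.

```python
def leaderrank(edges, n, iters=200):
    """LeaderRank: link a ground node to every station, run a random walk
    until scores stabilise, then share the ground score among stations."""
    # Build undirected adjacency lists, with node n acting as the ground node.
    nbrs = {i: set() for i in range(n + 1)}
    for u, v in edges:
        nbrs[u].add(v)
        nbrs[v].add(u)
    for i in range(n):
        nbrs[i].add(n)
        nbrs[n].add(i)
    scores = [1.0] * n + [0.0]  # stations start at 1, ground at 0
    for _ in range(iters):
        new = [0.0] * (n + 1)
        for j in range(n + 1):
            share = scores[j] / len(nbrs[j])  # each node splits its score
            for i in nbrs[j]:
                new[i] += share
        scores = new
    # redistribute the ground node's score evenly over the real stations
    return [s + scores[n] / n for s in scores[:n]]

# Toy 5-station network: a line 0-1-2-3 with a branch 1-4 (illustrative).
ranks = leaderrank([(0, 1), (1, 2), (2, 3), (1, 4)], 5)
print(max(range(5), key=lambda i: ranks[i]))  # the hub/transfer station
```

On this toy graph, the transfer station (node 1) receives the highest score, which matches the intuition that interchange stations are the most important ones in a metro network.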

Keywords: automatic fare collection data, AFC, station’s accessibility reliability, stochastic user equilibrium, urban rail transit, URT

Procedia PDF Downloads 109
1232 The Promotion of a Risk Culture: A Descriptive Study of Ghanaian Banks

Authors: Gerhard Grebe, Johan Marx

Abstract:

The aim of the study is to assess the state of operational risk management and the adoption of an appropriate risk culture in Ghanaian banks. The Bank of Ghana (BoG) joined the Basel Consultative Group (BCG) of the Basel Committee on Bank Supervision (BCBS) in 2021 and is proceeding with the implementation of the Basel III international regulatory framework for banks. The BoG’s Directive on risk management encourages, inter alia, the creation of an appropriate risk culture by Ghanaian banks. However, it is not evident how the risk management staff of Ghanaian banks experience the risk culture and the implementation of operational risk management in the banks where they are employed. Ghana is a developing economy, and it is addressing challenges with its organisational culture. According to Transparency International, successive Ghanaian governments claim to be fighting corruption, but little success has been achieved so far. This points to a possible lack of accountability, transparency, and integrity in the environment in which Ghanaian banks operate, which could influence their risk culture negatively. Purposive sampling was used for the survey, and the questionnaire was completed by Ghanaian bank personnel who specialize in operational risk management, risk governance, compliance, bank supervision, risk analyses, and the implementation of the operational risk management requirements of the Basel regulatory frameworks. The respondents indicated that they are fostering a risk culture and implementing monitoring and reporting procedures; the three lines of defence (3LOD); compliance; internal auditing; disclosure of operational risk information; and receiving guidance from the bank supervisor in an attempt to improve their operational risk management practices.
However, the respondents reported the following challenges with staff members outside the risk management departments (in order of priority): demonstrating a risk culture; training and development; communication; reporting and disclosure; roles and responsibilities; performance appraisal; and technological and environmental barriers. Recommendations to address these challenges are provided.

Keywords: Ghana, operational risk, risk culture, risk management

Procedia PDF Downloads 87
1231 Population Dynamics and Land Use/Land Cover Change on the Chilalo-Galama Mountain Range, Ethiopia

Authors: Yusuf Jundi Sado

Abstract:

Changes in land use are mostly credited to human actions that result in negative impacts on biodiversity and ecosystem functions. This study aims to analyze the dynamics of land use and land cover changes on the Chilalo-Galama Mountain Range, Ethiopia, for sustainable natural resource planning and management. The study used Landsat 5 Thematic Mapper (TM) data for 1986 and 2001 and Landsat 8 (OLI) data for 2017. Additionally, data from the Central Statistics Agency on human population growth were analyzed. The Semi-Automatic Classification Plugin (SCP) in QGIS 3.2.3 software was used for image classification. Global positioning system data, field observations, and focus group discussions were used for ground verification. Land use/land cover (LU/LC) change analysis was conducted using maximum likelihood supervised classification, and changes were calculated for the 1986–2001, 2001–2017, and 1986–2017 periods. The results show that agricultural land increased from 27.85% (1986) to 44.43% (2001) and 51.32% (2017), with overall accuracies of 92% (1986), 90.36% (2001), and 88% (2017). On the other hand, forest decreased from 8.51% (1986) to 7.64% (2001) and 4.46% (2017), and grassland decreased from 37.47% (1986) to 15.22% (2001) and 15.01% (2017). For the 1986–2017 period, the largest gain in agricultural land was obtained from grassland. The transition matrix also shows that shrubland gained land from agricultural land, afro-alpine, and forest land. Population dynamics is found to be one of the major driving forces of the LU/LC changes in the study area.
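Statements such as "the largest gain in agricultural land was obtained from grassland" come from cross-tabulating the two classified maps into a transition matrix. A minimal sketch with invented pixel labels (real inputs would be the 1986 and 2017 classification rasters):

```python
from collections import Counter

def change_matrix(map_old, map_new):
    """Count pixels per (old class, new class) pair - the LU/LC
    transition matrix underlying the change analysis."""
    return Counter(zip(map_old, map_new))

# Illustrative per-pixel class labels for the two dates (made up).
map_1986 = ["grassland", "grassland", "forest", "agriculture", "grassland"]
map_2017 = ["agriculture", "agriculture", "shrubland", "agriculture", "grassland"]

m = change_matrix(map_1986, map_2017)
print(m[("grassland", "agriculture")])  # pixels converted to farmland
```

Multiplying each count by the pixel area then gives the hectares converted between each pair of classes for the period.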

Keywords: Landsat, LU/LC change, Semi-Automatic Classification Plugin, population dynamics, Ethiopia

Procedia PDF Downloads 56
1230 Analysis of Unconditional Conservatism and Earnings Quality before and after the IFRS Adoption

Authors: Monica Santi, Evita Puspitasari

Abstract:

International Financial Reporting Standards (IFRS) have developed principle-based accounting standards. On this basis, the IASB eliminated the conservatism concept from the accounting framework. The conservatism concept represents a prudent reaction to uncertainty, to try to ensure that the uncertainties and risks inherent in business situations are adequately considered. The conservatism concept has two ingredients: conditional conservatism or ex-post (news-dependent prudence) and unconditional conservatism or ex-ante (news-independent prudence). IFRS in substance disregards unconditional conservatism because it can cause understated assets or overstated liabilities, and eventually the financial statements would be irrelevant, since the information would not represent the real facts. This is why the IASB eliminated the conservatism concept. However, this has not decreased the practice of unconditional conservatism in financial statement reporting. Therefore, we expected earnings quality to be affected by this situation, even though the IFRS implementation was expected to increase earnings quality. The objective of this study was to provide empirical findings about unconditional conservatism and earnings quality before and after the IFRS adoption. Earnings per accrual was used as the proxy for unconditional conservatism: if earnings per accrual were negative (positive), the company was classified as conservative (not conservative). Earnings quality was defined as the ability of earnings to reflect future earnings, considering earnings persistence and stability. We used the earnings response coefficient (ERC) as the proxy for earnings quality. The ERC measures the extent of a security’s abnormal market return in response to the unexpected component of the reported earnings of the firm issuing that security. A higher ERC indicates higher earnings quality.
The manufacturing companies listed on the Indonesian Stock Exchange (IDX) were used as the sample companies, with the 2009-2010 period representing the condition before the IFRS adoption and 2011-2013 representing the condition after the IFRS adoption. Data were analyzed using the Mann-Whitney test and regression analysis. We used firm size as the control variable, on the consideration that firm size would affect the earnings quality of the company. This study proved that unconditional conservatism did not change, either before or after the IFRS adoption period. However, we found different results for earnings quality: earnings quality decreased after the IFRS adoption period. This empirical result implies that earnings quality before the IFRS adoption was higher. Even though the empirical results show that unconditional conservatism gave a positive influence to earnings quality, the influence was not significant. Thus, we concluded that the implementation of the IFRS neither decreased the practice of unconditional conservatism nor increased the earnings quality of the manufacturing companies.
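The ERC described above is, in its simplest form, the OLS slope of abnormal returns on the unexpected earnings component. A minimal sketch with invented figures, not the study's IDX data:

```python
def erc(unexpected_earnings, abnormal_returns):
    """Earnings response coefficient: OLS slope of abnormal returns
    on the unexpected component of earnings."""
    n = len(unexpected_earnings)
    mx = sum(unexpected_earnings) / n
    my = sum(abnormal_returns) / n
    cov = sum((x - mx) * (y - my)
              for x, y in zip(unexpected_earnings, abnormal_returns))
    var = sum((x - mx) ** 2 for x in unexpected_earnings)
    return cov / var

# Invented example: abnormal returns react exactly twice as strongly as
# the earnings surprise, so the ERC comes out at 2.0.
ue = [-0.02, 0.01, 0.03, -0.01, 0.02]
ar = [-0.04, 0.02, 0.06, -0.02, 0.04]
print(erc(ue, ar))
```

In the study's setting, a steeper slope means the market trusts reported earnings as a signal of future earnings, which is why a higher ERC proxies higher earnings quality.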

Keywords: earnings quality, earnings response coefficient, IFRS Adoption, unconditional conservatism

Procedia PDF Downloads 239
1229 Automatic and High Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and propagate the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to be computed efficiently. Therefore, such models might not be applicable for the numerical optimization of real systems, since these techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the system must be adapted manually. Therefore, an approach is described that generates models that overcome the aforementioned limitations by focusing not on physical laws, but on measured (sensor) data of real systems. The approach is more general, since it generates models for every system detached from the scientific background. Additionally, this approach can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables and not only of single variables. This enables a far more precise and better representation of causal correlations. The basis and the explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. In this way, system changes due to aging, wear or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g.
time, energy consumption, operational costs or a mixture of them, subject to additional constraints. The proposed method has successfully been tested in several complex applications and with strong industrial requirements. The generated models were able to simulate the given systems with an error in precision of less than one percent. Moreover, the automatic identification of correlations was able to discover hitherto unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise and real-time-adaptive data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model to be optimized according to the user's wishes. The proposed methods will be illustrated with different examples.
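The core idea, regressing on products of variables as well as on the variables themselves (a truncated series expansion), can be sketched in a few lines. The data, the model form, and the plain gradient-descent fit below are illustrative; the actual method is more general and runs in real time.

```python
def features(x1, x2):
    """Truncated series expansion: constant, linear, and product terms."""
    return [1.0, x1, x2, x1 * x2]

def predict(w, x1, x2):
    return sum(wi * fi for wi, fi in zip(w, features(x1, x2)))

def fit(samples, lr=0.05, epochs=2000):
    """Plain per-sample gradient descent on squared prediction error."""
    w = [0.0] * 4
    for _ in range(epochs):
        for x1, x2, y in samples:
            err = predict(w, x1, x2) - y
            w = [wi - lr * err * fi
                 for wi, fi in zip(w, features(x1, x2))]
    return w

# Synthetic sensor data generated by y = 2 + x1 + 3*x1*x2: the product
# term is exactly what a single-variable regression would miss.
data = [(x1 / 4, x2 / 4, 2 + x1 / 4 + 3 * (x1 / 4) * (x2 / 4))
        for x1 in range(5) for x2 in range(5)]
w = fit(data)
print([round(wi, 2) for wi in w])
```

Because the product term x1*x2 is included as a feature, the fit recovers the interaction that drives the data, which is the "correlations of products of variables" that the text highlights; re-running `fit` on fresh measurements would correspond to the real-time adaptation during operation.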

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 376
1228 Comparative Study of Conventional and Satellite Based Agriculture Information System

Authors: Rafia Hassan, Ali Rizwan, Sadaf Farhan, Bushra Sabir

Abstract:

The purpose of this study is to compare the conventional crop monitoring system with the satellite-based crop monitoring system in Pakistan. The study was conducted for SUPARCO (Space and Upper Atmosphere Research Commission) and focused on the wheat crop, as it is the main cash crop of Pakistan and of the province of Punjab. It answers the question: which system is better in terms of cost, time, and manpower? The manpower calculated for the Punjab CRS (Crop Reporting Service) is 1,418 personnel, against 26 personnel for SUPARCO. The total cost calculated for SUPARCO is almost 13.35 million, versus 47.705 million for the CRS. The man hours calculated for the CRS are 1,543,200 hrs (136 days), against 8,320 hrs (40 days) for SUPARCO, meaning that SUPARCO workers finish their work 96 days earlier than CRS workers. The results show that the satellite-based crop monitoring system is more efficient in terms of manpower, cost, and time than the conventional system, and it also generates earlier crop forecasts and estimations. The research instruments used included interviews, physical visits, group discussions, questionnaires, and the study of reports and workflows. A total of 93 employees were selected using Yamane's formula for data collection, which was carried out with the help of questionnaires and interviews. Comparative graphing was used to analyse the data and formulate the results of the research. The findings also demonstrate that although conventional methods still have a strong impact on crop monitoring in Pakistan, it is time to bring about change through technology so that agriculture too can develop along modern lines.
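The reported workload figures can be cross-checked with a few lines of arithmetic. The 8-hour working day used below to convert man hours into elapsed days is an assumption, chosen because it is consistent with the quoted figures:

```python
# Figures reported by the study: personnel, total man hours, and elapsed
# days for the conventional Crop Reporting Service (CRS) versus
# SUPARCO's satellite-based system.
crs = {"personnel": 1418, "man_hours": 1_543_200, "days": 136}
suparco = {"personnel": 26, "man_hours": 8_320, "days": 40}

HOURS_PER_DAY = 8  # assumed working day

for name, s in (("CRS", crs), ("SUPARCO", suparco)):
    per_person = s["man_hours"] / s["personnel"]
    print(f"{name}: {per_person:.0f} h/person "
          f"= {per_person / HOURS_PER_DAY:.0f} working days")

print("SUPARCO finishes", crs["days"] - suparco["days"], "days earlier")
```

Per person, the CRS workload is 1,543,200 / 1,418 ≈ 1,088 hours (136 eight-hour days), while SUPARCO's is 8,320 / 26 = 320 hours (40 days), reproducing the 96-day difference quoted in the abstract.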

Keywords: area frame, crop reporting service, CRS, sample frame, SRS/GIS, satellite remote sensing/ geographic information system

Procedia PDF Downloads 264
1227 Child Abuse: Emotional, Physical, Neglect, Sexual and the Psychological Effects: A Case Scenario in Lagos State

Authors: Aminu Ololade Matilda

Abstract:

Child abuse is a significant issue worldwide, affecting the socio-development and the mental and physical health of young individuals. It is the maltreatment of a child by an adult or by another child. This paper focuses on child abuse in communities in Lagos State. The aim of the study is to investigate the extent of child abuse and its impact on the mood, social activities, self-worth, concentration, and academic performance of children in these communities. The primary research instrument was the forensic interview, which consisted of two sections: the first gathered data on the child and on the forms and impacts of the abuse experienced, while the second focused on parental style. The study found that children who experienced various forms of abuse, such as emotional abuse, neglect, physical abuse, or sexual abuse, were hesitant to report it for fear of threats or even death at the hands of the abuser. These abused children displayed withdrawn behaviour, depression, and low self-worth, and they underperformed academically compared to peers who had not experienced abuse. The findings align with social-learning and intergenerational transmission of violence theories, which suggest that parents and caregivers who engage in child abuse often do so because they themselves experienced or witnessed abuse as children, thereby normalizing violence. The study highlights the prevalence of child abuse in Lagos State and emphasizes the need for advocacy programmes and capacity building to raise awareness of child abuse and its prevention. The distribution of the Child's Rights Act across various sectors is also recommended to underscore the importance of protecting children's rights. Additionally, the inclusion of courses on child abuse in the school curriculum is proposed to ensure that children are educated on recognizing and reporting abuse.

Keywords: abuse, child, awareness, effects, emotional, neglect, physical, psychological, sexual, recognize, reporting, right

Procedia PDF Downloads 43
1226 Surveillance of Adverse Events Following Immunization during New Vaccines Introduction in Cameroon: A Cross-Sectional Study on the Role of Mobile Technology

Authors: Andreas Ateke Njoh, Shalom Tchokfe Ndoula, Amani Adidja, Germain Nguessan Menan, Annie Mengue, Eric Mboke, Hassan Ben Bachir, Sangwe Clovis Nchinjoh, Yauba Saidu, Laurent Cleenewerck De Kiev

Abstract:

Vaccines play a major role in protecting populations globally. Vaccine products are subject to rigorous quality control and approval before use to ensure safety. Yet even when all actors take the required precautions, some people may still experience adverse events following immunization (AEFI), caused either by the vaccine composition or by an error in its administration. AEFI under-reporting is pronounced in low-income settings like Cameroon, and the country has introduced electronic platforms to strengthen surveillance. With the introduction of several novel vaccines, such as the COVID-19 vaccines and the novel Oral Polio Vaccine type 2 (nOPV2), there was a need to monitor AEFI in the country. A cross-sectional study was conducted from July to December 2022, in which data on AEFI per region of Cameroon over the previous five years were reviewed, analyzed with MS Excel, and presented as proportions. AEFI reporting had been uncommon in Cameroon. With the introduction of the novel vaccines in 2021, the health authorities deployed new tools and training to capture cases. The number of AEFI detected almost doubled with the Open Data Kit (ODK) compared with previous platforms, especially following the introduction of the nOPV2 and COVID-19 vaccines. The AEFI rates were 1.9 and 160 per 100,000 administered doses of nOPV2 and COVID-19 vaccines, respectively. The mobile tool captured individual information on people with AEFI from all regions and helped to identify common AEFI following the use of these new vaccines. ODK mobile technology was thus vital in improving AEFI reporting and in providing data for monitoring the use of new vaccines in Cameroon.
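The quoted rates follow from the standard cases-per-doses formula. The case and dose counts below are hypothetical, chosen only to reproduce the reported rates of 1.9 and 160 per 100,000 administered doses; the abstract does not give the underlying counts:

```python
def aefi_rate_per_100k(cases: int, doses_administered: int) -> float:
    """AEFI reporting rate per 100,000 administered doses."""
    return cases / doses_administered * 100_000

# Hypothetical counts reproducing the reported rates:
print(aefi_rate_per_100k(19, 1_000_000))     # nOPV2-style rate
print(aefi_rate_per_100k(1_600, 1_000_000))  # COVID-19-style rate
```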

Keywords: adverse events following immunization, Cameroon, COVID-19 vaccines, nOPV, ODK

Procedia PDF Downloads 54
1225 Statistical Feature Extraction Method for Wood Species Recognition System

Authors: Mohd Iz'aan Paiz Bin Zamri, Anis Salwa Mohd Khairuddin, Norrima Mokhtar, Rubiyah Yusof

Abstract:

Effective statistical feature extraction and classification are important in image-based automatic inspection and analysis. An automatic wood species recognition system is designed to perform wood inspection at customs checkpoints to avoid the mislabeling of timber, which results in a loss of income for the timber industry. The system focuses on analyzing the statistical properties of pores in wood images. This paper proposes a fuzzy-based feature extractor that mimics experts' knowledge of wood texture to extract the properties of the pore distribution from the wood surface texture. The proposed feature extractor consists of two steps, namely pore extraction and fuzzy pore management, and extracts a total of 38 statistical features from each wood image. A backpropagation neural network is then used to classify the wood species based on these statistical features. A comprehensive set of experiments on a database of 5,200 macroscopic images from 52 tropical wood species was used to evaluate the performance of the proposed feature extractor. The advantage of the proposed technique is that it mimics the experts' interpretation of wood texture, which allows human involvement in analyzing the texture. Experimental results show the efficiency of the proposed method.
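The paper's exact fuzzy rules and full 38-feature set are not given here; a minimal sketch of the fuzzy pore-management idea, with hypothetical triangular membership functions and pore-diameter categories standing in for the expert-defined linguistic terms, might look like this:

```python
def triangular(x, a, b, c):
    # Triangular fuzzy membership: rises from a to the peak at b,
    # then falls back to zero at c.
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical pore-diameter categories (in pixels) standing in for the
# experts' linguistic terms "small", "medium", and "large" pores.
CATEGORIES = {
    "small": (0, 5, 12),
    "medium": (8, 15, 25),
    "large": (20, 30, 45),
}

def fuzzy_pore_features(pore_diameters):
    # Aggregate the membership of every detected pore in each category,
    # yielding a fixed-length statistical feature vector regardless of
    # how many pores the image contains.
    feats = {}
    for name, (a, b, c) in CATEGORIES.items():
        m = [triangular(d, a, b, c) for d in pore_diameters]
        feats[f"{name}_mean"] = sum(m) / len(m)
        feats[f"{name}_count"] = sum(1 for v in m if v > 0.5)
    return feats

# Six hypothetical pore diameters measured from one wood image.
print(fuzzy_pore_features([4, 6, 14, 16, 29, 31]))
```

Fixed-length vectors of this kind (together with other statistics, up to the paper's 38 features) are what a backpropagation neural network would then consume for species classification; the overlapping category boundaries are what let the extractor encode the gradual distinctions an expert makes.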

Keywords: classification, feature extraction, fuzzy, inspection system, image analysis, macroscopic images

Procedia PDF Downloads 399