Search results for: integration features
5411 WhatsApp Application and Challenges of Radio Broadcasting in Northern Nigeria: Special Interest on FRCN Kaduna
Authors: Aliyu Damri
Abstract:
This study analyzed the emergence of WhatsApp and how employees at the Federal Radio Corporation of Nigeria (FRCN), Kaduna defined the concept, based on their broadcasting experience of over five decades, and its application to the radio station. It also analyzed the nature, patterns, dimensions, features, challenges and effects of WhatsApp as a social networking site, with specific interest in the radio outlet. The study identified how the radio organization responded to these challenges in an attempt to adapt to a new pattern of broadcasting characterized by many technological transformations. It further explained in detail the skills journalists need to function optimally using WhatsApp, as well as the impact of WhatsApp on radio broadcasting. It used a combination of published materials, focus group discussions, in-depth interviews and participant observation of the activities of the radio station to address the research questions. The data generated provided insight into the challenges posed to FRCN Kaduna by the WhatsApp application and how the station responded to them, as well as the skills journalists need to use WhatsApp optimally in the radio station. The interview and focus group discussion transcripts and the published materials were analyzed along thematic patterns related to the research questions. The dominant response relied heavily on change in the radio station’s organizational and technical integration of newsrooms, the use of a multiskilled workforce, application of flexible and user-friendly technology in all aspects of production, expansion of the station’s services into new media such as the internet and mobile phones, and sharing of ideas across different units in the radio outfit.
Keywords: broadcasting, challenge, northern Nigeria, radio, WhatsApp application
Procedia PDF Downloads 129
5410 Some Imaginative Geomorphosites in Malaysia: Study on Their Formations and Geotourism Potentials
Authors: Dony Adriansyah Nazaruddin, Mohammad Muqtada Ali Khan
Abstract:
This paper aims to present some imaginative geomorphological sites in Malaysia. The study comprises a desk study and a field study. The desk study was conducted by reviewing literature related to the topic and to some geomorphosites in Malaysia. The field study was organized in 2013 and 2014 to investigate the recent situation of these sites and to take measurements, photographs and rock samples. Examples of imaginative geomorphosites all over Malaysia have been identified for this purpose. In Peninsular Malaysia, some geomorphosites in the Langkawi Islands (the state of Kedah) have imaginative features, such as a “turtle” atop the limestone hill of the Setul Formation at the Kilim Geoforest Park, a “shoe” at Kasut island of the Kilim Geoforest Park, a “lying pregnant lady” at Dayang Bunting island of the Dayang Bunting Marble Geoforest Park, and a “ship” at Singa Kecil island. Some other examples are from the state of Kelantan, such as a mogote hill with a “human face looking upward” at Gunung Reng, Jeli District, and a “boat rock” at Mount Chamah, Gua Musang District. In East Malaysia, only one example could be identified: the “Abraham Lincoln’s face” at the Deer Cave, Gunung Mulu National Park, Sarawak. Karst landforms dominate the imaginative geomorphosites in Malaysia. The formation of these features is affected by endogenic and exogenic processes, such as tectonic uplift, weathering (including solution) and erosion. The study recommends that these imaginative features be conserved and developed for purposes such as research, education, and geotourism development in Malaysia.
Keywords: geomorphosite, geotourism, earth processes, karst landforms, Malaysia
Procedia PDF Downloads 626
5409 Best-Performing Color Space for Land-Sea Segmentation Using Wavelet Transform Color-Texture Features and Fusion of Over Segmentation
Authors: Seynabou Toure, Oumar Diop, Kidiyo Kpalma, Amadou S. Maiga
Abstract:
Color and texture are the two most determinant elements for the perception and recognition of objects in an image. For this reason, color and texture analysis find a large field of application, for example in image classification and segmentation. However, the pioneering work in texture analysis was conducted on grayscale images, thus discarding color information. Many grey-level texture descriptors have been proposed and successfully used in numerous domains for image classification: face recognition, industrial inspection, food science, and medical imaging, among others. Taking color into account in the definition of these descriptors makes it possible to better characterize images. Color texture is thus the subject of recent work, and the analysis of color texture images is increasingly attracting interest in the scientific community. In optical remote sensing systems, sensors separately measure different parts of the electromagnetic spectrum: the visible bands and even those invisible to the human eye. The amounts of light reflected by the earth in these spectral bands are then transformed into grayscale images. The primary natural colors Red (R), Green (G) and Blue (B) are then used in mixtures of different spectral bands in order to produce RGB images. Thus, good color texture discrimination can be achieved using RGB under controlled illumination conditions. Some previous works have investigated the effect of using different color spaces for color texture classification. However, the selection of the best-performing color space for land-sea segmentation is an open question. Its resolution may bring considerable improvements in certain applications like coastline detection, where the detection result is strongly dependent on the performance of the land-sea segmentation. The aim of this paper is to present the results of a study conducted on different color spaces in order to show the best-performing color space for land-sea segmentation.
To this end, an experimental analysis is carried out using five different color spaces (RGB, XYZ, Lab, HSV, YCbCr). For each color space, the Haar wavelet decomposition is used to extract different color texture features. These color texture features are then used for Fusion of Over Segmentation (FOOS) based classification; this allows segmentation of the land part from the sea. The results of this study show that the HSV color space gives the best classification performance when color and texture features are used, which is perfectly coherent with the results presented in the literature.
Keywords: classification, coastline, color, sea-land segmentation
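As a rough illustration of the color-texture features this abstract describes, the sketch below implements a one-level Haar wavelet decomposition in plain NumPy and derives subband-energy features per color channel. This is a hedged, minimal stand-in: the paper's actual pipeline (the full Haar feature set and FOOS classification) is far richer, and all names and values here are illustrative.

```python
import numpy as np

def haar_dwt2(channel):
    """One-level 2D Haar wavelet decomposition of a 2-D array with
    even dimensions; returns (LL, LH, HL, HH) subbands."""
    a = channel[0::2, 0::2]
    b = channel[0::2, 1::2]
    c = channel[1::2, 0::2]
    d = channel[1::2, 1::2]
    ll = (a + b + c + d) / 4.0   # approximation
    lh = (a + b - c - d) / 4.0   # horizontal detail
    hl = (a - b + c - d) / 4.0   # vertical detail
    hh = (a - b - c + d) / 4.0   # diagonal detail
    return ll, lh, hl, hh

def color_texture_features(image):
    """Energy of each detail subband, per color channel.

    image: H x W x 3 float array (e.g. a patch converted to HSV).
    Returns a 9-element vector (3 channels x 3 detail subbands)."""
    feats = []
    for ch in range(image.shape[2]):
        _, lh, hl, hh = haar_dwt2(image[:, :, ch])
        for band in (lh, hl, hh):
            feats.append(float(np.mean(band ** 2)))  # subband energy
    return np.array(feats)

# A flat patch ("sea") has near-zero detail energy; a textured
# patch ("land") does not.
flat = np.ones((8, 8, 3))
textured = np.zeros((8, 8, 3))
textured[::2, :, :] = 1.0  # horizontal stripes
print(color_texture_features(flat).sum())      # 0.0
print(color_texture_features(textured).sum())  # 0.75
```

In a land-sea setting, such per-patch feature vectors would feed a classifier; the choice of color space (RGB vs. HSV, etc.) only changes the channels handed to `color_texture_features`.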
Procedia PDF Downloads 247
5408 Integrating Islamic Finance Principles with Environmental, Social, and Governance Criteria: A Bibliometric Analysis of Global Trends and Impact Within the 2030 Agenda
Authors: Paolo Biancone, Silvana Secinaro, Davide Calandra
Abstract:
This study explores the integration of Islamic finance principles with environmental, social, and governance (ESG) criteria, focusing on the contribution of Islamic financial instruments to achieving sustainable development goals (SDGs). Through a systematic literature review (SLR) and bibliometric analysis of 66 documents from 2019 to 2024, the research addresses critical gaps by examining the alignment between Islamic finance and ESG, identifying emerging trends, and assessing operational challenges and opportunities. Findings indicate that Islamic finance, mainly through instruments such as green sukuk and Islamic microfinance, demonstrates substantial alignment with ESG objectives, anchored in its ethical principles of risk-sharing, fairness, and avoidance of harmful investments. Nevertheless, scalability and regulatory structures pose significant challenges to broader ESG adoption within Islamic finance. This study offers theoretical and practical implications, proposing that Islamic finance provides a solid framework to address sustainability shortcomings in conventional finance. Furthermore, it highlights future research directions, emphasizing the need for empirical studies on the long-term impact of Islamic financial products on sustainability outcomes and exploring the role of fintech in ESG integration.
Keywords: Islamic finance, ESG, SDGs, bibliometric analysis, Shariah-compliant investments
Procedia PDF Downloads 9
5407 Technological Innovations as a Potential Vehicle for Supply Chain Integration on Basic Metal Industries
Authors: Alie Wube Dametew, Frank Ebinger
Abstract:
This study investigated the role of technological innovation in basic metal industries and then developed a technological innovation framework for enhancing sustainable competitive advantage in those industries. Previous research indicates that technological innovation has a critical impact in promoting local industries to improve their performance and achieve sustainable competitive environments. Field observation, questionnaire and expert interview results from basic metal industries indicate that the technological capability of local industries to invent, adopt, modify, improve and use a given innovative technology is very poor. This poor technological innovation is attributed to an improper innovation and technology transfer framework, a non-collaborative operating environment between foreign and local industries, very weak national technology policies, and problems with research and innovation centers; the common weak points in basic metal industry innovation systems were investigated in this study. One conclusion of the article is that, by using the technological innovation framework developed in this study, basic metal industries can improve the innovation process, support an innovative culture for sector capabilities, and achieve sustainable competitive advantage.
Keywords: technological innovation, competitive advantage, sustainable, basic metal industry, conceptual model, sustainability, supply chain integration
Procedia PDF Downloads 245
5406 Biologic Materials: Ecological Living Network
Authors: Ina Dajci
Abstract:
Biologic Materials presents groundbreaking transdisciplinary research aimed at fostering new collaborative models across the Built Environment, Forestry, and Agriculture sectors. This initiative seeks to establish innovative paradigms for local and global material flows by developing a biocompatible, regenerative material economy. The project focuses on creating materials derived from biowaste and silvicultural practices, ensuring the preservation of endangered indigenous and vernacular techniques through the integration of emerging biosciences. By utilizing biomaterials sourced from agricultural waste and forest byproducts, the initiative incorporates fabrication methods recognized by UNESCO as ‘intangible cultural heritage of humanity,’ which are currently at risk. The structural, mechanical, and environmental properties of these materials are enhanced through advanced CAD-CAM fabrication, along with energy-efficient biochemical and bacterial processes that promote healthy indigo coloration. Furthermore, the integration of AI technologies in species selection facilitates a novel partnership model, enabling designers to collaborate effectively with forest managers and silviculture practitioners. This collaborative approach not only optimizes the use of plant-based materials but also enhances biodiversity and climate resilience in regional ecosystems. Overall, this project embodies a holistic strategy for addressing environmental challenges while revitalizing traditional practices and fostering sustainable innovation.
Keywords: material, architecture, culture, heritage, ecology, environment
Procedia PDF Downloads 10
5405 Design and Realization of Computer Network Security Perception Control System
Authors: El Miloudi Djelloul
Abstract:
Based on an analysis of applications of perception control technology in computer network security and of security protection measures, from the angles of the network physical environment and network software system environmental security, this paper provides a network security perception control solution using the Internet of Things (IoT), telecom and other perception technologies. The Security Perception Control System operates in the computer network environment, utilizing Radio Frequency Identification (RFID) from the IoT and telecom integration technology in an integrated system design. In the network physical security environment, RFID temperature, humidity and gas perception technologies are used for surveillance of environmental data; dynamic perception technology is used for the network system security environment; user-defined security parameters and security logs are used for quick data analysis; and control is extended over the I/O interface. Through development of an API and AT commands, Computer Network Security Perception Control based on the Internet and GSM/GPRS is achieved, which enables users to carry out interactive perception and control of the network security environment via the web and e-mail as well as PDA, mobile phone short message and the Internet. In system testing, through a middleware server, real-time security information data perception was achieved with a deviation of 3-5%, which proves the feasibility of the Computer Network Security Perception Control System.
Keywords: computer network, perception control system, security strategy, Radio Frequency Identification (RFID)
Procedia PDF Downloads 446
5404 Measuring the Resilience of e-Governments Using an Ontology
Authors: Onyekachi Onwudike, Russell Lock, Iain Phillips
Abstract:
The variability that exists across governments, their departments and the provisioning of services has been an area of concern in the E-Government domain. There is a need for reuse and integration across government departments, which is accompanied by varying degrees of risks and threats. There is also the need for assessment, prevention, preparation, response and recovery when dealing with these risks or threats. The ability of a government to cope with the emerging changes that occur within it is known as resilience. In order to forge ahead with concerted efforts to manage reuse- and integration-induced risks or threats to governments, the ambiguities contained within resilience must be addressed. Enhancing resilience in the E-Government domain is synonymous with reducing the risks governments face in provisioning services as well as in reusing components across departments. Therefore, it can be said that resilience is responsible for the reduction in a government’s vulnerability to changes. In this paper, we present the use of an ontology to measure the resilience of governments. This ontology is built on a well-defined construct for the taxonomy of resilience. A specific class known as ‘Resilience Requirements’ is added to the ontology; this class embraces the concept of resilience in the E-Government domain ontology. Considering that the E-Government domain is a highly complex one, made up of different departments offering different services, its reliability and resilience have become more complex and critical to understand. We present questions that can help a government assess how prepared it is in the face of risks and what steps can be taken to recover from them. These questions can be asked with the use of queries. The ontology includes a case study section that is used to explore ways in which government departments can become resilient to the different kinds of risks and threats they may face.
A collection of resilience tools and resources has been developed in our ontology to encourage governments to take steps to prepare for the emergencies and risks that they may face with the integration of departments and the reuse of components across government departments. To achieve this, the ontology has been extended with rules. We present two tools for understanding resilience in the E-Government domain as a risk analysis target, and the output of these tools when applied to resilience in the E-Government domain. We introduce the classification of resilience using the defined taxonomy and the modelling of existing relationships based on that taxonomy. The ontology is constructed on formal theory, and it provides a semantic reference framework for the concept of resilience. Key terms which fall under the purview of resilience with respect to E-Governments are defined, terms are made explicit, and the relationships that exist between risks and resilience are made explicit. The overall aim of the ontology is to use it within standards that would be followed by all governments for government-based resilience measures.
Keywords: E-Government, ontology, relationships, resilience, risks, threats
Procedia PDF Downloads 337
5403 Image Retrieval Based on Multi-Feature Fusion for Heterogeneous Image Databases
Authors: N. W. U. D. Chathurani, Shlomo Geva, Vinod Chandran, Proboda Rajapaksha
Abstract:
Selecting an appropriate image representation is the most important factor in implementing an effective Content-Based Image Retrieval (CBIR) system. This paper presents a multi-feature fusion approach for efficient CBIR based on the distance distribution of features and relative feature weights at the time of query processing. It is a simple yet effective approach, which is free from the effects of the features' dimensions, ranges, internal feature normalization and the distance measure. This approach can easily be adopted with any feature combination to improve retrieval quality. The proposed approach is empirically evaluated using two benchmark datasets for image classification (a subset of the Corel dataset, and Oliva and Torralba) and compared with existing approaches. Its performance is confirmed by the significant improvement over the independently evaluated baseline of previously proposed feature fusion approaches.
Keywords: feature fusion, image retrieval, membership function, normalization
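The abstract's idea of fusing features through their query-time distance distributions can be sketched minimally: normalize each feature's query-to-database distance distribution before summing, so that dimension and range differences cancel. The exact normalization and weighting scheme of the paper may differ; this NumPy illustration, with hypothetical toy data, only shows the general mechanism.

```python
import numpy as np

def fused_ranking(query_feats, db_feats_list, weights=None):
    """Rank database images by fusing per-feature distances.

    query_feats: list of 1-D arrays, one per feature type.
    db_feats_list: list of 2-D arrays (n_images x dim), aligned with
    query_feats. Each feature's distance distribution is standardized
    before fusion, so dimension and range differences cancel out."""
    n = db_feats_list[0].shape[0]
    fused = np.zeros(n)
    if weights is None:
        weights = [1.0] * len(query_feats)
    for q, db, w in zip(query_feats, db_feats_list, weights):
        d = np.linalg.norm(db - q, axis=1)                 # raw distances
        mu, sigma = d.mean(), d.std()
        z = (d - mu) / sigma if sigma > 0 else d - mu      # normalize distribution
        fused += w * z
    return np.argsort(fused)  # best matches first

# Two toy feature spaces with very different ranges: fusion still
# ranks the database item nearest in both spaces first.
query = [np.array([0.0, 0.0]), np.array([100.0])]
db = [np.array([[0.1, 0.0], [5.0, 5.0], [9.0, 1.0]]),
      np.array([[101.0], [500.0], [900.0]])]
print(fused_ranking(query, db))  # item 0 ranked first
```

Because each feature's distances are standardized per query, no offline normalization of the feature vectors themselves is needed, which is the property the abstract highlights.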
Procedia PDF Downloads 345
5402 Optimized Weight Selection of Control Data Based on Quotient Space of Multi-Geometric Features
Authors: Bo Wang
Abstract:
The geometric processing of multi-source remote sensing data using control data of different scales and accuracies is an important research direction for multi-platform earth observation systems. In existing block bundle adjustment methods, the approach of using a single observation scale and precision for the controlling information in the adjustment system is unable to screen the control information or to assign it reasonable and effective weights, which reduces the convergence and the reliability of the adjustment results. Drawing on the theory and technology of quotient space, this project researches several subjects. A multi-layer quotient space of multi-geometric features is constructed to describe and filter control data. A normalized granularity merging mechanism for multi-layer control information is studied, and, based on the normalized scale factor, a strategy is realized for optimizing the weight selection of control data that is less relevant to the adjustment system. At the same time, geometric positioning experiments are conducted using multi-source remote sensing data, aerial images, and multiple classes of control data to verify the theoretical results. This research is expected to break away from the convention of single-scale, single-accuracy control data in the adjustment process and to expand the theory and technology of photogrammetry, so that the problem of processing multi-source remote sensing data can be solved both theoretically and practically.
Keywords: multi-source image geometric processing, high-precision geometric positioning, quotient space of multi-geometric features, optimized weight selection
Procedia PDF Downloads 284
5401 Features of Normative and Pathological Realizations of Sibilant Sounds for Computer-Aided Pronunciation Evaluation in Children
Authors: Zuzanna Miodonska, Michal Krecichwost, Pawel Badura
Abstract:
Sigmatism (lisping) is a speech disorder in which sibilant consonants are mispronounced. The diagnosis of this phenomenon is usually based on auditory assessment. However, progress in speech analysis techniques creates the possibility of developing computer-aided sigmatism diagnosis tools. The aim of the study is to statistically verify whether specific acoustic features of sibilant sounds may be related to pronunciation correctness. Such knowledge can be of great importance when implementing classifiers and designing novel tools for automatic evaluation of sibilant pronunciation. The study covers the analysis of various speech signal measures, including features proposed in the literature for describing normative sibilant realization. Amplitudes and frequencies of three fricative formants (FF) are extracted based on local spectral maxima of the friction noise. Skewness, kurtosis, four normalized spectral moments (SM) and 13 mel-frequency cepstral coefficients (MFCC) with their 1st and 2nd derivatives (13 Delta and 13 Delta-Delta MFCC) are included in the analysis as well. The resulting feature vector contains 51 measures. The experiments are performed on a speech corpus containing words with the selected sibilant sounds (/ʃ, ʒ/) pronounced by 60 preschool children with proper pronunciation or with natural pathologies. In total, 224 /ʃ/ segments and 191 /ʒ/ segments are employed in the study. The Mann-Whitney U test is employed for the comparison of sigmatism and normative pronunciation. Statistically significant differences between children in these two groups are obtained in most of the proposed features at p < 0.05. All spectral moments and fricative formants appear to be distinctive between pathological and proper pronunciation. These metrics describe the friction noise characteristic of sibilants, which makes them particularly promising for use in sibilant evaluation tools.
The correspondences found between phoneme feature values and an expert evaluation of pronunciation correctness encourage the involvement of speech analysis tools in the diagnosis and therapy of sigmatism. The proposed feature extraction methods could be used in computer-assisted sigmatism diagnosis or therapy systems.
Keywords: computer-aided pronunciation evaluation, sigmatism diagnosis, speech signal analysis, statistical verification
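The normalized spectral moments mentioned in the abstract can be sketched as moments of the power spectrum treated as a probability distribution. The exact normalization used in the study may differ; the code below is a plain-NumPy illustration with synthetic signals standing in for friction noise.

```python
import numpy as np

def spectral_moments(signal, fs):
    """First four normalized spectral moments of a signal's power
    spectrum: centroid, spread, skewness and kurtosis of the spectral
    distribution (features of the kind used to separate normative and
    pathological sibilants)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    p = spectrum / spectrum.sum()           # treat spectrum as a distribution
    centroid = np.sum(freqs * p)            # 1st moment
    spread = np.sqrt(np.sum((freqs - centroid) ** 2 * p))  # 2nd moment
    skew = np.sum(((freqs - centroid) / spread) ** 3 * p)  # 3rd moment
    kurt = np.sum(((freqs - centroid) / spread) ** 4 * p)  # 4th moment
    return centroid, spread, skew, kurt

# Higher-frequency "friction noise" shifts the spectral centroid upward.
fs = 16000
t = np.arange(0, 0.05, 1.0 / fs)
low = np.sin(2 * np.pi * 2000 * t)
high = np.sin(2 * np.pi * 6000 * t)
print(spectral_moments(low, fs)[0] < spectral_moments(high, fs)[0])  # True
```

In a real pipeline these moments would be computed on windowed /ʃ/ or /ʒ/ segments and fed, together with the formant and MFCC features, to a statistical test or classifier.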
Procedia PDF Downloads 301
5400 Automatic Classification of the Stand-to-Sit Phase in the TUG Test Using Machine Learning
Authors: Yasmine Abu Adla, Racha Soubra, Milana Kasab, Mohamad O. Diab, Aly Chkeir
Abstract:
Over the past several years, researchers have shown great interest in assessing the mobility of elderly people in order to measure their functional status. Usually, such an assessment is done by conducting tests that require the subject to walk a certain distance, turn around, and finally sit back down. Consequently, this study aims to provide an at-home monitoring system to assess the patient’s status continuously. Thus, we proposed a technique to automatically detect when a subject sits down while walking at home. In this study, we utilized a Doppler radar system to capture the motion of the subjects. More than 20 features were extracted from the radar signals, out of which 11 were chosen based on their intraclass correlation coefficient (ICC > 0.75). Then, the sequential floating forward selection wrapper was applied to further narrow down the final feature vector. Finally, 5 features were introduced to the linear discriminant analysis classifier, and an accuracy of 93.75% was achieved, as well as a precision and recall of 95% and 90%, respectively.
Keywords: Doppler radar system, stand-to-sit phase, TUG test, machine learning, classification
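The ICC-based screening step described above (keeping features with ICC > 0.75) can be sketched with a one-way random, single-measure ICC(1,1); the study may have used a different ICC form, and the data below are synthetic, so this is only a hedged illustration of the screening idea.

```python
import numpy as np

def icc_1_1(ratings):
    """One-way random, single-measure ICC(1,1) for an
    n_subjects x k_measurements matrix: the share of total variance
    attributable to between-subject differences."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    msb = k * np.sum((row_means - grand) ** 2) / (n - 1)               # between subjects
    msw = np.sum((ratings - row_means[:, None]) ** 2) / (n * (k - 1))  # within subjects
    return (msb - msw) / (msb + (k - 1) * msw)

def screen_features(feature_tables, threshold=0.75):
    """Keep the indices of features whose repeated measurements are
    reliable, i.e. whose ICC exceeds the threshold."""
    return [i for i, tbl in enumerate(feature_tables)
            if icc_1_1(tbl) > threshold]

rng = np.random.default_rng(0)
subjects = rng.normal(0, 5, size=(30, 1))
reliable = subjects + rng.normal(0, 0.5, size=(30, 3))  # repeats agree
noisy = rng.normal(0, 5, size=(30, 3))                  # repeats do not
print(screen_features([reliable, noisy]))  # [0]
```

The surviving features would then go through wrapper selection (e.g. sequential floating forward selection) before training the final classifier.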
Procedia PDF Downloads 161
5399 Developed CNN Model with Various Input Scale Data Evaluation for Bearing Faults Prognostics
Authors: Anas H. Aljemely, Jianping Xuan
Abstract:
Rolling bearing fault diagnosis is a pivotal issue for the rotating machinery of modern manufacturing. In this research, a raw-vibration-signal and improved deep learning method for bearing fault diagnosis is proposed. Multiple scales of the raw vibration signals are selected to evaluate the condition monitoring system, and the deep learning process has shown its effectiveness in fault diagnosis. The proposed method employs an exponential linear unit (ELU) layer in a convolutional neural network (CNN); the ELU applies the identity function to positive inputs and an exponential nonlinearity to negative inputs, and a particular convolutional operation extracts valuable features. The identification results show that the improved method achieves the highest accuracy with a 100-dimensional scale and increases the training and testing speed.
Keywords: bearing fault prognostics, developed CNN model, multiple-scale evaluation, deep learning features
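The ELU nonlinearity described in the abstract is simple to state explicitly. The sketch below defines it in plain NumPy, together with a toy one-dimensional convolution-plus-ELU step of the kind one CNN layer applies to a raw vibration signal; the paper's actual network architecture is not reproduced here, and the kernel and signal are illustrative.

```python
import numpy as np

def elu(x, alpha=1.0):
    """Exponential linear unit: identity for positive inputs,
    alpha * (exp(x) - 1) for negative inputs."""
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def conv1d_elu(signal, kernel):
    """One valid-mode 1-D cross-correlation followed by ELU: the
    building block of a single CNN layer on a raw vibration signal."""
    out = np.convolve(signal, kernel[::-1], mode="valid")  # flip -> cross-correlation
    return elu(out)

x = np.array([-2.0, 0.0, 3.0])
print(elu(x))  # identity above zero, exp(x) - 1 below zero
print(conv1d_elu(np.array([1.0, 2.0, 3.0]), np.array([1.0, 0.0])))
```

Compared with ReLU, the negative branch keeps gradients alive for negative pre-activations, which is the property the abstract credits for the improved training behavior.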
Procedia PDF Downloads 210
5398 Calibration of Residential Buildings Energy Simulations Using Real Data from an Extensive in situ Sensor Network – A Study of Energy Performance Gap
Authors: Mathieu Bourdeau, Philippe Basset, Julien Waeytens, Elyes Nefzaoui
Abstract:
As residential buildings account for a third of the overall energy consumption and greenhouse gas emissions in Europe, building energy modeling is an essential tool for reaching energy efficiency goals. In the energy modeling process, calibration is a mandatory step to obtain accurate and reliable energy simulations. Nevertheless, the comparison between simulation results and the actual energy behavior of a building often highlights a significant performance gap. The literature discusses different origins of energy performance gaps, from building design to building operation. The description of building operation in energy models, especially energy usages and users’ behavior, therefore plays an important role in the reliability of simulations, but it is also the most accessible target for post-occupancy energy management and optimization. The present study thus aims to discuss results on the calibration of residential building energy models using real operation data. Data are collected through a network of more than 180 sensors and advanced energy meters deployed in three collective residential buildings undergoing major retrofit actions. The sensor network is implemented at building scale and in an eight-apartment sample. Data are collected for over a year and a half and cover building energy behavior (thermal and electricity), indoor environment, inhabitants’ comfort, occupancy, occupants’ behavior and energy uses, and local weather. Building energy simulations are performed using a physics-based building energy modeling software (the Pleiades software), where the buildings’ features are implemented according to the buildings’ thermal regulation code compliance study and the retrofit project technical files. Sensitivity analyses are performed to highlight the most energy-driving building features for each end use. These features are then compared with the collected post-occupancy data.
Energy-driving features are progressively replaced with field data for a step-by-step calibration of the energy model. The results of this study provide an analysis of the energy performance gap in an existing residential case study under deep retrofit actions. They highlight the impact of the different building features on the energy behavior and the performance gap in this context, such as temperature setpoints, indoor occupancy and the building envelope properties, but also domestic hot water usage and heat gains from electric appliances. The benefits of inputting field data from an extensive instrumentation campaign instead of standardized scenarios are also described. Finally, the exhaustive instrumentation solution provides useful insights into the needs, advantages, and shortcomings of the implemented sensor network, for its replicability on a larger scale and for different use cases.
Keywords: calibration, building energy modeling, performance gap, sensor network
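The step-by-step replacement of standardized inputs with field data can be sketched with a deliberately simple steady-state heating model; every number below is hypothetical and the real Pleiades model is far richer, so this only illustrates how the performance gap is tracked as each assumption is replaced.

```python
import numpy as np

def simulate_annual_heating(u_value, area, setpoint, t_out_mean, hours=8760):
    """A toy steady-state heating model (kWh): losses proportional to
    the envelope conductance and the indoor-outdoor temperature gap."""
    return u_value * area * max(setpoint - t_out_mean, 0.0) * hours / 1000.0

measured_kwh = 34800.0  # hypothetical metered annual consumption

# Step-by-step calibration: start from standardized assumptions and
# replace one input at a time with field data, tracking the gap.
steps = [
    ("standard scenario",      dict(u_value=0.8, area=450.0, setpoint=19.0, t_out_mean=11.0)),
    ("measured setpoint",      dict(u_value=0.8, area=450.0, setpoint=21.5, t_out_mean=11.0)),
    ("measured local weather", dict(u_value=0.8, area=450.0, setpoint=21.5, t_out_mean=10.4)),
]

for label, params in steps:
    sim = simulate_annual_heating(**params)
    gap = 100.0 * (sim - measured_kwh) / measured_kwh
    print(f"{label}: {sim:.0f} kWh, gap {gap:+.1f}%")
# The gap magnitude shrinks toward zero as field data replace assumptions.
```

The same bookkeeping, applied per end use and per building feature, is what lets a calibration study attribute the performance gap to setpoints, occupancy, envelope properties and so on.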
Procedia PDF Downloads 159
5397 Overview of Multi-Chip Alternatives for 2.5 and 3D Integrated Circuit Packagings
Authors: Ching-Feng Chen, Ching-Chih Tsai
Abstract:
With the size of the transistor gradually approaching the physical limit, the persistence of Moore’s Law is challenged by the difficulty of developing high numerical aperture (high-NA) lithography equipment and by other issues such as short-channel effects. In the context of the ever-increasing technical requirements of portable devices and high-performance computing, relying on the law’s continuation to enhance chip density will no longer support the prospects of the electronics industry. Weighing a chip’s power consumption-performance-area-cost-cycle time to market (PPACC) is an updated benchmark driving the evolution of advanced wafer nanometer (nm) nodes. The advent of two-and-a-half- and three-dimensional (2.5D and 3D) Very-Large-Scale Integration (VLSI) packaging based on Through-Silicon Via (TSV) technology has updated traditional die assembly methods and provided a solution. This overview investigates up-to-date and cutting-edge packaging technologies for 2.5D and 3D integrated circuits (ICs) based on updated transistor structures and technology nodes. The author concludes that multi-chip solutions for 2.5D and 3D IC packaging are feasible for prolonging Moore’s Law.
Keywords: Moore’s law, high numerical aperture, power consumption-performance-area-cost-cycle time to market, 2.5 and 3D very-large-scale integration, packaging, through-silicon via
Procedia PDF Downloads 114
5396 Response of First Bachelor of Medicine, Bachelor of Surgery (MBBS) Students to Integrated Learning Program
Authors: Raveendranath Veeramani, Parkash Chand, H. Y. Suma, A. Umamageswari
Abstract:
Background and Aims: The aim of this study was to evaluate students’ perception of an Integrated Learning Program (ILP). Settings and Design: A questionnaire was used to survey and evaluate the perceptions of first-year MBBS students at the Department of Anatomy at our medical college in India. Materials and Methods: The first-year MBBS students of Anatomy were involved in an ILP on the liver and extrahepatic biliary apparatus, integrating the Departments of Anatomy, Biochemistry and Hepato-biliary Surgery. The ILP was evaluated with two sets of short questionnaires that had ten items each, using the Likert five-point grading scale. The data involved both the students’ responses and their grading. Results: A majority of students felt that the ILP was better than the traditional lecture method of teaching. Compared with traditional teaching methods, the integrated teaching method was better at fulfilling learning objectives (128 students, 83%), enabled better understanding (94% of students), was more interesting (140 students, 90%), ensured that they could score better in exams (115 students, 77%) and involved greater interaction (100 students, 66%). Most of the students (142 students, 95%) opined that more such sessions should be organized in the future. Conclusions: Responses from students show that integrated learning sessions should be incorporated even in the first phase of MBBS for selected topics, so as to create interest in the medical sciences at entry level and to make students understand the importance of basic science.
Keywords: integrated learning, students response, vertical integration, horizontal integration
Procedia PDF Downloads 201
5395 Family Planning and HIV Integration: A One-stop Shop Model at Spilhaus Clinic, Harare Zimbabwe
Authors: Mercy Marimirofa, Farai Machinga, Alfred Zvoushe, Tsitsidzaishe Musvosvi
Abstract:
The Government of Zimbabwe embarked on integrating family planning (FP) with sexually transmitted infection (STI) and human immunodeficiency virus (HIV) services in May 2020 with support from the World Health Organization (WHO). There were high HIV prevalence and incidence rates and many STI infections among women attending FP clinics. Spilhaus is a specialized centre-of-excellence clinic which offers a range of sexual reproductive health (SRH) services. HIV services were limited to testing only, and clients were referred to other facilities for further management. Integration of services requires that all services be available at one point so that clients can access them during a single visit to the facility. Objectives: The study was conducted to assess the impact the one-stop-shop model has made on access to integrated family planning and sexual reproductive health services, compared to the supermarket approach. It also assessed the relationship between family planning services and other sexual reproductive health services. Methods: A secondary data analysis was conducted at Spilhaus clinic in Harare using family planning registers and HIV services registers, comparing the years 2019 and 2021. A two-sample t-test was used to determine the difference in clients accessing the services under the two models. Spearman’s rank correlation was used to determine whether accessing family planning services has a relationship with other sexual reproductive health services. Results: In 2019, 7,548 clients visited the Spilhaus clinic, compared to 8,265 during the period January to December 2021. The median age of all clients accessing services was 32 years. An increase of 69% in the number of services accessed was recorded from 2019 to 2021, with more services accessed in 2021. There was no difference in the number of clients accessing family planning, cervical cancer and HIV services. A difference was found in the number of clients who were offered STI screening services.
There was also a relationship between accessing family planning services and STI screening services (ρ = 0.729, p-value = 0.006). Conclusion: Programming towards SRH services was a great achievement; the use of an integrated approach proved to be cost-effective, as it minimised the resources required for separate programs, and clients accessed important health services at once. The integration of these services provided an opportunity to offer comprehensive information addressing an individual’s sexual reproductive health needs.
Keywords: integration, one-stop shop, family planning, reproductive health
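A Spearman rank correlation of the kind reported above can be sketched in a few lines with SciPy. The monthly client counts below are hypothetical placeholders for illustration, not the clinic’s registers:

```python
# Hypothetical monthly counts of clients accessing family planning (FP)
# and STI screening services; illustrative data only, not study records.
from scipy.stats import spearmanr

fp_clients  = [210, 250, 190, 300, 280, 320, 260, 310, 240, 330, 270, 350]
sti_clients = [110, 140, 100, 170, 150, 180, 130, 175, 120, 190, 145, 200]

# Spearman correlates the *ranks* of the two series, so it captures any
# monotonic relationship, not just a linear one.
rho, p_value = spearmanr(fp_clients, sti_clients)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")
```

A ρ close to 1 with a small p-value, as in the study, indicates that months with more FP clients also tend to see more STI screenings.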
Procedia PDF Downloads 68
5394 Analysis of Different Resins in Web-to-Flange Joints
Authors: W. F. Ribeiro, J. L. N. Góes
Abstract:
The industrial process gives engineered wood products features absent in solid wood: a homogeneous structure, fewer defects, improved physical and mechanical properties, resistance to bio-deterioration and better dimensional stability, improving quality and increasing the reliability of wood structures. These features, combined with the use of fast-growing trees, make them environmentally sound products with a strong consumer market. Wood I-joists are manufactured by bonding industrial flange and web profiles; an important aspect of the production of wooden I-beams is the adhesive joint that bonds the web to the flange. Adhesives can effectively transfer and distribute stresses, thereby increasing the strength and stiffness of the composite. The objective of this study is to evaluate different resins in shear test specimens, with the aim of identifying the most efficient resin and the possibility of using national products, reducing the manufacturing cost. First, a literature review was conducted to establish the geometry and materials generally used; then eight national resins were selected and analyzed, and six specimens were produced for each.
Keywords: engineered wood products, structural resin, wood I-joist, Pinus taeda
Procedia PDF Downloads 278
5393 YOLO-IR: Infrared Small Object Detection in High Noise Images
Authors: Yufeng Li, Yinan Ma, Jing Wu, Chengnian Long
Abstract:
Infrared object detection aims at separating small, dim targets from a cluttered background, and its capabilities extend beyond the limits of visible light, making it invaluable in a wide range of applications for improving safety, security, efficiency, and functionality. However, existing methods are usually sensitive to noise in the input infrared image, leading to a decrease in target detection accuracy and an increase in the false alarm rate in high-noise environments. To address this issue, an infrared small target detection algorithm called YOLO-IR is proposed in this paper to improve robustness to high infrared noise. Because high noise significantly reduces the clarity and reliability of target features in infrared images, we design a soft-threshold coordinate attention mechanism to improve the model’s ability to extract target features and its robustness to noise. Since the noise may overwhelm the local details of the target, resulting in the loss of small-target features during depth down-sampling, we propose a deep and shallow feature fusion neck to improve detection accuracy. In addition, because generalized Intersection over Union (IoU)-based loss functions may be sensitive to noise and lead to unstable training in high-noise environments, we introduce a Wasserstein-distance-based loss function to improve the training of the model. The experimental results show that YOLO-IR achieves a 5.0% improvement in recall and a 6.6% improvement in F1-score over existing state-of-the-art models.
Keywords: infrared small target detection, high noise, robustness, soft-threshold coordinate attention, feature fusion
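One common way to build a Wasserstein-distance-based box loss, sketched below, is to model each bounding box as a 2-D Gaussian, for which the squared 2-Wasserstein distance has a closed form. This is an illustrative formulation of the general idea, not necessarily the exact loss used in YOLO-IR, and the scale constant `c` is an assumption:

```python
import math

def wasserstein_bbox_loss(box_a, box_b, c=12.8):
    """Sketch of a Wasserstein-distance-based box loss.

    Boxes are (cx, cy, w, h). Each box is modelled as a 2-D Gaussian
    centred at (cx, cy) with diagonal covariance (w/2, h/2); the squared
    2-Wasserstein distance between two such Gaussians reduces to the sum
    below. `c` is a dataset-dependent normalisation constant (an
    assumption here, not a value from the paper).
    """
    cx1, cy1, w1, h1 = box_a
    cx2, cy2, w2, h2 = box_b
    w2_sq = (cx1 - cx2) ** 2 + (cy1 - cy2) ** 2 \
          + ((w1 - w2) / 2) ** 2 + ((h1 - h2) / 2) ** 2
    nwd = math.exp(-math.sqrt(w2_sq) / c)   # similarity in (0, 1]
    return 1.0 - nwd                        # loss: 0 for identical boxes

print(wasserstein_bbox_loss((10, 10, 4, 4), (10, 10, 4, 4)))  # 0.0
print(wasserstein_bbox_loss((10, 10, 4, 4), (12, 10, 4, 4)))
```

Unlike IoU-based losses, this distance stays smooth and informative even when a tiny predicted box does not overlap the target at all, which is the motivation for using it with small, noisy targets.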
Procedia PDF Downloads 73
5392 Design of an Air and Land Multi-Element Expression Pattern of Navigation Electronic Map for Ground Vehicles under United Navigation Mechanism
Authors: Rui Liu, Pengyu Cui, Nan Jiang
Abstract:
At present, there is much research on the centralized management and cross-integration of basic geographic information. However, the idea of information integration and sharing between land, sea, and air navigation targets has not been deeply applied in research on navigation information services, especially in information expression. To address this problem, this paper studies the expression pattern of navigation electronic maps for ground vehicles under an air and land united navigation mechanism. First, with the support of multi-source information fusion of GIS vector data, RS data, GPS data, etc., an air and land united information expression pattern is designed for the specific navigation task of emergency rescue after an earthquake. Then, the characteristics and specifications of the united expression of air and land navigation information under the constraints of map load are summarized and transferred into expression rules in the rule bank. Finally, a navigation experiment is implemented to evaluate the effect of the expression pattern. The experiment selects navigation task completion time and navigation error rate as the main evaluation indices and makes comparisons with the traditional single-information expression pattern. In summary, this research improves the theory of navigation electronic maps and lays a foundation for the design and realization of a united navigation system in terms of real-time navigation information delivery.
Keywords: navigation electronic map, united navigation, multi-element expression pattern, multi-source information fusion
Procedia PDF Downloads 199
5391 Foreseen the Future: Human Factors Integration in European Horizon Projects
Authors: José Manuel Palma, Paula Pereira, Margarida Tomás
Abstract:
The development of new technologies such as artificial intelligence, smart sensing, robotics, cobotics and intelligent machinery must integrate human factors to address the need to optimize systems and processes, thereby contributing to the creation of a safe and accident-free work environment. Human Factors Integration (HFI) consistently poses a challenge for organizations when applied to daily operations. The AGILEHAND and FORTIS projects are grounded in the development of cutting-edge technology for Industry 4.0 and 5.0. AGILEHAND aims to create advanced technologies to autonomously sort, handle, and package soft and deformable products, whereas FORTIS focuses on developing a comprehensive Human-Robot Interaction (HRI) solution. Both projects employ different approaches to explore HFI. AGILEHAND is mainly empirical, involving a comparison between current and future work conditions, coupled with an understanding of best practices and the enhancement of safety aspects, primarily through management. FORTIS applies HFI throughout the project, developing a human-centric approach that includes understanding human behavior, perceiving activities, and facilitating contextual human-robot information exchange. Its intervention is holistic, merging technology with the physical and social contexts, based on a total safety culture model. In AGILEHAND, we will identify emergent safety risks and challenges, their causes, and how to overcome them by resorting to interviews, questionnaires, literature review and case studies. Findings and results will be presented in the “Strategies for Workers’ Skills Development, Health and Safety, Communication and Engagement” handbook.
The FORTIS project will implement continuous monitoring and guidance of activities, with a critical focus on early detection and elimination (or mitigation) of risks associated with the new technology, as well as guidance on complying with European Union safety and privacy regulations, ensuring HFI and thereby contributing to an optimized, safe work environment. To achieve this, we will embed safety by design, apply questionnaires, perform site visits, provide risk assessments, and closely track progress while suggesting and recommending best practices. The outcomes of these measures will be compiled in the project deliverable titled “Human Safety and Privacy Measures”. These projects received funding from the European Union’s Horizon 2020/Horizon Europe research and innovation program under grant agreements No 101092043 (AGILEHAND) and No 101135707 (FORTIS).
Keywords: human factors integration, automation, digitalization, human robot interaction, industry 4.0 and 5.0
Procedia PDF Downloads 64
5390 Kocuria Keratitis: A Rare and Diagnostically Challenging Infection of the Cornea
Authors: Sarah Jacqueline Saram, Diya Baker, Jaishree Gandhewar
Abstract:
Named after the Slovakian microbiologist Miroslav Kocur, the Kocuria spp. are an emerging cause of significant human infections. Their predilection for immunocompromised states, such as malignancy and metabolic disorders, is highlighted in the literature. These coagulase-negative, gram-positive cocci are commensals found in the skin and oropharynx of humans, and their growing role as causative organisms in ocular infections cannot be ignored. The severe, rapid, and unrelenting disease course associated with Kocuria keratitis is underlined in the literature; however, the clinical features are variable, which may impede diagnosis. Here, we describe a first account of an initial misdiagnosis due to reliance on subjective analysis of features on confocal microscopy, which ultimately led to a delay in commencing the correct treatment. In documenting this, we hope to underline to clinicians the difficulty of recognising Kocuria rhizophila keratitis, given that its clinical presentation is similar to that of Acanthamoeba keratitis, thus emphasizing the need for early investigations such as corneal scrapes to secure the correct diagnosis and prevent further harm and vision loss for the patient.
Keywords: keratitis, cornea, infection, rare, Kocuria
Procedia PDF Downloads 54
5389 An Application to Predict the Best Study Path for Information Technology Students in Learning Institutes
Authors: L. S. Chathurika
Abstract:
Early prediction of student performance is an important factor in attaining academic excellence. Whatever the study stream in secondary education, students lay the foundation for higher studies during the first year of their degree or diploma program in Sri Lanka. The information technology (IT) field offers certain improvements in the education domain by letting students select specialization areas that showcase their talents and skills. These specializations can be software engineering, network administration, database administration, multimedia design, etc. After completing the first year, students attempt to select the best path by considering numerous factors. The purpose of this experiment is to predict the best study path using machine learning algorithms. Five classification algorithms were selected and tested: decision tree, support vector machine, artificial neural network, Naïve Bayes, and logistic regression. The support vector machine obtained the highest accuracy, 82.4%. The most influential features were then recognized to select the best study path.
Keywords: algorithm, classification, evaluation, features, testing, training
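The five-classifier comparison described above can be sketched with scikit-learn. The synthetic data below stands in for the study’s student records, which are not public, so the resulting accuracies are illustrative only:

```python
# Compare the five classifiers named in the abstract on synthetic data
# with four classes, standing in for four specialization paths.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=400, n_features=10, n_informative=6,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "support vector machine": SVC(),
    "artificial neural network": MLPClassifier(max_iter=2000, random_state=0),
    "naive Bayes": GaussianNB(),
    "logistic regression": LogisticRegression(max_iter=1000),
}
# Fit each model and score it on the held-out test split.
scores = {name: model.fit(X_tr, y_tr).score(X_te, y_te)
          for name, model in models.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

On the study’s real data this comparison selected the SVM at 82.4% accuracy; on other data a different model may win, which is exactly why all five are scored.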
Procedia PDF Downloads 119
5388 Examining How Teachers’ Backgrounds and Perceptions for Technology Use Influence on Students’ Achievements
Authors: Zhidong Zhang, Amanda Resendez
Abstract:
This study examines how teachers’ perspectives on educational technology use in their classes influence their students’ achievement. The authors hypothesized that teachers’ perspectives can directly or indirectly influence students’ learning, performance, and achievement. In this study, a questionnaire entitled Teacher’s Perspective on Educational Technology was delivered to 63 teachers, and the mathematics and reading achievement records of 1,268 students were collected. The questionnaire consists of four parts: a) demographic variables, b) attitudes on technology integration, c) outside factors affecting technology integration, and d) technology use in the classroom. Kruskal-Wallis tests and hierarchical regression analysis were used to examine: 1) the relationship between the demographic variables and teachers’ perspectives on educational technology, and 2) how the demographic variables were causally related to students’ mathematics and reading achievements. The study found that teacher demographics were significantly related to teachers’ perspectives on educational technology, with p < 0.05 and p < 0.01 respectively. These demographic variables included the school district, age, gender, the grade currently taught, teaching experience, and proficiency in using new technology. Further, these variables significantly predicted students’ mathematics and reading achievements, with p < 0.05 and p < 0.01 respectively. The R² values ranged from 0.176 to 0.467; that is, up to 46.7% of the variance in a given analysis can be explained by the model.
Keywords: teacher's perception of technology use, mathematics achievement, reading achievement, Kruskal-Wallis test, hierarchical regression analysis
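The two analysis techniques named above can be sketched as follows. The grouped perception scores and the regression data are synthetic stand-ins for the survey responses and achievement records; hierarchical regression is shown as the R² gain from adding a teacher variable on top of a baseline predictor:

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(0)

# Kruskal-Wallis: hypothetical perception scores for three teacher groups.
young  = rng.normal(3.0, 0.5, 20)
middle = rng.normal(3.5, 0.5, 20)
senior = rng.normal(4.0, 0.5, 20)
H, p = kruskal(young, middle, senior)
print(f"Kruskal-Wallis H = {H:.2f}, p = {p:.4f}")

# Hierarchical regression: R-squared before and after adding a predictor.
def r_squared(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])      # add intercept
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

n = 200
baseline = rng.normal(size=n)    # e.g. prior achievement (assumed variable)
teacher = rng.normal(size=n)     # e.g. proficiency with new technology
y = 0.4 * baseline + 0.5 * teacher + rng.normal(scale=0.6, size=n)

r2_step1 = r_squared(baseline.reshape(-1, 1), y)
r2_step2 = r_squared(np.column_stack([baseline, teacher]), y)
print(f"R2 step 1 = {r2_step1:.3f}, step 2 = {r2_step2:.3f}")
```

The increase from step 1 to step 2 is the variance uniquely attributable to the added teacher variable, which is how the 0.176–0.467 range above should be read.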
Procedia PDF Downloads 131
5387 Electrical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes. NILM is a technique for identifying individual appliances based on analysis of the whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, event detection and feature extraction, with general appliance modeling and identification at the final stage. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on power demand, and then detecting the times at which each selected appliance changes state. In order to fit the capabilities of existing smart meters, we work on low-frequency data sampled at 1/60 Hz. The data is simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical software package that simulates the behaviour of people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of specific features used for general appliance modeling.
In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract a power interval within which the selected appliance operates, along with a time vector of the values delimiting the state transitions of the appliance. After this, appliance signatures are formed from the extracted power, geometrical and statistical features. Afterwards, those signatures are used to tune general model types for appliance identification using unsupervised algorithms. This method is evaluated using both data simulated with LPG and the real Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics, considering accuracy, precision, recall and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
Keywords: electrical disaggregation, DTW, general appliance modeling, event detection
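The DTW matching at the heart of the identification step can be sketched in pure Python. The appliance model and observed trace below are illustrative 1-minute power samples, not LPG or REDD data:

```python
# Minimal dynamic time warping distance, of the kind used to match an
# observed power draw against a general appliance model; sketch only.
def dtw(a, b):
    n, m = len(a), len(b)
    inf = float("inf")
    # D[i][j] = cost of best alignment of a[:i] with b[:j]
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

fridge_model = [0, 0, 120, 120, 120, 0, 0]      # watts, 1-min samples
observed     = [0, 118, 121, 119, 120, 2, 0]    # slightly shifted cycle
print(dtw(observed, fridge_model))
```

Because DTW aligns the sequences elastically in time, the slightly shifted observed cycle still scores close to the model, which is what makes it suitable for unsupervised matching of appliance signatures.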
Procedia PDF Downloads 78
5386 Impact of Map Generalization in Spatial Analysis
Authors: Lin Li, P. G. R. N. I. Pussella
Abstract:
When representing spatial data and their attributes on different types of maps, the scale plays a key role in the process of map generalization. The process consists of two main operators: selection and omission. Once data are selected, they undergo several geometric transformations such as elimination, simplification, smoothing, exaggeration, displacement, aggregation and size reduction. As a result of these operations at different levels of data, the geometry of spatial features such as length, sinuosity, orientation, perimeter and area is altered. This is worst in the preparation of small-scale maps, since the cartographer does not have enough space to represent all the features on the map. When GIS users want to analyze a set of spatial data, they retrieve a data set and perform the analysis without considering very important characteristics such as the scale, the purpose of the map and the degree of generalization. Further, GIS users use and compare different maps with different degrees of generalization. Sometimes, GIS users go beyond the scale of the source map using the zoom-in facility, violating the basic cartographic rule that it is not suitable to create a larger-scale map from a smaller-scale map. The main objective of this study is to discuss the effect of map generalization on GIS analysis. Three digital maps at different scales (1:10000, 1:50000 and 1:250000), prepared by the Survey Department of Sri Lanka, the national mapping agency of Sri Lanka, were used. Features common to all three maps were used, and an overlay analysis was performed by repeating the data with different combinations. Road, river and land use data sets were used for the study. A simple model, to find the best place for a wildlife park, was used to identify the effects.
The results show remarkable effects for different degrees of generalization: different locations with different geometries were received as the outputs of this analysis. The study suggests that there should be reasonable methods to overcome this effect. As a solution, it is recommended to bring all the data sets to a common scale before performing the analysis.
Keywords: generalization, GIS, scales, spatial analysis
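The geometric effect discussed above can be illustrated with a minimal sketch of one common generalization operator, Douglas-Peucker line simplification. The operator choice and the toy river coordinates are assumptions for illustration; the paper does not state which operators the mapping agency applied:

```python
# Sketch: how generalization (Douglas-Peucker simplification) changes
# the geometry a GIS analysis would measure, e.g. a river's length.
import math

def perp_dist(p, a, b):
    """Perpendicular distance of point p from the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * px - dx * py + bx * ay - by * ax) / math.hypot(dx, dy)

def douglas_peucker(points, tol):
    """Keep only vertices that deviate more than tol from the chord."""
    if len(points) < 3:
        return points
    i, dmax = max(((i, perp_dist(points[i], points[0], points[-1]))
                   for i in range(1, len(points) - 1)), key=lambda t: t[1])
    if dmax <= tol:
        return [points[0], points[-1]]
    return (douglas_peucker(points[:i + 1], tol)[:-1]
            + douglas_peucker(points[i:], tol))

def length(points):
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

river = [(0, 0), (1, 1), (2, 0), (3, 1), (4, 0), (5, 1), (6, 0)]
simplified = douglas_peucker(river, tol=1.5)
print(length(river), length(simplified))  # sinuous vs generalized length
```

The simplified line is markedly shorter than the original, so any analysis that measures length or sinuosity from the generalized map inherits this distortion.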
Procedia PDF Downloads 328
5385 Sustainable Investing and Corporate Performance: Evidence from Shariah Compliant Companies in Southeast Asia
Authors: Norashikin Ismail, Nadia Anridho
Abstract:
Sustainable investing is responsible investment that focuses on environmental, social, and governance (ESG) elements. ESG integration is essential in the investment process, as it provides a positive contribution to corporate performance for stakeholders, specifically investors. Sustainable investing is in line with the objectives of Shariah (Maqasid al-Shariah), such as social inclusion as well as environmental preservation. This study attempts to evaluate the impact of ESG elements on the corporate financial performance of Shariah-compliant stocks listed in two countries, namely Malaysia and Indonesia. The motivation of this study is to provide a further understanding of corporate sustainability in two different Islamic capital markets. The existence of the FTSE4Good ASEAN Index has played a vital role in ESG practices and eventually encouraged a specific index for ESG and Shariah-compliant stocks. Our sample consists of 60 companies over the period 2010-2020 from the two Southeast Asian countries. We employ the system Generalized Method of Moments (GMM) to reduce bias and obtain more precise parameter estimates. Shariah-compliant companies tend to have higher ESG scores, which are positively correlated with corporate financial performance. ESG integration with Shariah-based investing would provide higher returns and lower risks for Muslim investors. Essentially, for Shariah-compliant companies, integrating ESG leads to better financial performance.
Keywords: Shariah compliant, Southeast Asia, corporate performance, sustainable investing
Procedia PDF Downloads 189
5384 Low-Cost Image Processing System for Evaluating Pavement Surface Distress
Authors: Keerti Kembhavi, M. R. Archana, V. Anjaneyappa
Abstract:
Most asphalt pavement condition evaluations use rating frameworks in which asphalt pavement distress is estimated by type, extent, and severity. Rating is carried out using the pavement condition rating (PCR), which is tedious and expensive. This paper presents the development of a low-cost technique for pavement distress image analysis that permits the identification of potholes and cracks, and explores the application of image processing tools for their detection. Longitudinal cracking and potholes are detected using Fuzzy C-Means (FCM) clustering followed by the spectral theory algorithm. The framework comprises three phases: image acquisition, processing, and feature extraction. A digital camera (GoPro) with a holder is used to capture pavement distress images from a moving vehicle. The FCM classifier and spectral theory algorithms are used to compute features and classify the longitudinal cracking and potholes. The MATLAB R2016a image processing toolkit is used for performance analysis to assess pavement distress on selected urban stretches of Bengaluru city, India. The outcomes of image evaluation with the semi-automated image processing framework captured the features of longitudinal cracks and potholes with an accuracy of about 80%. Further, the detected images were validated against the actual dimensions, and the dimensional variability is about 0.46. The linear regression model y = 1.171x - 0.155 was obtained from the existing and experimental (image processing) areas. The R² value of 0.807 obtained from the best-fit line is considered a ‘large positive linear association’ for the linear regression model.
Keywords: crack detection, pothole detection, spectral clustering, fuzzy c-means
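The FCM clustering step can be sketched in NumPy. The paper’s implementation is in MATLAB; this Python version on toy grey-level pixels only illustrates the idea of soft membership between a dark “distress” cluster and a bright “pavement” cluster:

```python
# Minimal fuzzy C-means: each pixel gets a membership in every cluster
# rather than a hard label; pure-NumPy sketch on toy intensities.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1
    for _ in range(iters):
        W = U ** m                              # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        # Standard FCM membership update: u_ik ∝ d_ik^(-2/(m-1))
        p = 2.0 / (m - 1.0)
        U = (d ** -p) / np.sum(d ** -p, axis=1, keepdims=True)
    return U, centers

# Grey-level pixels: dark crack pixels vs bright pavement background.
pixels = np.array([[20.], [25.], [30.], [200.], [210.], [220.]])
U, centers = fuzzy_c_means(pixels)
print(np.round(centers.ravel()))
```

The two recovered centers sit near the dark and bright intensity groups, and the soft memberships are what the subsequent spectral step can refine.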
Procedia PDF Downloads 181
5383 Detection of Cardiac Arrhythmia Using Principal Component Analysis and Xgboost Model
Authors: Sujay Kotwale, Ramasubba Reddy M.
Abstract:
The electrocardiogram (ECG) is a non-invasive technique used to study and analyze various heart diseases. Cardiac arrhythmia is a serious heart disease which leads to the death of patients when left untreated. Early detection of cardiac arrhythmia would help doctors treat the heart properly. In the past, various algorithms and machine learning (ML) models were used for early detection of cardiac arrhythmia, but few of them achieved good results. In order to improve performance, this paper combines principal component analysis (PCA) with an XGBoost model. PCA was applied to the raw ECG signals to suppress redundant information and extract significant features. The significant ECG features obtained were fed into the XGBoost model, and the performance of the model was evaluated. To validate the proposed technique, raw ECG signals obtained from the standard MIT-BIH database were employed for the analysis. The results show that the performance of the proposed method is superior to several state-of-the-art techniques.
Keywords: cardiac arrhythmia, electrocardiogram, principal component analysis, XGBoost
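The PCA-plus-boosting pipeline can be sketched as follows. Scikit-learn’s GradientBoostingClassifier stands in for XGBoost, and synthetic two-class waveforms stand in for MIT-BIH beats, so the numbers are illustrative only:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "beats": two classes of noisy 100-sample waveforms.
def make_beats(freq, n):
    t = np.linspace(0, 1, 100)
    return np.sin(2 * np.pi * freq * t) + rng.normal(scale=0.5, size=(n, 100))

X = np.vstack([make_beats(3, 200), make_beats(5, 200)])
y = np.array([0] * 200 + [1] * 200)

# PCA suppresses redundancy: project 100 raw samples onto 10 components.
pca = PCA(n_components=10).fit(X)
Z = pca.transform(X)

# Boosted trees on the compact PCA features (stand-in for XGBoost).
Z_tr, Z_te, y_tr, y_te = train_test_split(Z, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(Z_tr, y_tr)
print("accuracy:", round(clf.score(Z_te, y_te), 3))
```

The classifier sees 10 decorrelated features instead of 100 correlated raw samples, which is the dimensionality-reduction benefit the abstract attributes to PCA.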
Procedia PDF Downloads 119
5382 A Predictive Machine Learning Model of the Survival of Female-led and Co-Led Small and Medium Enterprises in the UK
Authors: Mais Khader, Xingjie Wei
Abstract:
This research sheds light on female entrepreneurs by providing new insights into the survival predictions of companies led by females in the UK. The study builds predictive machine learning models of the survival of female-led and co-led small and medium enterprises (SMEs) in the UK over the period 2000-2020. The predictive models utilise a combination of financial and non-financial features, related to both the companies and their directors, and these features are studied in terms of their contribution to the resulting model. Five machine learning models are used: decision tree, AdaBoost, Naïve Bayes, logistic regression and SVM. The AdaBoost model had the highest performance of the five, with an accuracy of 73% and an AUC of 80%. The results show that company size, management experience, financial performance, industry, region, and the percentage of females in management have high feature importance in predicting company survival.
Keywords: company survival, entrepreneurship, females, machine learning, SMEs
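The winning setup above, AdaBoost evaluated by accuracy and AUC with feature importances inspected afterwards, can be sketched on synthetic data. The company records used by the study are not public, so every feature here is a stand-in:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the financial/non-financial SME features;
# class 1 = survived, with a survival rate of roughly 70%.
X, y = make_classification(n_samples=1000, n_features=8, n_informative=5,
                           weights=[0.3, 0.7], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = AdaBoostClassifier(random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"accuracy = {acc:.2f}, AUC = {auc:.2f}")

# Which inputs drive the survival prediction, most important first.
top = np.argsort(clf.feature_importances_)[::-1][:3]
print("top feature indices:", top)
```

Reporting AUC alongside accuracy matters here because survival data is imbalanced: a model that always predicts “survived” can look accurate while being useless, and AUC exposes that.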
Procedia PDF Downloads 101