Search results for: data comparison
26170 An Analytical Metric and Process for Critical Infrastructure Architecture System Availability Determination in Distributed Computing Environments under Infrastructure Attack
Authors: Vincent Andrew Cappellano
Abstract:
In the early phases of critical infrastructure system design, translating distributed computing requirements into an architecture carries risk, given the multitude of approaches (e.g., cloud, edge, fog). In many systems, a single requirement for system uptime/availability is used to encompass the system's intended operations. However, architected systems may meet those availability requirements only during normal operations, and not during component failure or during outages caused by adversary attacks on critical infrastructure (e.g., physical, cyber). System designers lack a structured method to evaluate availability requirements against candidate system architectures through deep degradation scenarios (i.e., from normal operations all the way down to significant damage to communications or physical nodes). This increases the risk of selecting a poor candidate architecture due to the absence of insight into true performance for systems that must operate as a piece of critical infrastructure. This research effort proposes a process to analyze critical infrastructure system availability requirements and a candidate set of system architectures, producing a metric that assesses these architectures over a spectrum of degradations to aid in selecting appropriately resilient architectures. To accomplish this effort, a set of simulation and evaluation efforts is undertaken that will process, in an automated way, a set of sample requirements into a set of potential architectures in which system functions and capabilities are distributed across nodes. Nodes and links will have specific characteristics and, based on sampled requirements, contribute to the overall system functionality, such that as they are impacted/degraded, the resulting functional availability of the system can be determined.
A reinforcement-learning agent will structurally impact the nodes, links, and characteristics (e.g., bandwidth, latency) of a given architecture to provide an assessment of system functional uptime/availability under these scenarios. By varying the intensity of the attack and related aspects, we can create a structured method of evaluating the performance of candidate architectures against each other, yielding a metric rating their resilience to these attack types/strategies. Through multiple simulation iterations, sufficient data will exist to compare this availability metric, and an architectural recommendation against the baseline requirements, with existing multi-factor computing architecture selection processes. It is intended that this additional data will improve the matching of resilient critical infrastructure system requirements to the correct architectures and implementations, supporting improved operation during times of system degradation due to failures and infrastructure attacks.
Keywords: architecture, resiliency, availability, cyber-attack
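The availability metric described above can be sketched compactly: map each system function to the set of nodes that can host it, degrade a node set, and measure the fraction of functions that survive. This is an illustrative toy under assumed names (the architecture, node labels, and attack sets are hypothetical, not the paper's simulator):

```python
def functional_availability(function_map, failed_nodes):
    """Fraction of system functions with at least one surviving hosting node."""
    surviving = [f for f, nodes in function_map.items()
                 if any(n not in failed_nodes for n in nodes)]
    return len(surviving) / len(function_map)

# A toy edge/cloud architecture: each function is replicated across nodes.
architecture = {
    "telemetry": {"edge-1", "edge-2"},
    "control":   {"edge-1", "cloud-1"},
    "archive":   {"cloud-1"},
}

# Sweep attack intensity: degrade progressively larger node sets.
for attack in [set(), {"edge-1"}, {"edge-1", "cloud-1"}]:
    print(sorted(attack), functional_availability(architecture, attack))
```

Losing edge-1 alone costs nothing here because every function has a replica elsewhere; losing edge-1 and cloud-1 together drops availability to one third, which is exactly the deep-degradation behavior a single uptime number hides.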
Procedia PDF Downloads 109
26169 Mathematics Bridging Theory and Applications for a Data-Driven World
Authors: Zahid Ullah, Atlas Khan
Abstract:
In today's data-driven world, the role of mathematics in bridging the gap between theory and applications is becoming increasingly vital. This abstract highlights the significance of mathematics as a powerful tool for analyzing, interpreting, and extracting meaningful insights from vast amounts of data. By integrating mathematical principles with real-world applications, researchers can unlock the full potential of data-driven decision-making processes. This abstract delves into the various ways mathematics acts as a bridge connecting theoretical frameworks to practical applications. It explores the utilization of mathematical models, algorithms, and statistical techniques to uncover hidden patterns, trends, and correlations within complex datasets. Furthermore, it investigates the role of mathematics in enhancing predictive modeling, optimization, and risk assessment methodologies for improved decision-making in diverse fields such as finance, healthcare, engineering, and social sciences. The abstract also emphasizes the need for interdisciplinary collaboration between mathematicians, statisticians, computer scientists, and domain experts to tackle the challenges posed by the data-driven landscape. By fostering synergies between these disciplines, novel approaches can be developed to address complex problems and make data-driven insights accessible and actionable. Moreover, this abstract underscores the importance of robust mathematical foundations for ensuring the reliability and validity of data analysis. Rigorous mathematical frameworks not only provide a solid basis for understanding and interpreting results but also contribute to the development of innovative methodologies and techniques. In summary, this abstract advocates for the pivotal role of mathematics in bridging theory and applications in a data-driven world. 
By harnessing mathematical principles, researchers can unlock the transformative potential of data analysis, paving the way for evidence-based decision-making, optimized processes, and innovative solutions to the challenges of our rapidly evolving society.
Keywords: mathematics, bridging theory and applications, data-driven world, mathematical models
Procedia PDF Downloads 75
26168 AI-Enabled Smart Contracts for Reliable Traceability in the Industry 4.0
Authors: Harris Niavis, Dimitra Politaki
Abstract:
The manufacturing industry has been collecting vast amounts of data for monitoring product quality, thanks to advances in the ICT sector, and dedicated IoT infrastructure is deployed to track and trace the production line. However, industries have not yet managed to unleash the full potential of these data due to defective data collection methods and untrusted data storage and sharing. Blockchain is gaining increasing ground as a key technology enabler for Industry 4.0 and the smart manufacturing domain, as it enables the secure storage and exchange of data between stakeholders. On the other hand, AI techniques are increasingly used to detect anomalies in batch and time-series data, enabling the identification of unusual behaviors. The proposed scheme is based on smart contracts to enable automation and transparency in the data exchange, coupled with anomaly detection algorithms to enable reliable data ingestion into the system. Before sensor measurements are fed to the blockchain component and the smart contracts, the anomaly detection mechanism uniquely combines artificial intelligence models to effectively detect unusual values, such as outliers and extreme deviations, in the incoming data. Specifically, autoregressive integrated moving average (ARIMA), long short-term memory (LSTM) and dense-based autoencoders, as well as generative adversarial network (GAN) models, are used to detect both point and collective anomalies. Towards the goal of preserving the privacy of industries' information, the smart contracts employ techniques to ensure that only anonymized pointers to the actual data are stored on the ledger, while sensitive information remains off-chain. In the same spirit, blockchain technology guarantees the security of the data storage through strong cryptography, as well as the integrity of the data through the decentralization of the network and the execution of the smart contracts by the majority of the blockchain network actors.
The blockchain component of the Data Traceability Software is based on the Hyperledger Fabric framework, which lays the ground for the deployment of smart contracts and APIs that expose the functionality to end-users. The results of this work demonstrate that such a system can increase the quality of the end-products and the trustworthiness of the monitoring process in the smart manufacturing domain. The proposed AI-enabled data traceability software can be employed by industries to accurately trace and verify records about quality through the entire production chain and take advantage of the multitude of monitoring records in their databases.
Keywords: blockchain, data quality, Industry 4.0, product quality
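As a minimal stand-in for the ARIMA/LSTM/GAN ensemble described above, a rolling z-score gate illustrates the idea of screening point anomalies before measurements are ingested to the ledger. The window size and threshold are assumptions, and the readings are made up:

```python
from statistics import mean, stdev

def gate(readings, window=10, z_thresh=3.0):
    """Accept a reading only if it lies within z_thresh standard deviations
    of the trailing window of accepted values; rejects are withheld from
    the ledger. The first few readings pass unchecked to seed the window."""
    accepted = []
    for value in readings:
        history = accepted[-window:]
        if len(history) >= 3:
            m, s = mean(history), stdev(history)
            if s > 0 and abs(value - m) > z_thresh * s:
                continue  # flagged as anomalous; do not ingest
        accepted.append(value)
    return accepted

# A spurious sensor spike (95.0) is screened out of an otherwise steady series.
print(gate([20.0, 20.1, 19.9, 20.2, 95.0, 20.0]))
```

A production detector would also have to catch collective anomalies (drifts, stuck-at values), which is what the sequence models in the abstract are for; a point filter like this only handles the outlier case.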
Procedia PDF Downloads 189
26167 Comparative Analysis of Photovoltaic Systems
Authors: Irtaza M. Syed, Kaameran Raahemifar
Abstract:
This paper presents a comparative analysis of photovoltaic systems (PVS) and proposes practical techniques to improve the operational efficiency of the PVS. The best engineering and construction practices for PVS are identified, and field-oriented recommendations are made. A comparative analysis of central and string inverter based, as well as 600 and 1000 VDC, PVS is performed. In addition, direct current (DC) and alternating current (AC) photovoltaic (PV) module based systems are compared. The comparison shows that a 1000 VDC string-inverter-based PVS is the best choice.
Keywords: photovoltaic module, photovoltaic systems, operational efficiency improvement, comparative analysis
Procedia PDF Downloads 485
26166 Unstructured-Data Content Search Based on Optimized EEG Signal Processing and Multi-Objective Feature Extraction
Authors: Qais M. Yousef, Yasmeen A. Alshaer
Abstract:
Over the last few years, the amount of data available worldwide has increased rapidly. This came with the emergence of recent concepts, such as big data and the Internet of Things, which have furnished a suitable solution for the availability of data all over the world. However, managing this massive amount of data remains a challenge due to the large variety of types and distribution. Therefore, locating the required file on the first attempt has become a difficult task, due to the large similarity of names for different files distributed on the web. Consequently, the accuracy and speed of search have been negatively affected. This work presents a method using electroencephalography signals to locate files based on their contents. Building on the concept of natural mind-wave processing, this work analyzes the mind-wave signals of different people, extracting their most appropriate features using a multi-objective metaheuristic algorithm, and then classifying them using an artificial neural network to distinguish among files with similar names. The aim of this work is to provide the ability to find files based on their contents using human thoughts only. Implementing this approach and testing it on real people proved its ability to find the desired files accurately within a noticeably shorter time and retrieve them as a first choice for the user.
Keywords: artificial intelligence, data contents search, human active memory, mind wave, multi-objective optimization
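The abstract classifies extracted EEG features with an artificial neural network; as a simplified stand-in, a nearest-centroid classifier over hypothetical band-power feature vectors shows what the classification step does (the feature values and file names below are invented for illustration):

```python
import math

def centroid(vectors):
    """Mean feature vector of a class."""
    return [sum(vals) / len(vals) for vals in zip(*vectors)]

def classify(features, centroids):
    """Assign the label whose class centroid is nearest in Euclidean distance."""
    return min(centroids, key=lambda lbl: math.dist(features, centroids[lbl]))

# Hypothetical per-recording feature vectors, grouped by the file thought of.
training = {
    "report.doc": [[0.8, 0.1], [0.9, 0.2]],
    "photo.jpg":  [[0.1, 0.9], [0.2, 0.8]],
}
centroids = {lbl: centroid(vecs) for lbl, vecs in training.items()}
print(classify([0.85, 0.15], centroids))  # → report.doc
```

A trained neural network replaces the centroid rule in the actual system, but the interface is the same: feature vector in, file label out.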
Procedia PDF Downloads 177
26165 Numerical Assessment of Fire Characteristics with Bodies Engulfed in Hydrocarbon Pool Fire
Authors: Siva Kumar Bathina, Sudheer Siddapureddy
Abstract:
Fire accidents become even worse when hazardous equipment such as reactors or radioactive waste packages are engulfed in fire. In this work, large-eddy numerical fire simulations are performed using a fire dynamics simulator to predict the thermal behavior of such bodies engulfed in hydrocarbon pool fires. A radiatively dominated 0.3 m circular burner with n-heptane as the fuel is considered in this work. The fire simulation results without any body inside the fire are validated against the reported experimental data. The comparison shows good agreement for different flame properties, such as the predicted mass burning rate, flame height, time-averaged center-line temperature, time-averaged center-line velocity, puffing frequency, the irradiance at the surroundings, and the radiative heat feedback to the pool surface. Casks of different sizes are simulated with SS304L material. The results are independent of the material of the cask simulated, as the adiabatic surface temperature concept is employed in this study. It is observed that the mass burning rate increases with the blockage ratio (3% ≤ B ≤ 32%). However, this increment is reduced at higher blockage ratios (B > 14%). This is because the radiative heat feedback to the fuel surface comes not only from the flame but also from the cask volume. As B increases, the volume of the cask increases, thereby increasing the radiative contribution to the fuel surface. The radiative heat feedback in the case of the cask engulfed in the fire is increased by 2.5% to 31% compared to the fire without the cask.
Keywords: adiabatic surface temperature, fire accidents, fire dynamic simulator, radiative heat feedback
Procedia PDF Downloads 127
26164 IoT Based Approach to Healthcare System for a Quadriplegic Patient Using EEG
Authors: R. Gautam, P. Sastha Kanagasabai, G. N. Rathna
Abstract:
The proposed healthcare system enables quadriplegic patients, people with severe motor disabilities, to send commands to electronic devices and monitor their vitals. The growth of the Brain-Computer Interface (BCI) has led to rapid development in 'assistive systems' for the disabled, called 'assistive domotics'. A Brain-Computer Interface is capable of reading the brainwaves of an individual and analysing them to obtain meaningful data. This processed data can be used to assist people with speech disorders, and sometimes people with limited locomotion, to communicate. In this project, the Emotiv EPOC headset is used to obtain the electroencephalogram (EEG). The obtained data is processed to communicate pre-defined commands over the internet to the desired mobile phone user. Other vital information, such as the heartbeat, blood pressure, ECG, and body temperature, is monitored and uploaded to the server. Data analytics enables physicians to scan databases for a specific illness. The data is processed on an Intel Edison system on chip (SoC). Patient metrics are displayed via the Intel IoT Analytics cloud service.
Keywords: brain computer interface, Intel Edison, Emotiv EPOC, IoT analytics, electroencephalogram
Procedia PDF Downloads 186
26163 Searchable Encryption in Cloud Storage
Authors: Ren Junn Hwang, Chung-Chien Lu, Jain-Shing Wu
Abstract:
Cloud outsourced storage is one of the important services in cloud computing. Cloud users upload data to cloud servers to reduce the cost of managing data and maintaining hardware and software. To ensure data confidentiality, users can encrypt their files before uploading them to a cloud system. However, retrieving the exact target file from the encrypted files is difficult for the cloud server. This study proposes a protocol for performing multi-keyword searches over encrypted cloud data by applying k-nearest neighbor technology. The protocol ranks the relevance scores of encrypted files and keywords, and prevents cloud servers from learning the search keywords submitted by a cloud user. To reduce the cost of file-transfer communication, the cloud server returns encrypted files in order of relevance. Moreover, when a cloud user inputs an incorrect keyword and the number of wrong letters does not exceed a given threshold, the user can still retrieve the target files from the cloud server. In addition, the proposed scheme satisfies the security requirements for outsourced data storage.
Keywords: fault-tolerance search, multi-keywords search, outsource storage, ranked search, searchable encryption
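The fault-tolerant ranked search can be illustrated on plaintext (the actual protocol operates over encrypted index vectors using k-nearest-neighbor technology): score each file by how many query keywords it matches within an edit-distance threshold, and return files in relevance order. The index contents and the typo threshold are illustrative:

```python
def edit_distance(a, b):
    """Levenshtein distance via one-row dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def ranked_search(index, query, typo_threshold=1):
    """Rank files by the number of query keywords matched, tolerating typos."""
    scores = {}
    for name, words in index.items():
        hits = sum(1 for q in query
                   if any(edit_distance(q, w) <= typo_threshold for w in words))
        if hits:
            scores[name] = hits
    return sorted(scores, key=scores.get, reverse=True)

index = {
    "a.txt": {"cloud", "storage", "security"},
    "b.txt": {"cloud", "pricing"},
    "c.txt": {"recipes"},
}
print(ranked_search(index, ["clowd", "security"]))  # 'clowd' still matches 'cloud'
```

Here a.txt outranks b.txt because it matches both query keywords; the misspelled "clowd" is one substitution away from "cloud", which is within the threshold.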
Procedia PDF Downloads 383
26162 A Bivariate Inverse Generalized Exponential Distribution and Its Applications in Dependent Competing Risks Model
Authors: Fatemah A. Alqallaf, Debasis Kundu
Abstract:
The aim of this paper is to introduce a bivariate inverse generalized exponential distribution which has a singular component. The proposed bivariate distribution can be used when the marginals have heavy-tailed distributions and non-monotone hazard functions. Due to the presence of the singular component, it can be used quite effectively when there are ties in the data. Since it has four parameters, it is a very flexible bivariate distribution, and it can be used quite effectively for analyzing various bivariate data sets. Several dependency properties and dependency measures have been obtained. The maximum likelihood estimators cannot be obtained in closed form, and obtaining them involves solving a four-dimensional optimization problem. To avoid that, we propose an EM algorithm that involves solving only one non-linear equation at each 'E'-step. Hence, the implementation of the proposed EM algorithm is very straightforward in practice. Extensive simulation experiments and the analysis of one data set have been performed. We have observed that the proposed bivariate inverse generalized exponential distribution can be used for modeling dependent competing risks data. One data set has been analyzed to show the effectiveness of the proposed model.
Keywords: Block and Basu bivariate distributions, competing risks, EM algorithm, Marshall-Olkin bivariate exponential distribution, maximum likelihood estimators
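The singular component (a positive probability of exact ties) can be illustrated with the classical Marshall-Olkin construction mentioned in the keywords — not the proposed inverse generalized exponential model itself — in which a shared shock hits both components simultaneously:

```python
import random

def marshall_olkin(lam1, lam2, lam3, rng):
    """One draw: independent exponential shocks u1, u2 plus a shared shock u3.
    When u3 is the smallest, both components fail at the same time (a tie)."""
    u1, u2, u3 = (rng.expovariate(l) for l in (lam1, lam2, lam3))
    return min(u1, u3), min(u2, u3)

rng = random.Random(42)
draws = [marshall_olkin(1.0, 1.0, 0.5, rng) for _ in range(10000)]
tie_fraction = sum(x == y for x, y in draws) / len(draws)
# Theoretical tie probability is lam3 / (lam1 + lam2 + lam3) = 0.2.
print(f"fraction of exact ties: {tie_fraction:.3f}")
```

This is why an absolutely continuous bivariate model cannot fit tied competing-risks data well, and why a distribution with a singular component, like the one proposed, is needed.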
Procedia PDF Downloads 143
26161 Risk Assessment and Haloacetic Acids Exposure in Drinking Water in Tunja, Colombia
Authors: Bibiana Matilde Bernal Gómez, Manuel Salvador Rodríguez Susa, Mildred Fernanda Lemus Perez
Abstract:
In chlorinated drinking water, haloacetic acids have been identified and are classified as disinfection byproducts originating from reactions between the disinfectant and natural organic matter and/or bromide ions in source water. These byproducts can also be generated through a variety of chemical and pharmaceutical processes. The term 'Total Haloacetic Acids' (THAAs) describes the cumulative concentration of dichloroacetic acid, trichloroacetic acid, monochloroacetic acid, monobromoacetic acid, and dibromoacetic acid in water samples, which are usually measured to evaluate water quality. The chronic presence of these acids in drinking water carries a risk of cancer in humans. The detection of THAAs for the first time in 15 municipalities of Boyacá was accomplished in 2023. The aim is to describe the correlation between the levels of THAAs and digestive cancer in Tunja, a Colombian city with high rates of digestive cancer, and to compare the risk across the 15 towns, taking into account factors such as water quality. A research project was conducted with the aim of comparing water sources based on the geographical features of each town, describing the disinfection process in the 15 municipalities, and exploring physical properties such as water temperature and pH level. The project also involved a study of contact time based on habits documented through a survey, and a comparison of socioeconomic factors and lifestyle, in order to assess the personal risk of exposure. Data on the levels of THAAs were obtained after characterizing the water quality in urban sectors over eight months of 2022, based on the protocol described in the Stage 2 DBP rule of the United States Environmental Protection Agency (USEPA) from 2006, which takes into account the size of the population being supplied. A cancer risk assessment was conducted to evaluate the likelihood of an individual developing cancer due to exposure to THAA pollutants.
The assessment considered exposure routes such as oral ingestion, skin absorption, and inhalation. The chronic daily intake (CDI) for these exposure routes was calculated using specific equations. The lifetime cancer risk (LCR) was then determined by adding the cancer risks from the three exposure routes for each HAA. The risk assessment process involved four phases: exposure assessment, toxicity evaluation, data gathering and analysis, and risk definition and management. The results conclude that there is a cumulatively higher risk of digestive cancer due to THAAs exposure in drinking water.
Keywords: haloacetic acids, drinking water, water quality, cancer risk assessment
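The CDI and LCR calculation follows the generic USEPA-style intake equation, CDI = (C × IR × EF × ED) / (BW × AT), with the cancer risk per route obtained as CDI × slope factor. The sketch below covers the oral route only, and every numeric parameter (concentration, intake rate, slope factor) is a placeholder, not a value from the study:

```python
def chronic_daily_intake(conc, intake_rate, ef, ed, bw, at):
    """CDI in mg/(kg·day): concentration (mg/L) * intake rate (L/day)
    * exposure frequency (days/yr) * exposure duration (yr),
    averaged over body weight (kg) * averaging time (days)."""
    return conc * intake_rate * ef * ed / (bw * at)

# Illustrative (hypothetical) oral-ingestion parameters for one HAA:
cdi_oral = chronic_daily_intake(
    conc=0.02,        # mg/L assumed THAA concentration
    intake_rate=2.0,  # L/day drinking water
    ef=350,           # days/year exposure frequency
    ed=30,            # years exposure duration
    bw=70,            # kg body weight
    at=70 * 365,      # days averaging time (lifetime)
)
sf_oral = 0.05        # (mg/(kg·day))^-1 assumed slope factor, not a regulatory value
lcr_oral = cdi_oral * sf_oral
print(f"CDI = {cdi_oral:.3e} mg/(kg*day), LCR (oral) = {lcr_oral:.3e}")
```

The dermal and inhalation routes use route-specific intake terms in place of the drinking-water rate, and the total LCR is the sum of the three route-wise risks.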
Procedia PDF Downloads 58
26160 Comparison of Punicic Acid Amounts in Abdominal Fat Farm Feeding Hy-Line Chickens
Authors: Ozcan Baris Citil, Mehmet Akoz
Abstract:
The fatty acid composition and punicic acid content of the abdominal fat of Hy-Line hens were investigated by gas chromatography. A total of 30 different fatty acids were determined in the fatty acid compositions analyzed. These fatty acids ranged from C8 to C22. The punicic acid content of the abdominal fats analyzed was found to be a higher percentage on the 90th day than on the 30th and 60th days. At the end of the experiment, the total punicic acid content of the abdominal fats was significantly increased.
Keywords: fatty acids, gas chromatography, punicic acid, abdominal fats
Procedia PDF Downloads 347
26159 Analyzing How Working From Home Can Lead to Higher Job Satisfaction for Employees Who Have Care Responsibilities Using Structural Equation Modeling
Authors: Christian Louis Kühner, Florian Pfeffel, Valentin Nickolai
Abstract:
Taking care of children, dependents, or pets can be a difficult and time-consuming task. Especially for part- and full-time employees, it can feel exhausting and overwhelming to meet these obligations besides working a job. Thus, working mostly at home and not having to drive to the company can save valuable time and stress. This study aims to show the influence that the working model has on the job satisfaction of employees with care responsibilities in comparison to employees who do not have such obligations. Using structural equation modeling (SEM), the three work models, “work from home”, “working remotely”, and a hybrid model, have been analyzed based on 13 influencing constructs on job satisfaction. These 13 factors have been further summarized into three groups “classic influencing factors”, “influencing factors changed by remote working”, and “new remote working influencing factors”. Based on the influencing factors on job satisfaction, an online survey was conducted with n = 684 employees from the service sector. Here, Cronbach’s alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5: CR > 0.7), and discriminant validity. In addition, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM-analysis has shown that the most significant influencing factor on job satisfaction is “identification with the work” with β = 0.540, followed by “Appreciation” (β = 0.151), “Compensation” (β = 0.124), “Work-Life-Balance” (β = 0.116), and “Communication and Exchange of Information” (β = 0.105). 
While the significance of each factor can vary depending on the work model, the SEM-analysis shows that the identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, it is the only significant influencing factor. The study shows that among the employees with care responsibilities, the higher the proportion of working from home in comparison to working from the office, the more satisfied the employees are with their job. Since the work models that meet the requirements of comprehensive care led to higher job satisfaction amongst employees with such obligations, adapting as a company to such private obligations by employees can be crucial to sustained success. Conversely, the satisfaction level of the working model where employees work at the office is higher for workers without caregiving responsibilities.
Keywords: care responsibilities, home office, job satisfaction, structural equation modeling
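Cronbach's alpha, used above to check the reliability of each construct, is straightforward to compute from item scores: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The item data below is made up for illustration:

```python
from statistics import variance

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # per-respondent total
    item_var = sum(variance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Three hypothetical 5-point items answered by four respondents.
items = [[1, 2, 3, 4], [1, 2, 3, 5], [2, 2, 3, 4]]
print(round(cronbach_alpha(items), 3))
```

These strongly correlated toy items give an alpha of about 0.962; perfectly identical items give exactly 1.0, and values above roughly 0.7 are conventionally taken as acceptable construct reliability.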
Procedia PDF Downloads 83
26158 Blind Data Hiding Technique Using Interpolation of Subsampled Images
Authors: Singara Singh Kasana, Pankaj Garg
Abstract:
In this paper, a blind data hiding technique based on interpolation of subsampled versions of a cover image is proposed. The subsampled image is taken as a reference image, and an interpolated image is generated from this reference image. Then the difference between the original cover image and the interpolated image is used to embed secret data. Comparisons with existing interpolation-based techniques show that the proposed technique provides higher embedding capacity and better visual quality of marked images. Moreover, the performance of the proposed technique is more stable across different images.
Keywords: interpolation, image subsampling, PSNR, SIM
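The embedding idea reduces to one dimension cleanly: keep even-indexed samples as the reference, interpolate the odd positions from their neighbors, and hide each secret bit in the deviation from the interpolated value. Extraction is blind because the receiver can recompute the same interpolation from the untouched reference samples. This is a minimal sketch of the general approach, not the paper's exact scheme:

```python
def embed(cover, bits):
    """Hide one bit per odd position: stego[i] = interpolated value + bit."""
    stego = list(cover)
    for k, bit in enumerate(bits):
        i = 2 * k + 1
        stego[i] = (cover[i - 1] + cover[i + 1]) // 2 + bit
    return stego

def extract(stego, n_bits):
    """Blind extraction: re-interpolate from the unmodified even samples."""
    return [stego[2 * k + 1] - (stego[2 * k] + stego[2 * k + 2]) // 2
            for k in range(n_bits)]

cover = [10, 12, 14, 13, 12, 16, 20]   # one toy row of pixel values
bits = [1, 0, 1]
stego = embed(cover, bits)
print(stego, extract(stego, len(bits)))
```

Because only the interpolated positions are perturbed, and by at most one gray level per bit here, the marked signal stays visually close to the cover, which is the PSNR argument the abstract makes.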
Procedia PDF Downloads 578
26157 Mutational and Evolutionary Analysis of Interleukin-2 Gene in Four Pakistani Goat Breeds
Authors: Tanveer Hussain, Misbah Hussain, Masroor Ellahi Babar, Muhammad Traiq Pervez, Fiaz Hussain, Sana Zahoor, Rashid Saif
Abstract:
Interleukin 2 (IL-2) is a cytokine produced by activated T cells that plays an important role in the immune response against antigens. It acts in both an autocrine and a paracrine manner. It can stimulate B cells and various phagocytic cells, such as monocytes, lymphokine-activated killer cells, and natural killer cells. Acting in autocrine fashion, the IL-2 protein plays a crucial role in the proliferation of T cells. IL-2 triggers the release of pro- and anti-inflammatory cytokines by activating several pathways. In the present study, exon 1 of the IL-2 gene of four local Pakistani breeds (Dera Din Panah, Beetal, Nachi and Kamori) from two provinces was amplified using reported ovine IL-2 primers, yielding a PCR product of 501 bp. All samples were sequenced to identify polymorphisms in the amplified region of the IL-2 gene. Analysis of the sequencing data resulted in the identification of one novel nucleotide substitution (T→A) in the amplified non-coding region of the IL-2 gene. Comparison of the IL-2 gene sequence of all four breeds with other goat breeds showed high sequence similarity, while phylogenetic analysis of our local breeds with other mammals showed that IL-2 is a variable gene which has undergone many substitutions. This high substitution rate can be due to decreased or increased selective pressure. These rapid changes can also lead to changes in the function of the immune system. This pioneering study of Pakistani goat breeds urges further studies on the immune system of each targeted breed to fully understand the functional role of IL-2 in goat immunity.
Keywords: interleukin 2, mutational analysis, phylogeny, goat breeds, Pakistan
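Calling a substitution such as the reported T→A amounts to scanning two aligned sequences position by position. The fragments below are hypothetical, not the actual IL-2 exon 1 sequence:

```python
def substitutions(ref, alt):
    """Return (1-based position, reference base, alternate base) for each
    mismatch between two aligned, equal-length sequences."""
    return [(i, r, a)
            for i, (r, a) in enumerate(zip(ref, alt), 1) if r != a]

ref = "ATGCTTACG"  # hypothetical reference fragment
alt = "ATGCATACG"  # carries one T→A substitution
print(substitutions(ref, alt))  # → [(5, 'T', 'A')]
```

Real variant calling works on aligned reads rather than clean equal-length strings, but the core comparison is the same per-position mismatch test.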
Procedia PDF Downloads 611
26156 Arabic Handwriting Recognition Using Local Approach
Authors: Mohammed Arif, Abdessalam Kifouche
Abstract:
Optical character recognition (OCR) plays a major role today. It is capable of solving many serious problems and simplifying human activities. OCR dates back to the 1970s; since then, many solutions have been proposed, but unfortunately most supported only Latin scripts. This work proposes a system for the recognition of off-line Arabic handwriting. The system is based on a structural segmentation method and uses support vector machines (SVM) in the classification phase. We present a state of the art of character segmentation methods, followed by an overview of the OCR area; we also address the normalization problems we encountered. After a comparison between Arabic handwritten characters and the segmentation methods, we introduce our contribution through a segmentation algorithm.
Keywords: OCR, segmentation, Arabic characters, PAW, post-processing, SVM
Procedia PDF Downloads 72
26155 Assessing and Identifying Factors Affecting Customers Satisfaction of Commercial Bank of Ethiopia: The Case of West Shoa Zone (Bako, Gedo, Ambo, Ginchi and Holeta), Ethiopia
Authors: Habte Tadesse Likassa, Bacha Edosa
Abstract:
Customer satisfaction is vital for the existence of banks and for their productivity and success, as in any organization and business area. The main goal of the study is to assess and identify the factors that influence customer satisfaction in the West Shoa Zone branches of the Commercial Bank of Ethiopia (Holeta, Ginchi, Ambo, Gedo and Bako). A stratified random sampling procedure was used in the study, and by simple random sampling (lottery method) 520 customers were drawn from the target population. Sample sizes for each branch were allocated using probability-proportional-to-size techniques. Both descriptive and inferential statistical methods were used in the study. A binary logistic regression model was fitted to assess the significance of factors affecting customer satisfaction. The SPSS statistical package was used for data analysis. The results of the study reveal that the overall level of customer satisfaction in the study area is low (38.85%) compared to those who were not satisfied (61.15%). The results also show that almost all factors included in the study were significantly associated with customer satisfaction. Therefore, based on the comparison of branches using odds ratios, it can be concluded that customers using the Ambo and Bako branches are less satisfied than customers in the Holeta branch, while customers in Ginchi and Gedo are more satisfied than those in Holeta. Since the level of customer satisfaction is low in the study area, the concerned bodies are advised to work cooperatively to maximize the satisfaction of their customers.
Keywords: customers, satisfaction, binary logistic, complain handling process, waiting time
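The branch comparisons above come from the fitted binary logistic model: the odds ratio for a predictor is exp(beta), the multiplicative change in the odds of satisfaction. A minimal single-predictor fit by gradient ascent on toy data (not the survey's data) shows the relationship:

```python
import math

def fit_logistic(xs, ys, lr=0.1, iters=20000):
    """Single-predictor logistic regression fitted by gradient ascent
    on the log-likelihood; returns (intercept b0, slope b1)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient w.r.t. intercept
            g1 += (y - p) * x    # gradient w.r.t. slope
        b0 += lr * g0
        b1 += lr * g1
    return b0, b1

# Toy data: x=1 customers are satisfied 2/3 of the time, x=0 only 1/3.
xs = [0, 0, 0, 1, 1, 1]
ys = [0, 0, 1, 0, 1, 1]
b0, b1 = fit_logistic(xs, ys)
odds_ratio = math.exp(b1)   # empirical odds ratio here is (2/1)/(1/2) = 4
print(round(odds_ratio, 2))
```

An odds ratio above 1 means the group is more likely to be satisfied than the reference group, below 1 less likely — the same reading applied to the Ambo, Bako, Ginchi, and Gedo branches against Holeta.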
Procedia PDF Downloads 465
26154 Study of Corrosion in Structures due to Chloride Infiltration
Authors: Sukrit Ghorai, Akku Aby Mathews
Abstract:
Corrosion of reinforcing steel is the leading cause of deterioration in concrete structures. It is an electrochemical process whose expansive products cause volumetric change in the concrete, leading to cracking, delamination and spalling. The objective of the study is to provide a rational method to estimate the probable chloride concentration at the reinforcement level for a known surface chloride concentration. The paper derives the formulation of design charts to aid engineers in the quick calculation of the chloride concentration. Furthermore, the paper compares durability design against corrosion in the American, European and Indian design standards.
Keywords: chloride infiltration, concrete, corrosion, design charts
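A common rational basis for such charts is Crank's error-function solution of Fick's second law for a constant surface concentration, C(x, t) = Cs · (1 − erf(x / (2·√(D·t)))). Whether the paper uses exactly this model is an assumption on our part, and all numeric inputs below are illustrative:

```python
import math

def chloride_concentration(c_surface, diff_coeff, depth, t):
    """Chloride content at depth x (m) after time t (s), for surface
    concentration c_surface and apparent diffusion coefficient D (m^2/s)."""
    return c_surface * (1 - math.erf(depth / (2 * math.sqrt(diff_coeff * t))))

# Assumed inputs: surface chloride 0.5% by mass of concrete, D = 1e-12 m^2/s,
# 50 mm cover depth, 25 years of exposure.
seconds = 25 * 365.25 * 24 * 3600
c_rebar = chloride_concentration(0.5, 1e-12, 0.050, seconds)
print(f"chloride at rebar level: {c_rebar:.3f} % by mass of concrete")
```

Tabulating this expression over cover depths and exposure times is exactly the kind of design chart the abstract describes; the engineer then compares the predicted value against the critical chloride threshold of the governing standard.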
Procedia PDF Downloads 411
26153 Active Contours for Image Segmentation Based on Complex Domain Approach
Authors: Sajid Hussain
Abstract:
A complex domain approach for image segmentation based on an active contour has been designed, in which the contour deforms step by step to partition an image into numerous expedient regions. A novel region-based trigonometric complex pressure force function is proposed, which propagates around the region of interest using image forces. The signed trigonometric force function controls the propagation of the active contour, and the active contour stops accurately on the exact edges of the object. The proposed model makes the level set function binary and uses a Gaussian smoothing kernel to adjust it and escape the re-initialization procedure. The working principle of the proposed model is as follows: the real image data is transformed into complex data by multiplying the image data by iota (i), and the average of iota (i) times the horizontal and vertical components of the gradient of the image data is inserted into the proposed model to capture the complex gradient of the image data. A simple finite difference mathematical technique has been used to implement the proposed model. The efficiency and robustness of the proposed model have been verified and compared with other state-of-the-art models.
Keywords: image segmentation, active contour, level set, Mumford and Shah model
Procedia PDF Downloads 114
26152 Teachers’ Perceptions Related to the Guiding Skills within the Application Courses
Authors: Tanimola Kazeem Abiodun
Abstract:
In Nigeria, both formal education and distance learning opportunities are used in teacher training. Practical courses aim to improve the skills of teacher candidates in a school environment. Teacher candidates attend kindergarten classes under the supervision of a teacher. In this context, the guiding skills of teachers gain importance in shaping candidates’ perceptions of the teaching profession. In this study, teachers’ perceptions related to their guiding skills within the practical courses were determined, and the perceptions and applications related to guiding skills were compared. A Likert-scale questionnaire and an open-ended question were used to determine perceptions and applications. 120 questionnaires were taken into consideration, and data analyses were performed using percentage distributions and the QSR NVivo 8 program. In this study, statements related to teachers’ perceptions of guiding skills were asked, and it was determined that almost all the teachers agreed on the importance of these statements. On the other hand, how these guidance skills are applied by teachers was also queried with an open-ended question. Finally, thoughts and applications related to guidance skills were compared. Based on this comparison, there are some differences between thoughts and applications, especially related to time management, planning, feedback, curriculum, workload, rules, and guidance. It can be said that some guidance skills cannot be controlled only by teachers; for example, candidates’ motivation, attention, class population, and the educational environment are also determinative factors for effective guidance. In summary, prior conditions are necessary for teachers to apply these idealized guidance skills and to train more successful candidates for the pre-school education era.
At this point, the organization of practical courses by the faculties gains importance, and in this context it is crucial for faculties to revise their applications based on more detailed research.
Keywords: teacher training, guiding skills, education, practical courses
Procedia PDF Downloads 447
26151 Survival Analysis of Identifying the Risk Factors Affecting the First Recurrence Time of Breast Cancer: The Case of Tigray, Ethiopia
Authors: Segen Asayehegn
Abstract:
Introduction: In Tigray, Ethiopia, breast cancer is, after cervical cancer, one of the most common cancer health problems for women. Objectives: This article aims to identify the prospective and potential risk factors affecting the time to first recurrence in breast cancer patients in Tigray, Ethiopia. Methods: The data were taken from patients’ medical records registered from January 2010 to January 2020. The study considered a sample size of 1842 breast cancer patients. Non-parametric and parametric shared frailty survival regression models (FSRM) were applied, and model comparisons were performed. Results: Out of 1842 breast cancer patients, about 1290 (70.02%) recovered from the disease. The median time to cure from breast cancer was found to be 12.8 months. The model comparison favoured the lognormal parametric shared frailty survival regression model, which predicted that treatment, stage of breast cancer, smoking habit, and marital status significantly affect the first recurrence of breast cancer. Conclusion: Factors such as treatment, stage of cancer, and marital status improved the time to cure of breast cancer, while smoking habit worsened it. Recommendation: To reduce breast cancer health problems, the authors recommend that the regional health sector facilities be improved. More importantly, concerned bodies and medical doctors should emphasize the identified factors during treatment. Furthermore, general awareness programs on the identified factors should be given to the community.
Keywords: acceleration factor, breast cancer, Ethiopia, shared frailty survival models, Tigray
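As a minimal sketch of how a median time-to-event figure such as the 12.8-month value can be read off a survival curve, the snippet below implements a plain Kaplan-Meier estimator (the study itself used shared frailty regression models; the data here are toy values, not the study's records):

```python
def km_median(times, events):
    """Kaplan-Meier estimate of the median event time.

    times:  follow-up time for each subject
    events: 1 if the event (e.g., cure) was observed, 0 if censored
    Returns the first time at which the survival estimate drops to <= 0.5,
    or None if the median is never reached.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    i = 0
    while i < len(data):
        t = data[i][0]
        # all subjects whose follow-up ends at time t (events and censored)
        same = [e for tt, e in data if tt == t]
        deaths = sum(same)
        if deaths:
            surv *= 1 - deaths / at_risk
            if surv <= 0.5:
                return t
        at_risk -= len(same)
        i += len(same)
    return None

# Toy cohort: events at months 3, 5, 9, 11; one subject censored at month 7.
print(km_median([3, 5, 7, 9, 11], [1, 1, 0, 1, 1]))  # 9
```

The estimator drops the survival curve only at observed event times, while censored subjects still reduce the number at risk afterwards, which is why the censoring at month 7 pulls the median earlier than a naive midpoint would.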
Procedia PDF Downloads 135
26150 Adaptive Motion Compensated Spatial Temporal Filter of Colonoscopy Video
Authors: Nidhal Azawi
Abstract:
The colonoscopy procedure is widely used around the world to detect abnormalities, and early diagnosis can help to heal many patients. Because of the unavoidable artifacts in colon images, doctors cannot examine the colon surface precisely. The purpose of this work is to improve the visual quality of colonoscopy videos to provide better information for physicians by removing some artifacts. This work complements a series of three previously published papers. In this paper, optic flow is used for motion compensation, and consecutive images are then aligned/registered to integrate information into a new image that reveals more than the original one. Colon images were classified into informative and noninformative images using a deep neural network, and two different strategies were then used to treat them. Informative images were treated using Lucas-Kanade (LK) with an adaptive temporal mean/median filter, whereas noninformative images were treated using Lucas-Kanade with a derivative of Gaussian (LKDOG) with an adaptive temporal median filter. A comparison showed that this work achieved better results than state-of-the-art strategies on the same degraded colon image data set, which consists of 1000 images. The new proposed algorithm reduced the alignment error by about a factor of 0.3 with a 100% successful image alignment ratio. In conclusion, this algorithm achieved better results than the state-of-the-art approaches in enhancing the informative images, as shown in the results section; it also succeeded in converting noninformative images, which have very few or no details because of blurriness, being out of focus, or specular highlights dominating a significant amount of the image, into informative images.
Keywords: optic flow, colonoscopy, artifacts, spatial temporal filter
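A minimal sketch of the temporal median step, assuming the frames have already been registered to a common reference by optic flow (the function name and toy frames are illustrative, not from the paper):

```python
from statistics import median

def temporal_median(aligned_frames):
    """Pixel-wise temporal median over a stack of aligned grayscale frames.

    aligned_frames: list of equal-sized 2-D lists, already registered
    to a common reference (e.g., by Lucas-Kanade optic flow).
    """
    rows, cols = len(aligned_frames[0]), len(aligned_frames[0][0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            out[r][c] = median(f[r][c] for f in aligned_frames)
    return out

# Three aligned 2x2 frames; the median suppresses the transient artifact (99).
frames = [[[10, 10], [10, 10]],
          [[12, 99], [11, 10]],
          [[11, 10], [12, 11]]]
print(temporal_median(frames))  # [[11, 10], [11, 10]]
```

Because the median ignores outliers rather than averaging them in, a transient specular highlight that appears in only one registered frame is removed without blurring stable structure.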
Procedia PDF Downloads 113
26149 Comparison of Cardiometabolic Risk Factors in Lean Versus Overweight/Obese Peri-Urban Female Adolescent School Learners in Mthatha, South Africa: A Pilot Case Control Study
Authors: Benedicta N. Nkeh-Chungag, Constance R. Sewani-Rusike, Isaac M. Malema, Daniel T. Goon, Oladele V. Adeniyi, Idowu A. Ajayi
Abstract:
Background: Childhood and adolescent obesity is an important predictor of adult cardiometabolic diseases. Current data on age- and gender-specific cardiometabolic risk factors are lacking in the peri-urban Eastern Cape Province, South Africa. However, such information is important in designing innovative strategies to promote healthy living among children and adolescents. The purpose of this pilot study was to compare and determine the extent of cardiometabolic risk factors between samples of lean and overweight/obese adolescents in a peri-urban township of South Africa. Methods: In this case-control study, age-matched, non-pregnant and non-lactating female adolescents consisting of an equal number of cases (50 overweight/obese) and controls (50 lean) participated in the study. Fasting venous blood samples were obtained for total cholesterol (TC), low-density lipoprotein cholesterol (LDL-C), high-density lipoprotein cholesterol (HDL-C), triglycerides (Trig), highly sensitive C-reactive protein (hsCRP) and blood sugar. Anthropometric measurements included weight, height, and waist and hip circumferences. Body mass index was calculated. Blood pressure was measured, and metabolic syndrome was assessed using appropriate diagnostic criteria for children and adolescents. Results: Of the 76 participants with complete data, 12/38 of the overweight/obese group and 1/38 of the lean group met the criteria for adolescent metabolic syndrome. All cardiometabolic risk factors were elevated in the overweight/obese group compared with the lean group: low HDL-C (RR = 2.21), elevated TC (RR = 1.23), elevated LDL-C (RR = 1.42), elevated Trig (RR = 1.73), and elevated hsCRP (RR = 1.9). Atherosclerotic indices were significantly higher among the overweight/obese group compared with the lean group: TC/HDL and LDL/HDL (2.99±0.91 vs 2.63±0.48; p=0.016 and 1.73±0.61 vs 1.41±0.46; p=0.014, respectively).
Conclusion: There are multiple cardiometabolic risk factors among the overweight/obese female adolescent group compared with the lean adolescent group in the study. Female adolescents who are overweight or obese have higher relative risks of developing cardiometabolic diseases compared with their lean counterparts in peri-urban Mthatha, South Africa. School health programmes focusing on promoting physical exercise, healthy eating and maintenance of an appropriate weight are needed in the country.
Keywords: adolescents, cardiometabolic risk factors, obesity, peri-urban South Africa
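As a worked sketch of the relative-risk arithmetic behind figures like those above, using the metabolic-syndrome counts reported in the abstract (12/38 overweight/obese vs 1/38 lean):

```python
def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """RR = risk of the outcome in the exposed group / risk in the unexposed group."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Metabolic syndrome: 12 of 38 overweight/obese vs 1 of 38 lean participants.
rr = relative_risk(12, 38, 1, 38)
print(round(rr, 1))  # 12.0
```

An RR above 1 indicates the outcome is more frequent in the exposed (overweight/obese) group; here the overweight/obese group met the metabolic-syndrome criteria twelve times as often as the lean group.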
Procedia PDF Downloads 474
26148 Flexural Properties of Carbon/Polypropylene Composites: Influence of Matrix Forming Polypropylene in Fiber, Powder, and Film States
Authors: Vijay Goud, Ramasamy Alagirusamy, Apurba Das, Dinesh Kalyanasundaram
Abstract:
Thermoplastic composites offer new opportunities as an effective processing technology while also introducing new processing challenges. One of the notable challenges is achieving thorough wettability, which is significantly hindered by the high viscosity of the long molecular chains of the thermoplastics. As a result of this high viscosity, it is very difficult to impregnate the resin into a tightly interlaced textile structure and fill the voids present in the structure. One potential solution to this problem is to pre-deposit resin on the fiber prior to consolidation. The current study compares DREF spinning, powder coating and film stacking as methods of pre-depositing resin onto fibers. An investigation into the flexural properties of unidirectional composites (UDCs) produced from blends of carbon fiber and polypropylene (PP) matrix in fiber, powder and film form is reported. DREF (Dr. Ernst Fehrer) yarns, or friction-spun hybrid yarns, were manufactured from PP fibers and carbon tows and consolidated to yield unidirectional composites referred to as UDC-D. PP in powder form was coated on carbon tows by electrostatic spray coating, and the powder-coated towpregs were consolidated to form UDC-P. For the sake of comparison, a third UDC, referred to as UDC-F, was manufactured by consolidating PP films stacked between carbon tows. The experiments were designed to yield a matching fiber volume fraction of about 50% in all three UDCs. The mechanical properties of the three composites were compared to understand the efficiency of matrix wetting and impregnation. Approximately 19% and 68% higher flexural strength was obtained for UDC-P than for UDC-D and UDC-F, respectively. Similarly, 25% and 81% higher modulus was observed for UDC-P than for UDC-D and UDC-F, respectively.
Results from micro-computed tomography, scanning electron microscopy, and short beam tests indicate better impregnation of the PP matrix in UDC-P, obtained through the electrostatic spray coating process, and thereby higher flexural strength and modulus.
Keywords: DREF spinning, film stacking, flexural strength, powder coating, thermoplastic composite
Procedia PDF Downloads 222
26147 Understanding the Impact of Spatial Light Distribution on Object Identification in Low Vision: A Pilot Psychophysical Study
Authors: Alexandre Faure, Yoko Mizokami, éRic Dinet
Abstract:
In recent years, the potential of light to assist visually impaired people in their indoor mobility has been demonstrated by different studies. Implementing smart lighting systems for selective visual enhancement, especially designed for low-vision people, is an approach that breaks with existing visual aids. The appearance of the surface of an object is significantly influenced by the lighting conditions and the constituent materials of the object, so objects may appear different from expectation. Lighting conditions therefore play an important part in accurate material recognition. The main objective of this work was to investigate the effect of the spatial distribution of light on object identification in the context of low vision. The purpose was to determine whether, and which, specific lighting approaches should be preferred for visually impaired people. A psychophysical experiment was designed to study the ability of individuals to identify the smaller cube of a pair under different lighting diffusion conditions. Participants were divided into two distinct groups: a reference group of observers with normal or corrected-to-normal visual acuity and a test group, in which observers were required to wear visual impairment simulation glasses. All participants were presented with pairs of cubes in a "miniature room" and were instructed to estimate the relative size of the two cubes. The miniature room replicates real-life settings, adorned with decorations and separated from external light sources by black curtains. The correlated color temperature was set to 6000 K, and the horizontal illuminance at the object level to approximately 240 lux. The objects presented for comparison consisted of 11 white cubes and 11 black cubes of different sizes manufactured with a 3D printer. Participants were seated 60 cm away from the objects. Two different levels of light diffuseness were implemented.
After receiving instructions, participants were asked to judge whether the two presented cubes were the same size or whether one was smaller. They provided one of five possible answers: "Left one is smaller," "Left one is smaller but unsure," "Same size," "Right one is smaller," or "Right one is smaller but unsure." The method of constant stimuli was used, presenting stimulus pairs in random order to prevent learning and expectation biases. Each pair consisted of a comparison stimulus and a reference cube. A psychometric function was constructed to link stimulus value with the frequency of correct detection, aiming to determine the 50% correct detection threshold. The collected data were analyzed through graphs illustrating participants' responses to stimuli, with accuracy increasing as the size difference between cubes grew. Statistical analyses, including two-way ANOVA tests, showed that light diffuseness had no significant impact on the difference threshold, whereas object color had a significant influence in low-vision scenarios. The first results and trends derived from this pilot experiment strongly suggest that future investigations could explore extreme diffusion conditions to comprehensively assess the impact of diffusion on object identification. For example, the first findings related to light diffuseness may be attributed to the range of manipulation, emphasizing the need to explore how other lighting-related factors interact with diffuseness.
Keywords: lighting, low vision, visual aid, object identification, psychophysical experiment
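A minimal sketch of how a 50% threshold can be extracted from constant-stimuli data, here by linear interpolation between tested size differences rather than a fitted psychometric function (the levels and proportions below are illustrative, not the study's data):

```python
def threshold_50(levels, p_correct, target=0.5):
    """Interpolate the stimulus level at which correct detection crosses `target`.

    levels:    stimulus values (e.g., cube size differences), ascending
    p_correct: proportion of correct 'smaller' judgements at each level
    """
    for i in range(len(levels) - 1):
        x0, p0 = levels[i], p_correct[i]
        x1, p1 = levels[i + 1], p_correct[i + 1]
        if p0 <= target <= p1:
            # linear interpolation between the two bracketing levels
            return x0 + (target - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("target proportion not bracketed by the data")

# Hypothetical size differences (mm) vs proportion of correct judgements.
levels = [0.0, 0.5, 1.0, 1.5, 2.0]
p = [0.05, 0.20, 0.45, 0.70, 0.95]
print(round(threshold_50(levels, p), 3))  # 1.1
```

A full analysis would instead fit a sigmoid (e.g., cumulative Gaussian) to all points and read the threshold from the fit, but the interpolation above shows the idea of locating the 50% crossing on the psychometric curve.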
Procedia PDF Downloads 64
26146 Development of an Integrated Criminogenic Intervention Programme for High Risk Offenders
Authors: Yunfan Jiang
Abstract:
In response to an identified gap in available treatment programmes for high-risk offenders with multiple criminogenic needs, and guided by emerging literature in the field of correctional rehabilitation, the Singapore Prison Service (SPS) developed the Integrated Criminogenic Programme (ICP) in 2012. This evidence-informed psychological programme was designed to address all seven dynamic criminogenic needs (from the Central Eight) of high-risk offenders by applying concepts from rehabilitation and psychological theories such as Risk-Need-Responsivity, the Good Lives Model, narrative identity, and motivational interviewing. The programme also encompasses a 6-month community maintenance component for the purpose of providing structured step-down support in the aftercare setting. These sessions provide participants with the opportunity for knowledge reinforcement and application of skills attained in-care. A quantitative evaluation of the ICP showed that the intervention group had statistically significant improvements across time in most self-report measures of criminal attitudes, substance use attitudes, and psychosocial functioning. This was congruent with qualitative data from participants indicating that the ICP had the most impact on their criminal thinking patterns and management of behaviours in high-risk situations. Results from the comparison group showed no difference in their criminal attitudes, even though they reported statistically significant improvements across time in their substance use attitudes and some self-report measures of psychosocial functioning. The programme’s efficacy was also apparent in the lower rates of recidivism and relapse within 12 months for the intervention group. The management of staff issues arising from the development and implementation of an innovative high-intensity psychological programme such as the ICP will also be discussed.
Keywords: evaluation, forensic psychology, intervention programme, offender rehabilitation
Procedia PDF Downloads 592
26145 SIRT1 Gene Polymorphisms and Its Protein Level in Colorectal Cancer
Authors: Olfat Shaker, Miriam Wadie, Reham Ali, Ayman Yosry
Abstract:
Colorectal cancer (CRC) is a major cause of mortality and morbidity and accounts for over 9% of cancer incidence worldwide. The product of the silent information regulator 2 homolog 1 (SIRT1) gene is located in the nucleus and exerts its effects via modulation of histone and non-histone targets, functioning in the cell via histone deacetylase (HDAC) and/or adenosine diphosphate ribosyl transferase (ADPRT) enzymatic activity. The aim of this work was to study the relationship between SIRT1 polymorphisms and its protein level in colorectal cancer patients in comparison to controls. This study included two groups: thirty healthy subjects (control group) and one hundred CRC patients. For all subjects, serum SIRT1 level was measured by ELISA, and the rs12778366, rs375891 and rs3740051 polymorphisms were genotyped by real-time PCR. For CRC patients, clinical data (tumor size, site and grade, and obesity) were collected. CRC patients showed a highly significant increase in mean serum SIRT1 level compared to the control group (P<0.001). Mean serum SIRT1 level showed a significant increase in patients with tumor size ≥5 cm compared to those with tumors <5 cm (P<0.05). In CRC patients, the frequency of the T allele of rs12778366 was significantly lower than in controls, while the CC genotype and C allele of rs375891 were significantly higher than in the control group. Among CRC patients with the CC genotype of rs12778366, 75% had tumors in the rectosigmoid and 25% in the cecum and ascending colon. According to tumor size, the percentage of the CC genotype was 87.5% in tumors ≥5 cm. Conclusion: serum SIRT1 level, the T allele of rs12778366, and the C allele of rs375891 can be used as diagnostic markers for CRC patients.
Keywords: CRC, SIRT1, polymorphisms, ELISA
Procedia PDF Downloads 218
26144 Exploring Critical Thinking Skill Development in the 21st Century College Classroom: A Multi-Case Study
Authors: Kimberlyn Greene
Abstract:
Employers today expect college graduates not only to develop and demonstrate content-specific knowledge but also 21st century skillsets such as critical thinking. International assessments suggest students enrolled in United States (U.S.) educational institutions are underperforming in comparison to their global peers in areas such as critical thinking and technology. This multi-case study examined how undergraduate digital literacy courses at a four-year university in the U.S., as implemented by instructors, fostered students’ development of critical thinking skills. The conceptual framework for this study presumed that as students engaged in complex thinking within the context of a digital literacy course, their ability to deploy critical thinking was contingent upon whether the course was designed with the expectation that students use critical thinking skills, as well as on the instructor’s approach to implementing the course. Qualitative data collected from instructor interviews, classroom observations, and course documents were analyzed with an emphasis on exploring the course design and instructional methods that provided opportunities to foster critical thinking skill development. Findings from the cross-case analysis revealed that although the digital literacy courses were designed and implemented with the expectation that students would deploy critical thinking, there was no explicit support for students to develop these skills. The absence of intentional skill development resulted in inequitable opportunities for all students to engage in complex thinking. The implications of this study suggest that if critical thinking is to remain a priority, then universities must expand their support of pedagogical and instructional training for faculty regarding how to support students’ critical thinking skill development.
Keywords: critical thinking skill development, curriculum design, digital literacy, pedagogy
Procedia PDF Downloads 295
26143 A Study on Finite Element Modelling of Earth Retaining Wall Anchored by Deadman Anchor
Authors: K. S. Chai, S. H. Chan
Abstract:
In this paper, an earth retaining wall anchored by discrete deadman anchors to support excavations in sand is modelled and analysed by finite element analysis. A study is conducted to examine how a deadman anchorage system helps in reducing the deflection of the earth retaining wall. A simplified numerical model is suggested in order to reduce the simulation duration, and a comparison between 3-D and 2-D finite element analyses is illustrated.
Keywords: finite element, earth retaining wall, deadman anchor, sand
Procedia PDF Downloads 482
26142 Comparison between Classical and New Direct Torque Control Strategies of Induction Machine
Authors: Mouna Essaadi, Mohamed Khafallah, Abdallah Saad, Hamid Chaikhy
Abstract:
This paper presents a comparative analysis of conventional direct torque control (C_DTC), modified direct torque control (M_DTC) and twelve-sector direct torque control (12_DTC). These different strategies are compared by simulation in terms of torque, flux and stator current performance. Finally, a summary of the comparative analysis is presented.
Keywords: C_DTC, M_DTC, 12_DTC, torque dynamic, stator current, flux, performances
Procedia PDF Downloads 619
26141 Comparison of Adsorbents for Ammonia Removal from Mining Wastewater
Authors: F. Al-Sheikh, C. Moralejo, M. Pritzker, W. A. Anderson, A. Elkamel
Abstract:
Ammonia in mining wastewater is a significant problem, and treatment can be especially difficult in cold climates where biological treatment is not feasible. An adsorption process is one of the alternative processes that can be used to reduce ammonia concentrations to acceptable limits, and therefore a LEWATIT strongly acidic (H+ form) ion exchange resin and a Bowie chabazite (Na form) AZLB-Na zeolite were tested to assess their effectiveness. For these adsorption tests, two packed bed columns (a mini-column constructed from a 32-cm long x 1-cm diameter piece of glass tubing, and a 60-cm long x 2.5-cm diameter Ace Glass chromatography column) containing varying quantities of the adsorbents were used. A mining wastewater with an ammonia concentration of 22.7 mg/L was fed through the columns at controlled flowrates. In the experimental work, the maximum capacities of the LEWATIT ion exchange resin were 0.438, 0.448, and 1.472 mg/g for 3, 6, and 9 g, respectively, in the mini-column and 1.739 mg/g for 141.5 g in the larger Ace column, while the capacities of the AZLB-Na zeolite were 0.424 and 0.784 mg/g for 3 and 6 g, respectively, in the mini-column and 1.1636 mg/g for 38.5 g in the Ace column. In the theoretical work, the Thomas, Adams-Bohart, and Yoon-Nelson models were constructed to describe the breakthrough curve of the adsorption process and to find the constants of these models. In the regeneration tests, 5% hydrochloric acid, HCl (v/v), and 10% sodium hydroxide, NaOH (w/v), were used to regenerate the LEWATIT resin and AZLB-Na zeolite with 44% and 63.8% recovery, respectively. In conclusion, continuous flow adsorption using a LEWATIT ion exchange resin and an AZLB-Na zeolite is efficient for removal of ammonia from wastewater when using a co-flow technique. The Thomas, Adams-Bohart, and Yoon-Nelson models satisfactorily fit the data, with R² close to 1 in all cases.
Keywords: AZLB-Na zeolite, continuous adsorption, Lewatit resin, models, regeneration
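A minimal sketch of one of the breakthrough-curve models mentioned above, the Yoon-Nelson model, C/C0 = 1 / (1 + exp(k_YN (τ − t))), evaluated with illustrative parameter values rather than the constants fitted in the study:

```python
import math

def yoon_nelson(t, k_yn, tau):
    """Yoon-Nelson breakthrough curve: effluent/influent ratio C/C0 at time t.

    k_yn: rate constant (1/min)
    tau:  time required for 50% adsorbate breakthrough (min)
    """
    return 1.0 / (1.0 + math.exp(k_yn * (tau - t)))

# Hypothetical parameters: k_YN = 0.05 1/min, tau = 120 min.
# By construction C/C0 = 0.5 exactly at t = tau.
print(round(yoon_nelson(120, 0.05, 120), 3))  # 0.5
for t in (60, 180):
    print(t, round(yoon_nelson(t, 0.05, 120), 3))
```

Fitting k_YN and τ to measured effluent concentrations (e.g., by nonlinear least squares) yields the model constants the abstract refers to; τ directly estimates the 50% breakthrough time of the column.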
Procedia PDF Downloads 391