Search results for: mixed methods approach
21223 Characterization of Waste Thermocol Modified Bitumen by Spectroscopy, Microscopic Technique, and Dynamic Shear Rheometer
Authors: Supriya Mahida, Sangita, Yogesh U. Shah, Shanta Kumar
Abstract:
The global production of thermocol is increasing day by day due to its wide range of applications in many sectors. Thermocol is non-biodegradable and more toxic than plastic, which leads to a number of problems: difficult conversion into value-added products, environmental damage, and landfill problems due to its weight-to-volume ratio. Utilization of waste thermocol for the modification of bitumen binders results in waste thermocol modified bitumen (WTMB), used in road construction and maintenance technology. Modifying bituminous mixes by incorporating thermocol through a dry process is one of the new options, besides recycling, that consumes large amounts of waste thermocol. This process leads towards waste management and provides a remedy for thermocol waste disposal. The present challenge is to dispose of thermocol waste in different forms in road infrastructure, either through the dry process or through a wet process to be developed in future. This paper focuses on the use of thermocol waste mixed with VG 10 bitumen in proportions of 0.5%, 1%, 1.5%, and 2% by weight of bitumen. The physical properties of neat bitumen are evaluated and compared with those of thermocol-modified VG 10 bitumen. Empirical characterization such as penetration, softening point, and viscosity of the bitumen has been carried out. Thermocol and waste thermocol modified bitumen (WTMB) were further analyzed by Fourier Transform Infrared Spectroscopy (FT-IR), field emission scanning electron microscopy (FESEM), and Dynamic Shear Rheometer (DSR).
Keywords: DSR, FESEM, FT-IR, thermocol wastes
Procedia PDF Downloads 167
21222 Application of Universal Distribution Factors for Real-Time Complex Power Flow Calculation
Authors: Abdullah M. Alodhaiani, Yasir A. Alturki, Mohamed A. Elkady
Abstract:
Complex power flow distribution factors, which relate line complex power flows to the bus injected complex powers, have been widely used in various power system planning and analysis studies. In particular, AC distribution factors have been used extensively in recent power and energy pricing studies in the free electricity market field. As demonstrated in the existing literature, many electricity market costing studies rely on the use of distribution factors. These known distribution factors, whether injection shift factors (ISFs) or power transfer distribution factors (PTDFs), are linear approximations of the first-order sensitivities of the active power flows with respect to various variables. This paper presents a novel model for evaluating universal distribution factors (UDFs), which are appropriate for an extensive range of power system analysis and free electricity market studies. These distribution factors are used for the calculation of line complex power flows, are independent of bus power injections, and are compact matrix-form expressions with total flexibility in determining the position on the line at which line flows are measured. The proposed approach was tested on the IEEE 9-bus system. Numerical results demonstrate that the proposed approach is very accurate compared with the exact method.
Keywords: distribution factors, power system, sensitivity factors, electricity market
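For orientation, the short sketch below computes the conventional linear (DC) power transfer distribution factors that this abstract contrasts with its proposed universal distribution factors; the three-bus network, reactances, and injections are hypothetical stand-ins, and the paper's own UDF formulation is not reproduced here.

```python
# Illustrative DC PTDF computation (hypothetical 3-bus network, not the paper's UDF model).
import numpy as np

lines = [(0, 1, 0.1), (1, 2, 0.2), (0, 2, 0.25)]   # (from_bus, to_bus, reactance in p.u.)
n_bus, slack = 3, 0

Bf = np.zeros((len(lines), n_bus))      # branch susceptance matrix
Bbus = np.zeros((n_bus, n_bus))         # nodal susceptance matrix
for k, (i, j, x) in enumerate(lines):
    b = 1.0 / x
    Bf[k, i], Bf[k, j] = b, -b
    Bbus[i, i] += b
    Bbus[j, j] += b
    Bbus[i, j] -= b
    Bbus[j, i] -= b

# PTDF = Bf * inv(Bbus) with the slack bus removed (its sensitivities are zero)
keep = [bus for bus in range(n_bus) if bus != slack]
ptdf = np.zeros((len(lines), n_bus))
ptdf[:, keep] = Bf[:, keep] @ np.linalg.inv(Bbus[np.ix_(keep, keep)])

injections = np.array([0.0, 0.5, -0.5])   # per-unit active power injections per bus
print(ptdf @ injections)                  # approximate active line flows
```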
Procedia PDF Downloads 473
21221 Creating a Child Friendly Environment as a Curriculum Model for Early Years Teaching
Authors: Undiyaundeye Florence Atube, Ugar Innocent A.
Abstract:
Young children are active learners who use all their senses to build concepts and ideas from their experiences. The process of learning, its content, and its outcomes are vital for young children. They need time to explore and to be satisfied with what is learnt. Of all levels of education, early childhood education is considered the most critical for social, emotional, cognitive, and physical development. For this reason, early years teachers need to play a significant role in the teaching and learning process through the provision of a friendly environment in the school. A case study approach was used in this study. The information was gathered through various methods such as class observation, field notes, document analysis, group processes, and semi-structured interviews. The group process participants and interviewees were drawn from stakeholders such as parents, students, teachers, and head teachers from public schools to obtain a broad and comprehensive analysis; informal interaction with different stakeholders and self-reflection were used to clarify aspects of the various issues and findings. Teachers' roles in developing a child-friendly environment, in their personal capacity, were found to improve pupils' learning ability. Learning experiences and pedagogical content knowledge gained prior to early childhood education played a vital role in engaging teachers in developing their thinking and teaching practice. Children can be helped to develop independent self-control and self-reliance through careful planning and development of the child's experience, with sensitive and appropriate interaction by the educator to propel eagerness to learn through the provision of a friendly environment.
Keywords: child friendly environment, early childhood, education and development, teaching, learning and the curriculum
Procedia PDF Downloads 374
21220 Ozone Therapy for Disc Herniation: A Non-surgical Option
Authors: Shahzad Karim Bhatti
Abstract:
Background: Ozone is a form of oxygen (O₃) that can be used in the treatment of low back pain due to a herniated disc. It is a minimally invasive procedure that uses the biochemical properties of ozone, reducing disc volume and inflammation and resulting in significant pain relief. Aim: The purpose of this study was to evaluate the effectiveness of ozone therapy in combination with peri-ganglionic injection of local anesthetic and corticosteroid. Material and Methods: This retrospective study was done at the Interventional Radiology Department of Mayo Hospital, Lahore. A total of 49000 patients were included from January 2008 to March 2022. All the patients presented with clinical signs and symptoms of lumbar disc herniation, which was confirmed by an MRI scan of the lumbosacral spine. Pain reduction was calculated using the modified MacNab method. All the patients underwent percutaneous injection of ozone at a concentration of 27 micrograms/ml into the lumbar disc under fluoroscopic guidance, in combination with local anesthetic and corticosteroid in the peri-ganglionic space. Results were evaluated by two expert observers who were blinded to patient treatment. Results: A satisfactory therapeutic outcome was obtained. 55% of the patients showed complete recovery with resolution of symptoms. 20% of the patients complained of occasional episodic pain with no limitation of occupational activity. 15% of cases showed insufficient improvement. 5% of cases had insufficient improvement and went for surgery. 10% of cases never turned up after the first visit. Conclusion: Intradiscal ozone for the treatment of herniated discs has revolutionized the percutaneous approach to nerve root compression, making it safer, more economical, and easier to repeat, without any side effects, than treatments currently used in Pakistan.
Keywords: pain, prolapse, ozone, back pain
Procedia PDF Downloads 27
21219 Monetary Evaluation of Dispatching Decisions in Consideration of Choice of Transport
Authors: Marcel Schneider, Nils Nießen
Abstract:
Microscopic simulation programs enable the description of the two processes of railway operation and the preceding timetabling. Occupation conflicts are often solved based on defined train priorities on both process levels. These conflict resolutions produce knock-on delays for the trains involved. The sum of knock-on delays is commonly used to evaluate the quality of railway operations. It is either compared to an acceptable level of service, or the delays are evaluated economically by linear monetary functions. It is impossible to properly evaluate dispatching decisions without a well-founded objective function. This paper presents a new approach for the evaluation of dispatching decisions. It uses models of choice of transport and considers the behaviour of the end customers. These models evaluate the knock-on delays in more detail than linear monetary functions and consider other competing modes of transport. The new approach pursues the coupling of a microscopic model of railway operation with a macroscopic model of choice of transport. It will first be implemented for the railway operations process, but it can also be used for timetabling. The evaluation considers the possibility that end customers change over to other transport modes. The new approach first looks at rail and road transport, but it can also be extended to air transport. The split of the end customers is described by the modal split. The reactions of the end customers have an effect on the revenues of the railway undertakings. Different travel purposes have different reserves and tolerances towards delays. Longer journey times cause additional costs besides revenue changes. The costs depend either on time or on track and arise from the circulation of workers and vehicles. Only the variable values are summarised in the contribution margin, which is the basis for the monetary evaluation of the delays. The contribution margin is calculated for different resolution decisions of the same conflict. The conflict resolution is improved until the monetary loss is minimised. The iterative process therefore determines an optimum conflict resolution by observing the change in the contribution margin. Furthermore, a monetary value can also be determined for each dispatching decision.
Keywords: choice of transport, knock-on delays, monetary evaluation, railway operations
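A minimal sketch of the evaluation loop described above, with hypothetical stand-in numbers and a simplified logit-style modal-split function (neither is taken from the paper): each candidate conflict resolution yields knock-on delays, the delays shift the modal split and hence revenue, variable time-dependent costs are subtracted, and the resolution with the highest contribution margin is retained.

```python
# Hypothetical contribution-margin comparison of conflict resolutions (all values assumed).
import math

def rail_share(delay_min, beta=0.05, base_share=0.7):
    # Assumed logit-style share of customers staying with rail as delay grows.
    return base_share / (base_share + (1 - base_share) * math.exp(beta * delay_min))

def contribution_margin(delays, fare=12.0, passengers=500, cost_per_min=8.0):
    revenue = sum(fare * passengers * rail_share(d) for d in delays)
    variable_cost = sum(cost_per_min * d for d in delays)   # crew/vehicle circulation costs
    return revenue - variable_cost

# Candidate resolutions of one occupation conflict: knock-on delays per train (minutes)
candidates = {"hold train A": [6, 0, 2], "hold train B": [0, 9, 1], "reroute train B": [1, 3, 3]}
best = max(candidates, key=lambda name: contribution_margin(candidates[name]))
print(best, round(contribution_margin(candidates[best]), 2))
```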
Procedia PDF Downloads 328
21218 Harnessing the Power of Large Language Models in Orthodontics: AI-Generated Insights on Class II and Class III Orthopedic Appliances: A Cross-Sectional Study
Authors: Laiba Amin, Rashna H. Sukhia, Mubassar Fida
Abstract:
Introduction: This study evaluates the accuracy of responses from ChatGPT, Google Bard, and Microsoft Copilot regarding dentofacial orthopedic appliances. As artificial intelligence (AI) increasingly enhances various fields, including healthcare, understanding its reliability in specialized domains like orthodontics becomes crucial. By comparing the accuracy of different AI models, this study aims to shed light on their effectiveness and potential limitations in providing technical insights. Materials and Methods: A total of 110 questions focused on dentofacial orthopedic appliances were posed to each AI model. The responses were then evaluated by five experienced orthodontists using a modified 5-point Likert scale to ensure a thorough assessment of accuracy. This structured approach allowed for consistent and objective rating, facilitating a meaningful comparison between the AI systems. Results: The results revealed that Google Bard demonstrated the highest accuracy at 74%, followed by Microsoft Copilot, with an accuracy of 72.2%. In contrast, ChatGPT was found to be the least accurate, achieving only 52.2%. These results highlight significant differences in the performance of the AI models when addressing orthodontic queries. Conclusions: Our study highlights the need for caution in relying on AI for orthodontic insights. The overall accuracy of the three chatbots was 66%, with Google Bard performing best for removable Class II appliances. Microsoft Copilot was more accurate than ChatGPT, which, despite its popularity, was the least accurate. This variability emphasizes the importance of human expertise in interpreting AI-generated information. Further research is necessary to improve the reliability of AI models in specialized healthcare settings.
Keywords: artificial intelligence, large language models, orthodontics, dentofacial orthopaedic appliances, accuracy assessment
Procedia PDF Downloads 8
21217 High-Fidelity Materials Screening with a Multi-Fidelity Graph Neural Network and Semi-Supervised Learning
Authors: Akeel A. Shah, Tong Zhang
Abstract:
Computational approaches to learning the properties of materials are commonplace, motivated by the need to screen or design materials for a given application, e.g., semiconductors and energy storage. Experimental approaches can be both time-consuming and costly. Unfortunately, computational approaches such as ab-initio electronic structure calculations and classical or ab-initio molecular dynamics can themselves be too slow for the rapid evaluation of materials, which often involves thousands to hundreds of thousands of candidates. Machine learning assisted approaches have been developed to overcome the time limitations of purely physics-based approaches. These approaches, on the other hand, require large volumes of data for training (hundreds of thousands of examples on many standard data sets such as QM7b). This means that they are limited by how quickly such a large data set of physics-based simulations can be established. At high fidelity, such as configuration interaction, composite methods such as G4, and coupled cluster theory, gathering such a large data set can become infeasible, which can compromise the accuracy of the predictions; many applications require high accuracy, for example band structures and energy levels in semiconductor materials and the energetics of charge transfer in energy storage materials. In order to circumvent this problem, multi-fidelity approaches can be adopted, for example the Δ-ML method, which learns a high-fidelity output from a low-fidelity result such as Hartree-Fock or density functional theory (DFT). The general strategy is to learn a map between the low- and high-fidelity outputs, so that the high-fidelity output is obtained as a simple sum of the physics-based low-fidelity result and a learned correction. Although this requires a low-fidelity calculation, it typically requires far fewer high-fidelity results to learn the correction map; furthermore, the low-fidelity result, such as Hartree-Fock or semi-empirical ZINDO, is typically quick to obtain. For high-fidelity outputs the result can be a speed-up of an order of magnitude or more. In this work, a new multi-fidelity approach is developed, based on a graph convolutional network (GCN) combined with semi-supervised learning. The GCN allows the material or molecule to be represented as a graph, which is known to improve accuracy, as in, for example, SchNet and MEGNet. The graph incorporates information regarding the numbers, types, and properties of atoms; the types of bonds; and bond angles. The key to the accuracy of multi-fidelity methods, however, is the incorporation of the low-fidelity output to learn the high-fidelity equivalent, in this case by learning their difference. Semi-supervised learning is employed to allow for different numbers of low- and high-fidelity training points, by using an additional GCN-based low-fidelity map to predict high-fidelity outputs. It is shown on four different data sets that a significant (at least one order of magnitude) increase in accuracy is obtained, using one to two orders of magnitude fewer low- and high-fidelity training points. One of the data sets is developed in this work, pertaining to 1000 simulations of quinone molecules (up to 24 atoms) at 5 different levels of fidelity, furnishing the energy, dipole moment, and HOMO/LUMO.
Keywords: materials screening, computational materials, machine learning, multi-fidelity, graph convolutional network, semi-supervised learning
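As a point of reference for the Δ-ML strategy mentioned in the abstract, the sketch below learns a correction from low- to high-fidelity energies and adds it to the cheap result; the descriptors, energies, and kernel-ridge model are hypothetical stand-ins, not the paper's graph convolutional network or its data sets.

```python
# Minimal Δ-ML sketch: learn E_high ≈ E_low + correction(features) from few high-fidelity labels.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))           # hypothetical molecular descriptors
e_low = X @ rng.normal(size=8)          # stand-in for cheap low-fidelity energies
e_high = e_low + 0.3 * np.sin(X[:, 0])  # stand-in for expensive high-fidelity energies

# Only a small subset carries high-fidelity labels, as in the multi-fidelity setting
labelled = rng.choice(len(X), size=40, replace=False)
delta_model = KernelRidge(kernel="rbf", alpha=1e-3)
delta_model.fit(X[labelled], e_high[labelled] - e_low[labelled])

e_pred = e_low + delta_model.predict(X)  # high-fidelity estimate for all candidates
print(np.mean(np.abs(e_pred - e_high)))  # mean absolute error of the corrected prediction
```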
Procedia PDF Downloads 41
21216 Enhanced Growth and Innate Immune Response in Scylla serrata Fed Additives Containing Citrus microcarpa and Euphorbia hirta
Authors: Kaye Angelica Lacurom, Keziah Macahilo
Abstract:
One of the most important and in-demand products in the Philippines is Scylla serrata. Despite the increasing demand in the market today, the cost of feeds corresponds to 40%-50% of the entire operational cost of crab production. Raisers and suppliers are seeking alternative ways to lessen their expenses with enhancers more effective than the usual feeds. This study aimed to enhance the growth and immune system of mud crabs using natural antioxidants from plant powders that are available in the locality. There were four treatments: Diet 1: commercially available feeds as the positive control; Diet 2: 1,200 mg/kg Euphorbia hirta; Diet 3: 1,600 mg/kg of Citrus microcarpa; Diet 4: a mix of 1,400 mg/kg of Euphorbia hirta and Citrus microcarpa. Air-drying was done first, followed by grinding of the plants. The plant powders were then stored in a container and added to the given feed formulation. Mud crabs were fed twice a day for 30 days for better results. For inferential analysis, weight gain and survivability were measured, hemolymph was extracted, and the Total Hemocyte Count (THC) was determined and analyzed. Results showed that the highest THC mean (9.0 x 10⁵ ± 7.1 x 10⁴) and weight gain mean (2.9 x 10 ± 1.9 x 10) were achieved by Diet 3, with the same survivability rates as the other treatments and the positive control, while Diet 2 presented the lowest THC mean (7.2 x 10⁵ ± 3.5 x 10⁴) and weight gain mean (1.0 x 10 ± 7.0 x 10⁻¹).
Keywords: fed additives, Scylla serrata, enhanced growth, innate immune response
Procedia PDF Downloads 138
21215 DISGAN: Efficient Generative Adversarial Network-Based Method for Cyber-Intrusion Detection
Authors: Hongyu Chen, Li Jiang
Abstract:
Ubiquitous anomalies endanger the security of our systems constantly. They may bring irreversible damage to the system and cause leakage of privacy. Thus, it is of vital importance to promptly detect these anomalies. Traditional supervised methods such as Decision Trees and Support Vector Machines (SVM) are used to classify normality and abnormality. However, in some cases, abnormal statuses are far rarer than normal ones, which leads to decision bias in these methods. The generative adversarial network (GAN) has been proposed to handle this case. With its strong generative ability, it only needs to learn the distribution of the normal status and identifies the abnormal status through the gap between it and the learned distribution. Nevertheless, existing GAN-based models are not suitable for processing data with discrete values, leading to immense degradation of detection performance. To cope with discrete features, in this paper, we propose an efficient GAN-based model with a specifically designed loss function. Experimental results show that our model outperforms state-of-the-art models on discrete datasets and remarkably reduces the overhead.
Keywords: GAN, discrete feature, Wasserstein distance, multiple intermediate layers
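For context, a hedged sketch of generic GAN-based anomaly scoring (AnoGAN-style latent search with a reconstruction and critic term) is shown below; the tiny placeholder networks and the scoring scheme are illustrative assumptions and do not reproduce the DISGAN architecture or its handling of discrete features.

```python
# Illustrative AnoGAN-style anomaly score (not the DISGAN model).
import torch
import torch.nn as nn

latent_dim, feat_dim = 16, 32
# Placeholder networks: in practice these would be the GAN's trained generator and critic.
G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, feat_dim))
D = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))

def anomaly_score(x, steps=100, lam=0.1):
    """Search for the latent code whose generation best matches x; a large
    residual suggests x lies off the learned normal-data manifold."""
    z = torch.zeros(x.shape[0], latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=1e-2)
    for _ in range(steps):
        opt.zero_grad()
        residual = (G(z) - x).abs().sum(dim=1)        # reconstruction term
        critic = (D(G(z)) - D(x)).abs().sum(dim=1)    # feature-matching term
        loss = ((1 - lam) * residual + lam * critic).sum()
        loss.backward()
        opt.step()
    return (G(z) - x).abs().sum(dim=1).detach()

x = torch.randn(4, feat_dim)   # hypothetical, already-encoded samples
print(anomaly_score(x))
```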
Procedia PDF Downloads 129
21214 Student Loan Debt among Students with Disabilities
Authors: Kaycee Bills
Abstract:
This study determines whether students with disabilities have higher student loan debt payments than other student populations. The hypothesis was that students with disabilities would have significantly higher student loan debt payments than other students due to the length of time they spend in school. Using the Baccalaureate and Beyond Study Wave 2015/017 dataset, quantitative methods were employed. These data analysis methods included linear regression and a correlation matrix. Due to the exploratory nature of the study, the significance levels for the overall model and each variable were set at .05. The correlation matrix demonstrated that students with certain types of disabilities are more likely to fall into higher student loan payment brackets than students without disabilities. These results also varied among the different types of disabilities. The result of the overall linear regression model was statistically significant (p = .04). Despite the overall model being statistically significant, the majority of the significance values for the different types of disabilities were null. However, several other variables had statistically significant results, such as veterans, people of minority races, and people who attended private schools. Implications for how this impacts the economy, capitalism, and the financial wellbeing of various students are discussed.
Keywords: disability, student loan debt, higher education, social work
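A hedged sketch of the analysis pipeline named in the abstract (correlation matrix plus OLS regression at α = .05) is given below; the column names and simulated data are hypothetical and are not the Baccalaureate and Beyond variables.

```python
# Illustrative correlation-matrix and OLS workflow with made-up data and column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "loan_payment": rng.gamma(2.0, 150.0, n),     # stand-in monthly payment
    "disability": rng.integers(0, 2, n),
    "veteran": rng.integers(0, 2, n),
    "minority": rng.integers(0, 2, n),
    "private_school": rng.integers(0, 2, n),
})

print(df.corr())                                  # correlation matrix across variables
model = smf.ols("loan_payment ~ disability + veteran + minority + private_school",
                data=df).fit()
print(model.summary())                            # overall model and per-variable p-values
```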
Procedia PDF Downloads 168
21213 Utilising Sociodrama as Classroom Intervention to Develop Sensory Integration in Adolescents who Present with Mild Impaired Learning
Authors: Talita Veldsman, Elzette Fritz
Abstract:
Many children attending special education present with sensory integration difficulties that hamper their learning and behaviour. These learners can benefit from therapeutic interventions, delivered as part of their classroom curriculum, that address sensory development and allow holistic development to take place. A research study was conducted utilizing sociodrama as a therapeutic intervention in the classroom in order to develop sensory integration skills. The use of sociodrama as a therapeutic intervention proved to be a successful multi-disciplinary approach in which education and psychology could build a bridge of growth and integration. The paper describes how sociodrama was used in the classroom and how these sessions were designed. The research followed a qualitative approach and involved six Afrikaans-speaking children, aged 12-14 years, attending a special secondary school. Data collection included observations during the sessions, reflective art journals, semi-structured interviews with the teacher, and informal interviews with the adolescents. The analysis found improved self-confidence, better social relationships, sensory awareness, and self-regulation in the participants after a period of a year.
Keywords: education, sensory integration, sociodrama, classroom intervention, psychology
Procedia PDF Downloads 578
21212 Tyrosine Rich Fraction as an Immunomodulatory Agent from Ficus Religiosa Bark
Authors: S. A. Nirmal, G. S. Asane, S. C. Pal, S. C. Mandal
Abstract:
Objective: Ficus religiosa Linn (Moraceae) is used in traditional medicine to improve immunity; hence, the present work was undertaken to validate this use scientifically. Material and Methods: Dried, powdered bark of F. religiosa was extracted successively using petroleum ether and 70% ethanol in a Soxhlet extractor. The extracts obtained were screened for immunomodulatory activity by the delayed type hypersensitivity (DTH) test, the neutrophil adhesion test, and cyclophosphamide-induced neutropenia in Swiss albino mice at doses of 50 and 100 mg/kg, i.p. The 70% ethanol extract showed significant immunostimulant activity and was therefore subjected to column chromatography to produce a tyrosine rich fraction (TRF). The TRF obtained was screened for immunomodulatory activity by the above methods at a dose of 10 mg/kg, i.p. Results: TRF showed potentiation of the DTH response in terms of a significant increase in the mean difference in foot-pad thickness, and it significantly increased neutrophil adhesion to nylon fibers by 48.20%. The percentage reductions in total leukocyte count and neutrophils by TRF were found to be 43.85% and 18.72%, respectively. Conclusion: The immunostimulant activity of TRF was more pronounced, and thus it has great potential as a source for natural health products.
Keywords: Ficus religiosa, immunomodulatory, cyclophosphamide, neutropenia
Procedia PDF Downloads 446
21211 An Accurate Computation of 2D Zernike Moments via Fast Fourier Transform
Authors: Mohammed S. Al-Rawi, J. Bastos, J. Rodriguez
Abstract:
Object detection and object recognition are essential components of every computer vision system. Despite the high computational complexity and other problems related to numerical stability and accuracy, Zernike moments of 2D images (ZMs) have shown resilience when used in object recognition and have been used in various image analysis applications. In this work, we propose a novel method for computing ZMs via the Fast Fourier Transform (FFT). Notably, this is the first algorithm that can generate ZMs up to extremely high orders accurately; e.g., it can be used to generate ZMs for orders up to 1000 or even higher. Furthermore, the proposed method is simpler and faster than other methods due to the availability of FFT software and/or hardware. The accuracy and numerical stability of ZMs computed via FFT have been confirmed using the orthogonality property. We also introduce normalizing ZMs with the Neumann factor when the image is embedded in a larger grid, and color image reconstruction based on RGB normalization of the reconstructed images. Astonishingly, higher-order image reconstruction experiments show that the proposed methods are superior, both quantitatively and subjectively, compared to the q-recursive method.
Keywords: Chebyshev polynomial, Fourier transform, fast algorithms, image recognition, pseudo Zernike moments, Zernike moments
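For orientation, the sketch below evaluates a single Zernike moment directly from its definition, i.e., the baseline computation that the proposed FFT-based algorithm accelerates and stabilises; it is not the paper's FFT method, and the test image is a random placeholder.

```python
# Direct (definition-based) evaluation of one Zernike moment Z_nm on the unit disk.
import numpy as np
from math import factorial

def zernike_moment(img, n, m):
    """Z_nm of a square grayscale image mapped onto the unit disk (n - |m| must be even)."""
    N = img.shape[0]
    ys, xs = np.mgrid[0:N, 0:N]
    x = (2 * xs - N + 1) / (N - 1)            # map pixel grid to [-1, 1] x [-1, 1]
    y = (2 * ys - N + 1) / (N - 1)
    rho, theta = np.hypot(x, y), np.arctan2(y, x)
    mask = rho <= 1.0

    # Radial polynomial R_nm(rho)
    R = np.zeros_like(rho)
    for s in range((n - abs(m)) // 2 + 1):
        c = ((-1) ** s * factorial(n - s)
             / (factorial(s) * factorial((n + abs(m)) // 2 - s)
                * factorial((n - abs(m)) // 2 - s)))
        R += c * rho ** (n - 2 * s)

    V_conj = R * np.exp(-1j * m * theta)      # conjugated Zernike basis function
    pixel_area = (2.0 / (N - 1)) ** 2
    return (n + 1) / np.pi * np.sum(img[mask] * V_conj[mask]) * pixel_area

img = np.random.rand(64, 64)                  # placeholder image
print(zernike_moment(img, n=4, m=2))
```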
Procedia PDF Downloads 265
21210 Functional Connectivity Signatures of Polygenic Depression Risk in Youth
Authors: Louise Moles, Steve Riley, Sarah D. Lichenstein, Marzieh Babaeianjelodar, Robert Kohler, Annie Cheng, Corey Horien, Abigail Greene, Wenjing Luo, Jonathan Ahern, Bohan Xu, Yize Zhao, Chun Chieh Fan, R. Todd Constable, Sarah W. Yip
Abstract:
Background: Risks for depression are myriad and include both genetic and brain-based factors. However, relationships between these systems are poorly understood, limiting understanding of disease etiology, particularly at the developmental level. Methods: We use a data-driven machine learning approach, connectome-based predictive modeling (CPM), to identify functional connectivity signatures associated with polygenic risk scores for depression (DEP-PRS) among youth from the Adolescent Brain and Cognitive Development (ABCD) study across diverse brain states, i.e., during resting state, affective working memory, response inhibition, and reward processing. Results: Using 10-fold cross-validation with 100 iterations and permutation testing, CPM identified connectivity signatures of DEP-PRS across all examined brain states (rho's=0.20-0.27, p's<.001). Across brain states, DEP-PRS was positively predicted by increased connectivity between frontoparietal and salience networks, increased motor-sensory network connectivity, decreased salience-to-subcortical connectivity, and decreased subcortical-to-motor-sensory connectivity. Subsampling analyses demonstrated that model accuracies were robust across random subsamples of N=1,000, N=500, and N=250 but became unstable at N=100. Conclusions: These data, for the first time, identify neural networks of polygenic depression risk in a large sample of youth before the onset of significant clinical impairment. Identified networks may be considered potential treatment targets or vulnerability markers for depression risk.
Keywords: genetics, functional connectivity, pre-adolescents, depression
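A minimal sketch of the core CPM idea referenced above, assuming simulated connectivity matrices and a stand-in polygenic score (not the ABCD data or the authors' full pipeline with permutation testing): edges correlated with the score in the training folds are summed into a network strength and fed to a linear model under 10-fold cross-validation.

```python
# Simplified connectome-based predictive modeling (CPM) loop on simulated data.
import numpy as np
from scipy.stats import pearsonr
from sklearn.model_selection import KFold
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_subj, n_edges = 300, 1000
conn = rng.normal(size=(n_subj, n_edges))                 # hypothetical edge strengths
prs = conn[:, :20].mean(axis=1) + rng.normal(scale=0.5, size=n_subj)  # stand-in PRS

preds = np.zeros(n_subj)
for train, test in KFold(n_splits=10, shuffle=True, random_state=0).split(conn):
    r = np.array([pearsonr(conn[train, e], prs[train])[0] for e in range(n_edges)])
    selected = np.abs(r) > 0.1                            # edge-selection threshold
    strength = conn[:, selected] @ np.sign(r[selected])   # signed summed network strength
    model = LinearRegression().fit(strength[train, None], prs[train])
    preds[test] = model.predict(strength[test, None])

print(pearsonr(preds, prs)[0])    # cross-validated predictive rho
```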
Procedia PDF Downloads 58
21209 The Advantages of Using DNA-Barcoding for Determining the Fraud in Seafood
Authors: Elif Tugce Aksun Tumerkan
Abstract:
Although seafood is an important part of the human diet and a highly traded category in the international food industry, it generally remains overlooked in the global food security context. Food product authentication is of central interest, with the aim of both avoiding commercial fraud and addressing risks that might harm human health and safety. In recent years, with increasing consumer demand for transparency regarding food content, several instrumental analyses for determining food fraud have emerged, based on analytical methodologies such as proteomics and metabolomics. While fish and seafood were previously consumed fresh, with advances in technology the consumption of processed or packaged seafood has increased. After processing or packaging seafood, morphological identification is impossible once some of the external features have been removed. The main fish and seafood quality-related issues are the authentication of seafood contents, such as mislabelled products, which may be contaminated or replaced, partly or completely, by lower-quality or cheaper ones. For all of these reasons, truthful, consistent, and easily applicable analytical methods are needed to assure the correct labelling and verification of seafood products. DNA-barcoding methods have become popular, robust tools used in taxonomic research for endangered or cryptic species in recent years; they are also used for determining food traceability. Compared with proteomic and metabolomic analyses, DNA-based methods offer the chance to identify all types of food, whether raw, spiced, or processed. This advantage arises because DNA is a comparatively more stable molecule than proteins and other molecules. Furthermore, its sequence variation between species and its presence in all organisms make DNA-based analysis preferable. This review was performed to clarify the main advantages of using DNA-barcoding for determining seafood fraud among other techniques.
Keywords: DNA-barcoding, genetic analysis, food fraud, mislabelling, packaged seafood
Procedia PDF Downloads 168
21208 A Portable Cognitive Tool for Engagement Level and Activity Identification
Authors: Terry Teo, Sun Woh Lye, Yufei Li, Zainuddin Zakaria
Abstract:
Wearable devices such as electroencephalography (EEG) headsets hold immense potential for the monitoring and assessment of a person's task engagement. This is especially so in remote or online sites. Research into their use in measuring an individual's cognitive state while performing task activities is therefore expected to increase. Despite the growing number of EEG studies of a person's brain functioning activities, key challenges remain in adopting EEG for real-time operations. These include limited portability, long preparation time, high channel dimensionality, intrusiveness, and the level of accuracy in acquiring neurological data. This paper proposes an approach using 4-6 EEG channels to determine the cognitive states of a subject undertaking a set of passive and active monitoring tasks. Air traffic controller (ATC) dynamic tasks are used as a proxy. The work found that when using the channel reduction and identifier algorithm, good trend adherence of 89.1% can be obtained between a commercially available 14-channel Emotiv EPOC+ BCI EEG headset and a carefully selected reduced set of 4-6 channels. The approach can also identify different levels of engagement activities, ranging from general monitoring to ad hoc and repeated active monitoring activities involving information search, extraction, and memory activities.
Keywords: assessment, neurophysiology, monitoring, EEG
Procedia PDF Downloads 76
21207 Political Economy of Electronic News Media in Pakistan
Authors: Asad Ullah Khalid
Abstract:
This paper applies the concept of the political economy of mass media to Pakistan. The media has developed at a massive pace and is now considered one of the vital parts of good administration; it also helps in conveying issues related to the government to the public. Although Pakistani media has gained much independence since 2003, there are many social, political, and economic factors that influence its content. The study employs a triangulation of quantitative and qualitative methods. In terms of methods, both content analysis and interviews are used. The content of Pakistani media is analyzed quantitatively and qualitatively. Moreover, interviews with various journalists were conducted, and their findings are disclosed in this paper. Pakistan's communication landscape is neither well documented nor well understood, leaving its public off guard with regard to reviewing the role and impact of news inflow, correspondence, and media in political, economic, and social life. It has been found that on particular issues some media channels have strong affiliations with certain political parties; moreover, reporting and coverage have also been affected by factors like terrorism, state policies (written and verbal), advertising/economic factors, and demographic factors like the composition of the population.
Keywords: political economy, news media, Pakistan, electronic news media, journalism, mass media
Procedia PDF Downloads 331
21206 Corrosion Resistance Performance of Epoxy/Polyamidoamine Coating Due to Incorporation of Nano Aluminium Powder
Authors: Asiful Hossain Seikh, Mohammad Asif Alam, Ubair Abdus Samad, Jabair A. Mohammed, S. M. Al-Zahrani, El-Sayed M. Sherif
Abstract:
In the current investigation, an aliphatic amine-cured diglycidyl ether of bisphenol-A (DGEBA) based epoxy coating was mixed with a certain weight % of polyamidoamine hardener (1:2) and coated on carbon steel panels with and without 1% nanocrystalline Al powder. The corrosion behavior of the coated samples was investigated by exposing them in a salt spray chamber for 500 hours. According to ASTM B-117, the bath was kept at 35 °C and a mist containing 5% NaCl was sprayed at 1.3 bar pressure. The composition of the coatings was confirmed using Fourier-transform infrared spectroscopy (FTIR). Electrochemical characterization of the coated samples was also performed using the potentiodynamic polarization technique and electrochemical impedance spectroscopy (EIS). All the experiments were done in 3.5% NaCl solution. The nano Al coated sample shows good corrosion resistance compared to the sample without nano Al. In fact, after salt spray exposure no pitting or local damage was observed for the nano-coated sample, and the coating gloss was negligibly affected. The surface morphology of coated and corroded samples was studied using scanning electron microscopy (SEM).
Keywords: epoxy, nano aluminium, potentiodynamic polarization, salt spray, electrochemical impedance spectroscopy
Procedia PDF Downloads 169
21205 Development of E-Tendering Models for Nigerian Public Procuring Entities
Authors: Bello Abdullahi, Kabir Bala, Yahaya M. Ibrahim, Ahmed D. Ibrahim
Abstract:
Public sector tendering has traditionally been conducted using manual paper-based processes, which are known to be inefficient, less transparent, and more prone to manipulation and errors. However, the advent of the Internet and its associated technologies has led to the development of numerous e-Tendering systems that address many of the problems associated with the manual paper-based tendering system. Currently, in Nigeria, public tendering processes are largely conducted with a manual paper-based system that is bedevilled by a number of problems, such as inordinate delays, inefficiencies, manipulation of the tender evaluation process, corruption, and lack of transparency and competition, among others. These problems can be addressed through the adoption of existing web-based e-Tendering systems, which are known to address most of them. However, the existing e-Tendering systems that have been developed are not based on the Nigerian legal procurement processes, and as such their suitability for local application is very limited. This paper is part of a larger study that attempts to address this problem through the development of an e-Tendering system based on the requirements of Nigerian public procuring entities. In this paper, the identified tendering processes commonly used by Nigerian public procuring entities in the selection of construction sources are presented. A multi-method research approach was used to identify those tendering processes. Specifically, 19 existing business use cases used by Nigerian public procuring entities were identified, and 61 system use cases were prescribed based on the identified business use cases. The use cases were used as the basis for the development of domain and software conceptual models. The models were successfully used to guide the development of an e-Tendering system called NPS-eTender. Ripple and the Unified Process were adopted as the software development methodologies.
Keywords: e-tendering, e-procurement, requirement model, conceptual model, public sector tendering, public procurement
Procedia PDF Downloads 195
21204 Evaluation of Diagnostic Values of Culture, Rapid Urease Test, and Histopathology in the Diagnosis of Helicobacter pylori Infection and in vitro Effects of Various Antimicrobials against Helicobacter pylori
Authors: Recep Kesli, Huseyin Bilgin, Yasar Unlu, Gokhan Gungor
Abstract:
Aim: The aim of this study was to investigate the presence of Helicobacter pylori (H. pylori) infection by culture, histology, and the rapid urease test (RUT) in gastric antrum biopsy samples taken from patients presenting with dyspeptic complaints, and to determine the resistance rates of the H. pylori strains to amoxicillin, clarithromycin, levofloxacin, and metronidazole by E-test. Material and Methods: A total of 278 patients who were admitted to Konya Education and Research Hospital, Department of Gastroenterology, with dyspeptic complaints between January 2011 and July 2013 were included in the study. Microbiological and histopathological examinations of biopsy specimens taken from the antrum and corpus regions were performed. The presence of H. pylori in biopsy samples was investigated by culture (Portagerm pylori-PORT PYL, Pylori agar-PYL, GENbox microaer, bioMerieux, France), histology (Giemsa, Hematoxylin and Eosin staining), and RUT (CLOtest, Cimberly-Clark, USA). Antimicrobial resistance of the isolates against amoxicillin, clarithromycin, levofloxacin, and metronidazole was determined by the E-test method (bioMerieux, France). As the gold standard in the diagnosis of H. pylori, it was accepted that either the culture method alone was positive or both histology and RUT were positive together. Sensitivity and specificity for histology and RUT were calculated by taking the culture as the gold standard. Sensitivity and specificity for culture were also calculated by taking the co-positivity of both histology and RUT as the gold standard. Results: H. pylori was detected in 140 of 278 patients by culture and 174 of 278 patients by histology. H. pylori positivity was also found in 191 patients by RUT. According to the gold standard criteria, a false negative result was found in 39 cases by the culture method, 17 cases by histology, and 8 cases by RUT. The sensitivity and specificity of the culture, histology, and RUT methods were 76.5% and 88.3%, 87.8% and 63%, and 94.2% and 57.2%, respectively. Antibiotic resistance was investigated by E-test in 140 H. pylori strains isolated from culture. The resistance rates of the H. pylori strains to amoxicillin, clarithromycin, levofloxacin, and metronidazole were detected as 9 (6.4%), 22 (15.7%), 17 (12.1%), and 57 (40.7%), respectively. Conclusion: In our study, RUT was found to be the most sensitive and culture the most specific test among the culture, histology, and RUT methods. Although we detected a high specificity for the culture method, its sensitivity was found to be quite low compared to the other methods. The low sensitivity of H. pylori culture may be caused by factors that affect the chances of direct isolation, such as spoiled specimens, the fastidious nature of the microorganism, clinical sample retrieval, and transport conditions.
Keywords: antimicrobial resistance, culture, histology, H. pylori, RUT
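For readers unfamiliar with the metrics reported above, a small illustrative calculation of sensitivity and specificity from a 2x2 table against a gold standard is shown below; the counts are hypothetical and are not the study's raw data.

```python
# Sensitivity/specificity of an index test against a gold standard (hypothetical counts).
def sensitivity_specificity(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)    # true positives among gold-standard positives
    specificity = tn / (tn + fp)    # true negatives among gold-standard negatives
    return sensitivity, specificity

# Example: a test that misses 8 of 148 positives and wrongly flags 17 of 130 negatives
sens, spec = sensitivity_specificity(tp=140, fp=17, fn=8, tn=113)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```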
Procedia PDF Downloads 163
21203 Application of Electrical Resistivity, Induced Polarization and Statistical Methods in Chichak Iron Deposit Exploration
Authors: Shahrzad Maghsoodi, Hamid Reza Ranazi
Abstract:
This paper is devoted to the exploration of the Chichak (hematite) deposit using electrical resistivity, chargeability, and statistical methods. The Chichak hematite deposit is located in the Chichak area, West Azarbaijan, northwest Iran. There are some outcrops of hematite bodies in the area. The goal of this study was to identify the depth, thickness, and shape of these bodies and to explore other probable hematite bodies. Therefore, nine profiles were surveyed by the RS and IP methods, utilizing an innovative electrode array called CRSP (Combined Resistivity Sounding and Profiling). IP and RS sections were completed along each profile. In addition, the RS and IP data were analyzed, and the relation between these two variables was determined by statistical tools. Finally, hematite bodies were identified in each of the sections. The results showed that the hematite bodies have a resistivity lower than 125 Ωm and very low chargeability, lower than 8 mV/V. After the geophysical study, some points were proposed for drilling; the results obtained from drilling confirm the geophysical results.
Keywords: hematite deposit, iron exploration, electrical resistivity, chargeability, Iran, Chichak, statistical, CRSP electrode array
Procedia PDF Downloads 78
21202 Using Geopolymer Technology on Stabilization and Reutilization the Expansion Behavior Slag
Authors: W. H. Lee, T. W. Cheng, K. Y. Lin, S. W. Huang, Y. C. Ding
Abstract:
Basic oxygen furnace (BOF) slag and electric arc furnace (EAF) slag are by-products of iron making and steel making. Each type of slag is produced at over 100 million tons annually in Taiwan. These slags have great engineering properties, such as high hardness and density, high compressive strength, and a low abrasion ratio, and can replace natural aggregate in building materials. However, both BOF and EAF slag have an expansion problem because they contain free lime. The purpose of this study was to stabilize BOF and EAF slag by using geopolymer technology, in the hope of preventing and solving the expansion problem. The experimental results showed that geopolymer technology can successfully solve and prevent the expansion problem. The main properties of the slags are analyzed with regard to their use as building materials. An autoclave is used to study the volume stability of the specimens. Finally, the compressive strength of geopolymer mortar with BOF/EAF slag can reach over 21 MPa after curing for 28 days. After autoclave testing, the volume expansion does not exceed 0.2%. Even after the autoclave test, the compressive strength can grow to over 35 MPa. These results were successfully applied at a ready-mixed concrete plant, with the same experimental results as at laboratory scale. These results gave encouragement that stabilized and reutilized BOF/EAF slag could serve as a feasible replacement for natural fine aggregate by using geopolymer technology.
Keywords: BOF slag, EAF slag, autoclave test, geopolymer
Procedia PDF Downloads 133
21201 Hydraulic Optimization of an Adjustable Spiral-Shaped Evaporator
Authors: Matthias Feiner, Francisco Javier Fernández García, Michael Arneman, Martin Kipfmüller
Abstract:
To ensure reliability in miniaturized devices or processes with increased heat fluxes, very efficient cooling methods have to be employed in order to cope with small available cooling surfaces. To address this problem, a certain type of evaporator/heat exchanger was developed: it is called a swirl evaporator due to its flow characteristic. The swirl evaporator consists of a concentrically eroded screw geometry in which a capillary tube is guided; the assembly is inserted into a pocket hole in components with a high heat load. The inner diameter of the evaporator is between one and three millimeters. The liquid refrigerant R32 is sprayed through the capillary tube, aligned in the center of the bore hole, onto the end face of the blind hole and is sucked off against the injection direction in the screw geometry. Because the refrigerant is sucked off in a helical geometry (twisted flow), it is accelerated against the hot wall (centrifugal acceleration). This results in an increase in the critical heat flux of up to 40%. In this way, more heat can be dissipated on the same surface/available installation space. This enables a wide range of technical applications. To optimize the design for the needs of various fields of industry, such as internal tool cooling when machining nickel-base alloys like Inconel 718, a correlation-based model of the swirl evaporator was developed. The model is separated into three subgroups with five regimes overall. The pressure drop and heat transfer are calculated separately. An approach to determine the location of phase change in the capillary and the swirl was implemented. A test stand has been developed to verify the simulation.
Keywords: helically-shaped, oil-free, R-32, swirl-evaporator, twist-flow
Procedia PDF Downloads 108
21200 Play-Based Intervention Training Program for Daycare Workers Attending to Children with Autism
Authors: Raymond E. Raguindin
Abstract:
Objective: This research studied the improvement in daycare workers' teaching of imitation, joint attention, and language activities using a play-based early intervention training program in Cabanatuan City, Nueva Ecija. Methods: Focus group discussions were developed to explore the attitudes, beliefs, and practices of daycare workers. Results: Findings of the study revealed that daycare workers have existing knowledge and experience in teaching children with autism. Their workshops on managing inappropriate behaviors of children with autism resulted in a generally positive perception of accepting and teaching children with autism in daycare centers. Play-based activities were modelled and participated in by daycare workers. These included demonstration, modelling, prompting, and providing social reinforcers as rewards. Five lectures and five training days were conducted to implement the training program. Daycare workers' levels of skill in teaching imitation, joint attention, and language were gathered before and after participation in the training program. Findings suggest significant differences between pre-test and post-test scores. The workers showed significant improvement in facilitating imitation, joint attention, and language in children with autism after the play-based early intervention training. They were able to initiate and sustain imitation, joint attention, and language activities with adequate knowledge and confidence. Conclusions: 1. Existing attitudes and beliefs greatly influenced the positive delivery mode of instruction. 2. A teacher-directed approach to improve the attention, imitation, joint attention, and language of children with autism can be acquired by daycare workers. 3. Teaching skills and experience can be used as a reference and basis for identifying future training needs.
Keywords: early intervention, imitation, joint attention, language
Procedia PDF Downloads 120
21199 Main Tendencies of Youth Unemployment and the Regulation Mechanisms for Decreasing Its Rate in Georgia
Authors: Nino Paresashvili, Nino Abesadze
Abstract:
The modern world faces huge challenges. Globalization has changed the socio-economic conditions of many countries. The current processes in the global environment have a different impact on countries with different cultures. However, the alleviation of poverty and the improvement of living conditions remain basic challenges for the majority of countries, because much of the population still lives under the official threshold of poverty. It is very important to stimulate youth employment. In order to prepare young people for the labour market, it is essential to provide them with the appropriate professional skills and knowledge. It is necessary to plan efficient activities for decreasing the unemployment rate and for developing effective mechanisms for the regulation of the labour market. Such planning requires thorough study and analysis of the existing reality, as well as the development of corresponding mechanisms. Statistical analysis of unemployment is one of the main platforms for the regulation of the labour market's key mechanisms. The corresponding statistical methods should be used in the study process; such methods are observation, data gathering, grouping, and the calculation of generalized indicators. Unemployment is one of the most severe socioeconomic problems in Georgia. According to past as well as current statistics, unemployment rates have always been the most problematic issue for policy makers to resolve. Analytical work on the above-mentioned problem will be the basis for the next sustainable steps toward solving it. The results of the study showed that the choices of young people are often driven neither by their inclinations and interests nor by labour market demand. That is why the wrong professional orientation of young people in most cases leads to their unemployment. At the same time, it was shown that there are a number of professions in the labour market with high demand because of the deficit of appropriate specialists. To achieve healthy competitiveness in youth employment, it is necessary to formulate regional employment programs that take regional infrastructure specifications into account.
Keywords: unemployment, analysis, methods, tendencies, regulation mechanisms
Procedia PDF Downloads 378
21198 Dental Education in Brazil: A Systematic Literature Review
Authors: Fabiane Alves Farias Guimarães, Rodrigo Otávio Moretti-Pires, Ana Lúcia Schaefer Ferreira de Mello
Abstract:
Introduction: Considering recent changes in the Brazilian health and higher education systems, the production of scientific knowledge regarding dental education and training has been increasing. The National Curriculum Guidelines for undergraduate courses in Dentistry established in 2002 the principles and procedures for forming a more generalist dental professional profile. Objectives: To perform a systematic review of the Brazilian scientific literature about dental education and training. Methods: The systematic review was conducted using the Lilacs (Latin American Literature in Health Sciences) and SciELO (Scientific Electronic Library Online) databases, using the combination of keywords dentistry, education, teaching or training. Original research articles, published between 2010 and 2013 in Portuguese, were selected. Results: Based on the selection criteria, 23 articles were found. In order to organize the outcomes, the analysis was separated into three themes: ethical aspects of education (3 articles), integrating dental services with training (10 articles), and dental education and the Brazilian curriculum guidelines (10 articles). Most of the studies were published between 2011 and 2012 (35% each) and were conducted in public universities. The studied populations included dental students, teachers, university directors, health managers, and dentists. A qualitative methodological approach was predominant. Conclusion: It was possible to identify a period of transition in Brazilian undergraduate courses in Dentistry after the curricular changes. The published literature shows some advances, such as the incorporation of ethical values in dental education and the inclusion of new practice environments for students by integrating education and training in diversified dental service scenarios.
Keywords: teaching, dental students, human resources in dentistry
Procedia PDF Downloads 532
21197 The Significance of Picture Mining in the Fashion and Design as a New Research Method
Authors: Katsue Edo, Yu Hiroi
Abstract:
Increasing attention has been paid to the use of pictures and photographs in research in the social sciences since the beginning of the 21st century. Meanwhile, we have been studying the usefulness of Picture Mining, which is one of the new approaches to such picture-based research. Picture Mining is an explorative research analysis method that extracts useful information from pictures, photographs, and static or moving images. It is often compared with the methods of text mining. The Picture Mining concept includes observational research in the broad sense, because it also aims to analyze moving images (Ochihara and Edo 2013). In the recent literature, studies and reports using pictures are increasing due to environmental changes, identified as technological and social changes (Edo et al. 2013). Low-priced digital cameras and iPhones, high information transmission speed, low costs for information transfer, and the high performance and resolution of mobile phone cameras have changed people's photographing behavior. Consequently, there is less resistance to taking and processing photographs for most people in developing countries. In these studies, this method of collecting data from respondents is often called 'participant-generated photography' or 'respondent-generated visual imagery', which focuses on the collection of data and its analysis (Pauwels 2011, Snyder 2012). However, there are few systematic and conceptual studies that support the significance of these methods. In recent years we have worked to conceptualize these picture-based research methods and formalize theoretical findings (Edo et al. 2014). We have identified the most efficient fields of Picture Mining inductively and through case studies in the following areas: 1) research in consumer and customer lifestyles; 2) new product development; 3) research in fashion and design. Though we have found that it will be useful in these fields and areas, we must verify these assumptions. In this study we focus on the field of fashion and design, to determine whether Picture Mining methods are really reliable in this area. In order to do so, we conducted empirical research on respondents' attitudes and behavior concerning pictures and photographs. We compared attitudes and behavior toward pictures of fashion with those toward pictures of meals, and found that taking pictures of fashion is not as easy as taking pictures of meals and food. Respondents do not often take pictures of fashion or upload them online, for example to Facebook and Instagram, compared to meals and food, because of the difficulty of taking them. We conclude that we should be more careful in analyzing pictures in the fashion area, for there might still be some kind of bias, even though the environment for pictures has drastically changed in recent years.
Keywords: empirical research, fashion and design, Picture Mining, qualitative research
Procedia PDF Downloads 363
21196 Training of Future Computer Science Teachers Based on Machine Learning Methods
Authors: Meruert Serik, Nassipzhan Duisegaliyeva, Danara Tleumagambetova
Abstract:
The article highlights and describes the characteristic features of real-time face detection in images and videos using machine learning algorithms. Students of the educational programs "6B01511-Computer Science", "7M01511-Computer Science", "7M01525-STEM Education", and "8D01511-Computer Science" of the L.N. Gumilyov Eurasian National University reviewed the research work. As a result, the advantages and disadvantages of the Haar Cascade (Haar Cascade OpenCV), HoG SVM (Histogram of Oriented Gradients, Support Vector Machine), and MMOD CNN Dlib (Max-Margin Object Detection, convolutional neural network) detectors used for face detection were determined. Dlib is a general-purpose cross-platform software library written in the C++ programming language; it includes detectors used for face detection. The Haar Cascade OpenCV algorithm is efficient for fast face detection. The considered work forms the basis for the development of machine learning methods by future computer science teachers.
Keywords: algorithm, artificial intelligence, education, machine learning
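A short sketch comparing the three detector families named in the abstract on a single image is given below; it assumes opencv-python and dlib are installed, the input image path is hypothetical, and the MMOD model file (mmod_human_face_detector.dat) has been downloaded separately.

```python
# Compare Haar cascade, HoG+SVM, and MMOD CNN face detectors on one image.
import cv2
import dlib

img = cv2.imread("face.jpg")                      # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# 1) Haar cascade (OpenCV): fast, classical boosted cascade
haar = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
haar_boxes = haar.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# 2) HoG + SVM (dlib): sliding-window histogram-of-oriented-gradients detector
hog_detector = dlib.get_frontal_face_detector()
hog_boxes = hog_detector(gray, 1)                 # 1 = upsample the image once

# 3) MMOD CNN (dlib): max-margin CNN detector, more accurate but slower on CPU;
#    requires the separately downloaded model file.
cnn_detector = dlib.cnn_face_detection_model_v1("mmod_human_face_detector.dat")
cnn_boxes = cnn_detector(cv2.cvtColor(img, cv2.COLOR_BGR2RGB), 1)

print(len(haar_boxes), len(hog_boxes), len(cnn_boxes))
```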
Procedia PDF Downloads 73
21195 In Vitro Studies on Antimicrobial Activities of Lactic Acid Bacteria Isolated from Fresh Fruits for Biocontrol of Pathogens
Authors: Okolie Pius Ifeanyi, Emerenini Emilymary Chima
Abstract:
Aims: The study investigated the diversity and identities of lactic acid bacteria (LAB) isolated from different fresh fruits using molecular nested PCR analysis, and the efficacy of cell-free supernatants from the isolated LAB for the in vitro control of some tomato pathogens. Study Design: A nested PCR approach was used in this study, employing universal 16S rRNA gene primers in the first-round PCR and LAB-specific primers in the second-round PCR, with the view of generating specific nested PCR products for the LAB diversity present in the samples. The inhibitory potential of supernatants obtained from the molecularly characterized LAB isolates of fruit origin was investigated against some tomato phytopathogens using the agar-well method, with the view to developing biological agents against some tomato disease-causing organisms. Methodology: Gram-positive, catalase-negative strains of LAB were isolated from fresh fruits on Man Rogosa and Sharpe agar (Lab M) using the streaking method. The isolates obtained were molecularly characterized using a genomic DNA extraction kit (Norgen Biotek, Canada). Standard methods were used for nested polymerase chain reaction (PCR) amplification targeting the 16S rRNA gene using universal 16S rRNA gene and LAB-specific primers, agarose gel electrophoresis, and purification and sequencing of the generated nested PCR products (Macrogen Inc., USA). The partial sequences obtained were identified by BLAST searches against the non-redundant nucleotide database of the National Center for Biotechnology Information (NCBI). The antimicrobial activities of the characterized LAB against some tomato phytopathogenic bacteria, which include Xanthomonas campestris, Erwinia carotovora, and Pseudomonas syringae, were determined using the agar well diffusion method. Results: The partial sequences obtained were deposited in the database of the National Center for Biotechnology Information (NCBI). The isolates were identified based upon the sequences as Weissella cibaria (4, 18.18%), Weissella confusa (3, 13.64%), Leuconostoc paramesenteroides (1, 4.55%), Lactobacillus plantarum (8, 36.36%), Lactobacillus paraplantarum (1, 4.55%), and Lactobacillus pentosus (1, 4.55%). The cell-free supernatants of LAB of fresh fruit origin (Weissella cibaria, Weissella confusa, Leuconostoc paramesenteroides, Lactobacillus plantarum, Lactobacillus paraplantarum, and Lactobacillus pentosus) can inhibit these bacteria by creating clear zones of inhibition around the wells containing the cell-free supernatants of the above-mentioned strains of lactic acid bacteria. Conclusion: This study shows that LAB can potentially be quickly characterized to species level by molecular methods through nested PCR analysis of the bacterial isolate's genomic DNA using universal 16S rRNA primers and LAB-specific primers. Tomato disease-causing organisms can most likely be biologically controlled using extracts from LAB. This finding would reduce the potential hazard from the use of chemical herbicides on plants.
Keywords: nested PCR, molecular characterization, 16S rRNA gene, lactic acid bacteria
Procedia PDF Downloads 414
21194 A Collaborative Learning Model in Engineering Science Based on a Cyber-Physical Production Line
Authors: Yosr Ghozzi
Abstract:
The Cyber-Physical Systems terminology has been well received by the industrial community and specifically appropriated in educational settings. Indeed, our latest educational activities are based on the development of experimental platforms on an industrial scale. In fact, we built a collaborative learning model following an international market study that led us to place ourselves at the heart of this technology. To align with these findings, a competency-based approach study was conducted, and program content was revised to reflect the project-based approach. Thus, this article deals with the development of educational devices according to a generated curriculum and specific educational activities, while respecting the repository of skills adopted from what constitutes educational cyber-physical production systems and the laboratories that are compliant with and adapted to them. The implementation of these platforms was systematically carried out in the school's workshop spaces. The objective has been twofold, covering both research and teaching for the students in mechatronics and logistics of the electromechanical department. We act as trainers and industrial experts to involve students in the implementation of possible extension systems around multidisciplinary projects and to reconnect with industrial projects for better professional integration.
Keywords: education 4.0, competency-based learning, teaching factory, project-based learning, cyber-physical systems, industry 4.0
Procedia PDF Downloads 107