Search results for: remote procedure call
3073 Experimental Study of Impregnated Diamond Bit Wear During Sharpening
Authors: Rui Huang, Thomas Richard, Masood Mostofi
Abstract:
The lifetime of impregnated diamond bits and their drilling efficiency are in part governed by the bit wear conditions, not only the extent of the diamonds’ wear but also their exposure or protrusion out of the matrix bonding. As the individual diamonds wear, the bonding matrix also wears through two-body abrasion (direct matrix-rock contact) and three-body erosion (cuttings trapped in the space between rock and matrix). Although there is some work dedicated to the study of diamond bit wear, there is still a lack of understanding of how matrix erosion and diamond exposure relate to the bit drilling response and drilling efficiency, and there is no literature on the process that governs bit sharpening, a procedure commonly implemented by drillers when the extent of diamond polishing yields an extremely low rate of penetration. The aim of this research is (i) to derive a correlation between the wear state of the bit and the drilling performance and (ii) to gain a better understanding of the process associated with tool sharpening. The research effort combines specific drilling experiments and precise mapping of the tool-cutting face (impregnated diamond bits and segments). Bit wear is produced by drilling through a rock sample at a fixed rate of penetration for a given period of time. Before and after each wear test, the bit drilling response, and thus its efficiency, is mapped out using a tailored experimental protocol. After each drilling test, the bit or segment cutting face is scanned with an optical microscope. The test results show that, under the fixed rate of penetration, diamond exposure increases with drilling distance but at a decreasing rate, up to a threshold exposure that corresponds to the optimum drilling condition for this feed rate. The data further show that the threshold exposure scales with the rate of penetration up to a point where exposure reaches a maximum, beyond which no more matrix can be eroded under normal drilling conditions. The second phase of this research focuses on the wear process referred to as bit sharpening. Drillers rely on different approaches (increasing the feed rate or decreasing the flow rate) with the aim of tearing worn diamonds away from the bit matrix, wearing out some of the matrix, and thus exposing fresh sharp diamonds and recovering a higher rate of penetration. Although it is a common procedure, there is no rigorous methodology to sharpen the bit and avoid excessive wear or bit damage. This paper aims to gain some insight into the mechanisms that accompany bit sharpening by carefully tracking diamond fracturing, matrix wear, and erosion and how they relate to the drilling parameters recorded while sharpening the tool. The results show that there exist optimal conditions (operating parameters and duration of the procedure) for sharpening that minimize overall bit wear and that the extent of bit sharpening can be monitored in real-time.
Keywords: bit sharpening, diamond exposure, drilling response, impregnated diamond bit, matrix erosion, wear rate
Procedia PDF Downloads 99
3072 A Study of Influence of Freezing on Mechanical Properties of Tendon Fascicles
Authors: Martyna Ekiert, Andrzej Mlyniec
Abstract:
Tendons are biological structures whose primary function is to transfer force generated by muscles to the bones. Unfortunately, tendon damage is also one of the most common injuries of the human musculoskeletal system. For the most severe cases of tendon rupture, such as a tear of the calcaneal tendon or the anterior cruciate ligament of the knee, a surgical procedure is the only possible way to full recovery. Tendons used as biological grafts are usually subjected to a process of deep freezing and subsequent thawing. This, in particular for multiple freezing/thawing cycles, may result in changes in the tendon's internal structure, causing deterioration of the mechanical properties of the tissue. Therefore, studies on the influence of freezing on tendon biomechanics, including the internal water content of soft tissue, seem to be greatly needed. An experimental study of the influence of freezing on the mechanical properties of the tendon was performed on fascicle samples dissected from bovine flexor tendons. The preparation procedure was performed in the presence of 0.9% saline solution in order to prevent excessive tissue drying. All prepared samples were subjected to different numbers of freezing/thawing cycles. For the freezing part of the protocol we used a temperature of -80°C, while for slow thawing we used fridge temperature (4°C) combined with equalizing temperatures in the standard state (25°C). After final thawing, the mechanical properties of each sample were examined using a cyclic loading test. Our results may contribute to a better understanding of the negative effects of soft tissue freezing resulting from the abnormal thermal expansion of water. This may also help to determine the limit of freezing/thawing cycles disqualifying tissue for surgical purposes and thus help optimize tissue storage conditions.
Keywords: freezing, soft tissue, tendon, bovine fascicles
Procedia PDF Downloads 219
3071 Source Separation for Global Multispectral Satellite Images Indexing
Authors: Aymen Bouzid, Jihen Ben Smida
Abstract:
In this paper, we aim to demonstrate the importance of applying blind source separation methods to remote sensing data in order to index multispectral images. The proposed method starts with Gabor filtering and the application of a blind source separation to get a more effective representation of the information contained in the observed images. After that, a feature vector is extracted from each image in order to index them. Experimental results show the superior performance of this approach.
Keywords: blind source separation, content based image retrieval, feature extraction multispectral, satellite images
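A minimal sketch of how such a Gabor-filtering-plus-separation indexing chain could look in Python is given below; the filter frequencies, the number of separated sources, and the summary statistics used as the index are illustrative assumptions rather than the authors' configuration.

```python
# Sketch: Gabor filtering followed by blind source separation (FastICA) to
# build a per-image feature vector for indexing. Parameters are assumptions.
import numpy as np
from skimage.filters import gabor
from sklearn.decomposition import FastICA

def index_features(image, frequencies=(0.1, 0.3), n_sources=3):
    rows, cols, bands = image.shape

    # 1. Gabor filtering: texture responses per band and frequency.
    responses = []
    for b in range(bands):
        for f in frequencies:
            real, _ = gabor(image[:, :, b], frequency=f)
            responses.append(real.ravel())
    mixed = np.stack(responses, axis=1)          # (pixels, bands * frequencies)

    # 2. Blind source separation: unmix the filtered observations.
    ica = FastICA(n_components=n_sources, random_state=0)
    sources = ica.fit_transform(mixed)           # (pixels, n_sources)

    # 3. Feature vector for indexing: simple per-source statistics.
    return np.concatenate([sources.mean(axis=0), sources.std(axis=0)])

# Example: a random 64x64 image with 4 spectral bands.
feature_vector = index_features(np.random.rand(64, 64, 4))
print(feature_vector.shape)   # (6,) -> two statistics per separated source
```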
Procedia PDF Downloads 403
3070 Using the Cluster Computing to Improve the Computational Speed of the Modular Exponentiation in RSA Cryptography System
Authors: Te-Jen Chang, Ping-Sheng Huang, Shan-Ten Cheng, Chih-Lin Lin, I-Hui Pan, Tsung-Hsien Lin
Abstract:
The RSA system is a great contribution to encryption and decryption. It is based on modular exponentiation, and we may describe it as a system of "large-number calculation". Operating on such large numbers is a very heavy burden for the CPU. To increase the computational speed, in addition to improving the algorithms themselves, such as the binary method, the sliding window method, the addition chain method, and so on, a computer cluster can be used. The cluster system is composed of laboratory computers on which MPICH2 is installed. The parallel procedures of the modular exponentiation can be processed by combining the sliding window method with the addition chain method. This significantly reduces the computational time of modular exponentiation for operands of more than 512 bits and even more than 1024 bits.
Keywords: cluster system, modular exponentiation, sliding window, addition chain
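For reference, the sketch below shows the sequential sliding-window part of such a scheme in pure Python; the window width is an assumed parameter, and the addition-chain precomputation and the MPICH2-based distribution across cluster nodes are not reproduced here.

```python
def sliding_window_pow(base, exponent, modulus, w=4):
    """Left-to-right sliding-window modular exponentiation (window width w is an assumption)."""
    if exponent == 0:
        return 1 % modulus
    # Precompute the odd powers base^1, base^3, ..., base^(2^w - 1) modulo modulus.
    base %= modulus
    base_sq = (base * base) % modulus
    table = {1: base}
    for i in range(3, 1 << w, 2):
        table[i] = (table[i - 2] * base_sq) % modulus

    bits = bin(exponent)[2:]
    result, i, n = 1, 0, len(bits)
    while i < n:
        if bits[i] == '0':
            result = (result * result) % modulus      # plain square for a 0-bit
            i += 1
        else:
            # Take the longest window of length <= w that ends in a 1-bit.
            j = min(i + w, n)
            while bits[j - 1] == '0':
                j -= 1
            window_value = int(bits[i:j], 2)
            for _ in range(j - i):
                result = (result * result) % modulus
            result = (result * table[window_value]) % modulus
            i = j
    return result

# Sanity check against Python's built-in three-argument pow().
assert sliding_window_pow(7, 65537, 999999937) == pow(7, 65537, 999999937)
```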
Procedia PDF Downloads 522
3069 Remote Sensing Application in Environmental Researches: Case Study of Iran Mangrove Forests Quantitative Assessment
Authors: Neda Orak, Mostafa Zarei
Abstract:
Environmental assessment is an important step in environmental management, and various methods and techniques have been produced and implemented for it. Remote sensing (RS) is widely used in many scientific and research fields such as geology, cartography, geography, agriculture, forestry, land use planning, environment, etc. It can show cyclical changes of earth surface objects, and it can delineate the limits of earth phenomena on the basis of recorded changes and deviations in electromagnetic reflectance. This research assessed mangrove forests by RS techniques; its aim was the quantitative analysis of mangrove forests in the Basatin and Bidkhoon estuaries. It was carried out with Landsat satellite images from 1975-2013 matched to ground control points. This part of the mangroves is the last distribution in the northern hemisphere, so the work can provide a good background for better management of this important ecosystem. Landsat has provided researchers with valuable images for detecting earth changes. This research used the MSS, TM, ETM+, and OLI sensors from 1975, 1990, 2000, and 2003-2013. Changes were studied after essential corrections such as error fixing, band combination, and georeferencing, using the 2012 image as the base image, by maximum likelihood supervised classification and the IPVI index. A 2004 Google Earth image and ground points collected by GPS (2010-2012) were used to check the changes obtained from the satellite images. Results showed that the mangrove area in Bidkhoon in 2012 was 1119072 m² by GPS, 1231200 m² by maximum likelihood supervised classification, and 1317600 m² by IPVI. The Basatin areas are, respectively: 466644 m², 88200 m², and 63000 m². The final results show the forests have declined naturally, and in Basatin the decline is due to human activities. The loss was offset by planting over many years, although the trend has again been declining in recent years. Thus, satellite images have a high ability to estimate environmental processes. This research showed a high correlation between the images and indexes such as IPVI and NDVI and the ground control points.
Keywords: IPVI index, Landsat sensor, maximum likelihood supervised classification, Nayband National Park
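The two vegetation indices named above have simple band-ratio definitions; a minimal sketch is given below, assuming the red and near-infrared bands are available as floating-point NumPy arrays, with the masking threshold chosen purely for illustration.

```python
# Sketch of NDVI and IPVI from red and near-infrared bands. Band arrays and the
# vegetation threshold (0.6) are illustrative assumptions.
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-10)       # classic NDVI in [-1, 1]

def ipvi(nir, red):
    # IPVI = NIR / (NIR + Red), equivalently (NDVI + 1) / 2, ranging over [0, 1]
    return nir / (nir + red + 1e-10)

# Example: mask pixels whose IPVI exceeds an assumed vegetation threshold.
nir = np.random.rand(100, 100)
red = np.random.rand(100, 100)
mangrove_mask = ipvi(nir, red) > 0.6
print("candidate mangrove pixels:", int(mangrove_mask.sum()))
```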
Procedia PDF Downloads 293
3068 In Situ Volume Imaging of Cleared Mice Seminiferous Tubules Opens New Window to Study Spermatogenic Process in 3D
Authors: Lukas Ded
Abstract:
Studying tissue structure and histogenesis in the natural, 3D context is a challenging but highly beneficial process. In contrast to the classical approach of physical tissue sectioning and subsequent imaging, it enables the study of the relationships of individual cellular and histological structures in their native context. Recent developments in tissue clearing approaches and microscopic volume imaging/data processing enable the application of these methods also in the areas of developmental and reproductive biology. Here, using the CLARITY tissue clearing procedure and 3D confocal volume imaging, we optimized the protocol for clearing, staining, and imaging of mouse seminiferous tubules isolated from the testes without a cardiac perfusion procedure. Our approach enables high-magnification and fine-resolution axial imaging of the whole diameter of the seminiferous tubules, with potentially unlimited lateral imaging length. Hence, large continuous pieces of the seminiferous tubule can be scanned and digitally reconstructed for the study of single-tubule seminiferous stages using nuclear dyes. Furthermore, antibodies and various molecular dyes can be used for molecular labeling of individual cellular and subcellular structures, and the resulting 3D images can greatly increase our understanding of the spatiotemporal aspects of seminiferous tubule development and sperm ultrastructure formation. Finally, our newly developed algorithms for 3D data processing enable massively parallel processing of a large number of individual cell and tissue fluorescent signatures and the building of robust spermatogenic models under physiological and pathological conditions.
Keywords: CLARITY, spermatogenesis, testis, tissue clearing, volume imaging
Procedia PDF Downloads 136
3067 Characterisation of Pasteurella multocida from Asymptomatic Animals
Authors: Rajeev Manhas, M. A. Bhat, A. K. Taku, Dalip Singh, Deep Shikha, Gulzar Bader
Abstract:
The study aimed to understand the distribution of various serogroups of Pasteurella multocida in bovines, small ruminants, pigs, rabbits, and poultry from Jammu, Jammu and Kashmir, and to characterize the isolates with respect to LPS-synthesizing genes, the dermonecrotic toxin gene (toxA), and antibiotic resistance. For isolation, the nasopharyngeal swab procedure appeared to be better than the nasal swab procedure, particularly in ovines and swine. Out of 200 samples from different animals, isolation of P. multocida could be achieved only from pig and sheep (5 each) and from poultry and buffalo (2 each) samples, which accounted for 14 isolates. Upon molecular serogrouping, 3 isolates from sheep and 2 isolates from poultry were found to be serogroup A, 2 isolates from buffalo were confirmed as serogroup B, and 5 isolates from pigs were found to belong to serogroup D. However, 2 isolates from sheep could not be typed and hence were untypable. All 14 isolates were subjected to mPCR genotyping. A total of 10 isolates, 5 each from pigs and sheep, generated an amplicon specific to genotype L6, which indicates Heddleston serovars 10, 11, 12, and 15. Similarly, 2 isolates from bovines generated an amplicon of genotype L2, which indicates Heddleston serovar 2/5. However, 2 isolates from poultry generated a specific amplicon with L1, signifying Heddleston serovar 1, but these isolates also produced multiple bands with primer L5. Only one isolate of capsular type A from sheep possessed the structural gene toxA for dermonecrotoxin. There was variability in the antimicrobial susceptibility pattern of the sheep isolates, but overall the rate of tetracycline resistance was relatively high (64.28%) in our strains, while all the isolates were sensitive to streptomycin. Except for the swine isolates and one toxigenic sheep isolate, the P. multocida isolates from this study were sensitive to quinolones. Although the level of resistance to commercial antibiotics was generally low, the use of tetracycline and erythromycin was not recommended.
Keywords: antibiogram, genotyping, Pasteurella multocida, serogrouping, toxA
Procedia PDF Downloads 453
3066 An Artificial Neural Network Model Based Study of Seismic Wave
Authors: Hemant Kumar, Nilendu Das
Abstract:
A study based on an ANN structure gives us information to predict the size of a future event from the realization of past events. ANN, IMD (Indian Meteorological Department) data, and remote sensing were used to derive a number of parameters for calculating the size of an event that may occur in the future. A threshold was selected specifically above the high-frequency harvest reached in the area during the selected seismic activity. In the field of human and local biodiversity, it remains to obtain the right parameters relative to the frequency of impact. During the study, however, the assumption is that predicting seismic activity is a difficult process, though not because of the parameters involved here, which can be analyzed and found in research activity.
Keywords: ANN, Bayesian class, earthquakes, IMD
Procedia PDF Downloads 125
3065 Pipeline Construction in Oil and Gas Fields as per Kuwait Oil Company Procedures
Authors: Jasem Al-Safran
Abstract:
Nowadays, the oil and gas industry is considered one of the biggest industries around the world. Although it has caused a lot of pollution and much damage to mankind and the other creatures around the globe, it remains one of the biggest industries: it creates millions of careers around the globe, which has reduced poverty levels and made people's lives much more comfortable, as a comparison of human life before the exploration of oil and after the development of the oil industry shows. Construction projects consist of three major sections, which is why we call them EPC projects: the first section is detailed engineering, the second is procurement, and the third is construction. Each section requires a specialized workforce with different skills in order to handle the workload. For example, in the oil sector, and depending on the nature and size of the project, the construction team requires a mechanical engineer, a civil engineer, an electrical engineer, and an instrumentation engineer, as well as a work site supervisor for each discipline, a large number of laborers and technicians, and many pieces of equipment.
Keywords: Construction, EPC, Project, Work force
Procedia PDF Downloads 106
3064 Development of a Fire Analysis Drone for Smoke Toxicity Measurement for Fire Prediction and Management
Authors: Gabrielle Peck, Ryan Hayes
Abstract:
This research presents the design and creation of a drone gas analyser, aimed at addressing the need for independent data collection and analysis of gas emissions during large-scale fires, particularly wasteland fires. The analyser drone, comprising a lightweight gas analysis system attached to a remote-controlled drone, enables the real-time assessment of smoke toxicity and the monitoring of gases released into the atmosphere during such incidents. The key components of the analyser unit included two gas line inlets connected to glass wool filters, a pump with regulated flow controlled by a mass flow controller, and electrochemical cells for detecting nitrogen oxides, hydrogen cyanide, and oxygen levels. Additionally, a non-dispersive infrared (NDIR) analyser is employed to monitor carbon monoxide (CO), carbon dioxide (CO₂), and hydrocarbon concentrations. Thermocouples can be attached to the analyser to monitor temperature, as well as McCaffrey probes combined with pressure transducers to monitor air velocity and wind direction. These additions allow for monitoring of the large fire and can be used for predictions of fire spread. The innovative system not only provides crucial data for assessing smoke toxicity but also contributes to fire prediction and management. The remote-controlled drone's mobility allows for safe and efficient data collection in proximity to the fire source, reducing the need for human exposure to hazardous conditions. The data obtained from the gas analyser unit facilitates informed decision-making by emergency responders, aiding in the protection of both human health and the environment. This abstract highlights the successful development of a drone gas analyser, illustrating its potential for enhancing smoke toxicity analysis and fire prediction capabilities. The integration of this technology into fire management strategies offers a promising solution for addressing the challenges associated with wildfires and other large-scale fire incidents. The project's methodology and results contribute to the growing body of knowledge in the field of environmental monitoring and safety, emphasizing the practical utility of drones for critical applications.
Keywords: fire prediction, drone, smoke toxicity, analyser, fire management
Procedia PDF Downloads 89
3063 Drawing Building Blocks in Existing Neighborhoods: An Automated Pilot Tool for an Initial Approach Using GIS and Python
Authors: Konstantinos Pikos, Dimitrios Kaimaris
Abstract:
Although designing building blocks is a procedure used by many planners around the world, there isn’t an automated tool that will help planners and designers achieve their goals with less effort. The difficulty of the subject lies in the repetitive process of manually drawing lines, while it is mandatory not only to maintain the desirable offset but also to achieve a lesser impact on the existing building stock. In this paper, using Geographical Information Systems (GIS) and the Python programming language, an automated tool integrated into ArcGIS Pro is presented. Despite its simplistic environment and the lack of specialized building legislation due to the complex state of the field, a planner who is aware of such technical information can use the tool to draw an initial approach to the final building blocks in an area with pre-existing buildings, in an attempt to organize the usually sprawling suburbs of a city or any continuously developing area. The tool uses ESRI’s ArcPy library to handle the spatial data, while interaction with the user is handled through Tkinter. The main process consists of a modification of building edge coordinates, using the NumPy library, in an effort to draw the line of best fit, so the user can get the optimal result for each block side. Finally, after the tool runs successfully, a table of primary planning information is shown, such as the area of the building block and its coverage rate. Regardless of the early stage of the tool’s development, it is a solid base in which potential planners with programming skills could invest, so they can adapt the tool to their individual needs. An example of the entire procedure in a test area is provided, highlighting both the strengths and weaknesses of the final results.
Keywords: arcPy, GIS, python, building blocks
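A standalone sketch of the core geometric step, fitting a line of best fit through the building-edge vertices of one block side and offsetting it, is given below in plain NumPy; the offset value and the toy coordinates are assumptions, and the ArcPy/Tkinter integration described above is not reproduced.

```python
# Sketch: line of best fit through building-edge vertices of one block side,
# shifted outwards by a desired offset. Offset and sample points are assumptions.
import numpy as np

def best_fit_block_side(edge_points, offset=3.0):
    """edge_points: (n, 2) array of x, y vertices of building edges on one side."""
    pts = np.asarray(edge_points, dtype=float)
    slope, intercept = np.polyfit(pts[:, 0], pts[:, 1], deg=1)

    # Unit normal of the fitted line y = slope * x + intercept
    normal = np.array([-slope, 1.0]) / np.hypot(slope, 1.0)

    # Two end points of the fitted segment, shifted outwards by the offset
    x_ends = np.array([pts[:, 0].min(), pts[:, 0].max()])
    y_ends = slope * x_ends + intercept
    line = np.column_stack([x_ends, y_ends]) + offset * normal
    return line   # (2, 2) array: start and end of the offset block-side line

print(best_fit_block_side([[0, 0.1], [5, 1.0], [10, 2.1], [15, 2.9]]))
```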
Procedia PDF Downloads 179
3062 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring
Authors: Zheng Wang, Zhenhong Li, Jon Mills
Abstract:
Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring of various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise from processing high temporal-resolution continuous GBSAR data, including the extreme cost of computational random-access memory (RAM), the delay of displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and processes continuous GBSAR images unit by unit, where the images within a window form a basic unit. By adopting this strategy, the RAM requirement is reduced to only one unit of images and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected because the chain keeps temporarily-coherent pixels which are present only in certain units rather than in the whole observation period. The chain supports real-time processing of the continuous data, and the delay in creating displacement maps can be shortened without waiting for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on a stack of images in a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence. The temporally-averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially-coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the level of a few sub-millimetres are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring in a wide range of scientific and practical applications.
Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring
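A minimal NumPy/SciPy sketch of the discontinuous-campaign idea, temporal averaging within a campaign followed by a windowed coherence estimate between two campaign averages, is shown below; the window size and the synthetic data are assumptions, and the package's robust coherence estimation, non-local filtering, and pixel-selection steps are not reproduced.

```python
# Sketch: campaign averaging and standard windowed interferometric coherence.
# Window size and synthetic data are illustrative assumptions.
import numpy as np
from scipy.ndimage import uniform_filter

def campaign_average(slc_stack):
    """slc_stack: (n_images, rows, cols) complex array from a single campaign."""
    return slc_stack.mean(axis=0)       # temporal averaging boosts SNR

def coherence(img1, img2, win=5):
    """Windowed coherence magnitude between two complex images."""
    cross = img1 * np.conj(img2)
    num = uniform_filter(cross.real, win) + 1j * uniform_filter(cross.imag, win)
    den = np.sqrt(uniform_filter(np.abs(img1) ** 2, win)
                  * uniform_filter(np.abs(img2) ** 2, win))
    return np.abs(num) / (den + 1e-12)

# Example with two synthetic campaigns of 10 complex images each
rng = np.random.default_rng(0)
c1 = rng.normal(size=(10, 64, 64)) + 1j * rng.normal(size=(10, 64, 64))
c2 = c1 + 0.1 * (rng.normal(size=(10, 64, 64)) + 1j * rng.normal(size=(10, 64, 64)))
gamma = coherence(campaign_average(c1), campaign_average(c2))
print(gamma.mean())   # close to 1 where the two campaigns remain coherent
```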
Procedia PDF Downloads 161
3061 Its about Cortana, Microsoft’s Virtual Assistant
Authors: Aya Idriss, Esraa Othman, Lujain Malak
Abstract:
Artificial intelligence is the emulation of human intelligence processes by machines, particularly computer systems that act logically. Some of the specific applications of AI include natural language processing, speech recognition, and machine vision. Cortana is a virtual assistant and an example of an AI application. Microsoft made it possible for this app to be accessed not only on laptops and PCs but also to be downloaded on mobile phones and used as a virtual assistant, which was a huge success. Cortana can offer a lot beyond basic commands such as setting alarms and marking the calendar. Its capabilities spread past that; for example, it allows listening to music and podcasts on the go, managing to-do lists and emails, connecting with contacts hands-free by simply telling the virtual assistant to call somebody, giving instant answers, and so on. A questionnaire was sent online to numerous friends and family members to perform the study, which is critical in evaluating Cortana's recognition capacity, and the majority of the answers were in favor of Cortana's capabilities. The results of the questionnaire assisted us in determining the level of Cortana's skills.
Keywords: artificial intelligence, Cortana, AI, abstract
Procedia PDF Downloads 175
3060 Landsat Data from Pre Crop Season to Estimate the Area to Be Planted with Summer Crops
Authors: Valdir Moura, Raniele dos Anjos de Souza, Fernando Gomes de Souza, Jose Vagner da Silva, Jerry Adriani Johann
Abstract:
The estimate of the area of land to be planted with annual crops and its stratification by municipality are important variables in crop forecasting. Nowadays in Brazil, this information is obtained by the Brazilian Institute of Geography and Statistics (IBGE) and published in the report Assessment of the Agricultural Production. Due to the high cloud cover in the main crop growing season (October to March), it is difficult to acquire good orbital images. Thus, one alternative is to work with remote sensing data from dates before the crop growing season. This work presents the use of multitemporal Landsat data gathered in July and September (before the summer growing season) in order to estimate the area of land to be planted with summer crops in an area of São Paulo State, Brazil. Geographic Information Systems (GIS) and digital image processing techniques were applied for the treatment of the available data. Supervised and unsupervised classifications were used for data in digital number and reflectance formats, together with multitemporal Normalized Difference Vegetation Index (NDVI) images. The objective was to discriminate the tracts with a higher probability of being planted with summer crops. Classification accuracies were evaluated using a sampling system developed specifically for this study region. The estimated areas were corrected using the error matrix derived from these evaluations. The classification techniques presented an excellent level of accuracy according to the kappa index. The proportion of crops stratified by municipality was derived from field work during the crop growing season. These proportion coefficients were applied to the area of land to be planted with summer crops (derived from Landsat data). Thus, it was possible to derive the area of each summer crop by municipality. The discrepancies between official statistics and our results were attributed to the sampling and stratification procedures. Nevertheless, this methodology can be improved in order to provide good crop area estimates using remote sensing data, despite the cloud cover during the growing season.
Keywords: area intended for summer culture, estimated area planted, agriculture, Landsat, planting schedule
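The accuracy assessment mentioned above rests on an error (confusion) matrix; a minimal sketch of the overall accuracy and kappa computations is given below, with a small illustrative matrix rather than the study's actual evaluation data.

```python
# Sketch: overall accuracy and kappa index from an error (confusion) matrix.
# The 3x3 matrix below is an illustrative example, not the study's data.
import numpy as np

def overall_accuracy(cm):
    return np.trace(cm) / cm.sum()

def kappa(cm):
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    p_observed = np.trace(cm) / n
    p_expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
    return (p_observed - p_expected) / (1.0 - p_expected)

# Rows: reference classes; columns: mapped classes (summer crop, other crop, non-crop)
error_matrix = np.array([[48,  2,  0],
                         [ 3, 40,  7],
                         [ 1,  4, 45]])
print(f"OA = {overall_accuracy(error_matrix):.2%}, kappa = {kappa(error_matrix):.3f}")
```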
Procedia PDF Downloads 150
3059 Design of a Remote Radiation Sensing Module Based on Portable Gamma Spectrometer
Authors: Young Gil Kim, Hye Min Park, Chan Jong Park, Koan Sik Joo
Abstract:
A personal gamma spectrometer has to be sensitive, pocket-sized, and easy for users to carry. To serve these requirements, we developed a SiPM-based portable radiation detector. The prototype uses a Ce:GAGG scintillator coupled to a silicon photomultiplier and a radio frequency (RF) module to measure gamma rays, and it can be accessed wirelessly or remotely by mobile equipment. The prototype device consumes roughly 4.4 W, weighs about 180 g (including battery), and measures 5.0 × 7.0. It is able to achieve 5.8% FWHM energy resolution at 662 keV.
Keywords: Ce:GAGG, gamma-ray, radio frequency, silicon photomultiplier
Procedia PDF Downloads 332
3058 Objects Tracking in Catadioptric Images Using Spherical Snake
Authors: Khald Anisse, Amina Radgui, Mohammed Rziza
Abstract:
Tracking objects in video sequences is a very challenging task in many computer vision applications. However, there is no article that treats this topic in catadioptric vision. This paper is an attempt to describe a new approach to omnidirectional image processing based on inverse stereographic projection onto the half-sphere. We used the spherical model proposed by Gayer et al. For object tracking, our work is based on the snake method, with optimization using the Greedy algorithm, by adapting its different operators. The algorithm respects the deformed geometries of omnidirectional images, such as the spherical neighborhood and spherical gradient, through a reformulation of the optimization algorithm on the spherical domain. This tracking method, which we call the "spherical snake", makes it possible to follow changes in the shape and size of the object across its displacements in the spherical image.
Keywords: computer vision, spherical snake, omnidirectional image, object tracking, inverse stereographic projection
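A minimal sketch of the inverse stereographic projection underlying this approach is given below, mapping normalised image-plane coordinates onto the unit sphere; the normalisation and grid are assumptions, and the spherical snake operators and Greedy optimisation are not reproduced.

```python
# Sketch: inverse stereographic projection of plane coordinates (u, v) onto the
# unit sphere (projection centred at the north pole). Grid values are assumptions.
import numpy as np

def inverse_stereographic(u, v):
    """Map plane coordinates to (x, y, z) points on the unit sphere."""
    d = 1.0 + u ** 2 + v ** 2
    x = 2.0 * u / d
    y = 2.0 * v / d
    z = (u ** 2 + v ** 2 - 1.0) / d
    return np.stack([x, y, z], axis=-1)

# Example: project a small grid of normalised pixel coordinates
u, v = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
points = inverse_stereographic(u, v)
print(np.allclose(np.linalg.norm(points, axis=-1), 1.0))   # True: all points lie on the sphere
```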
Procedia PDF Downloads 402
3057 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R
Authors: Jaya Mathew
Abstract:
Many organizations are faced with the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, or R Services on premise or in the cloud, users can leverage the power of R at scale without having to move their data around.
Keywords: predictive maintenance, machine learning, big data, cloud based, on premise solution, R
Procedia PDF Downloads 378
3056 Incidence and Risk Factors of Traumatic Lumbar Puncture in Newborns in a Tertiary Care Hospital
Authors: Heena Dabas, Anju Paul, Suman Chaurasia, Ramesh Agarwal, M. Jeeva Sankar, Anurag Bajpai, Manju Saksena
Abstract:
Background: Traumatic lumbar puncture (LP) is a common occurrence and causes substantial diagnostic ambiguity. There is a paucity of data regarding its epidemiology. Objective: To assess the incidence and risk factors of traumatic LP in newborns. Design/Methods: In a prospective cohort study, all inborn neonates admitted to the NICU and planned to undergo LP for a clinical indication of sepsis were included. Neonates with diagnosed intraventricular hemorrhage (IVH) of grade III and IV were excluded. The LP was done by an operator, often a fellow or resident, assisted by a bedside nurse. The unit has a policy of not routinely using any sedation/analgesia during the procedure. LP is done with a 26 G, 0.5-inch-long hypodermic needle inserted in the third or fourth lumbar space while the infant is in the lateral position. The infants were monitored clinically and by continuous measurement of vital parameters using a multipara monitor during the procedure. The occurrence of a traumatic tap, along with CSF parameters and other operator and assistant characteristics, was recorded at the time of the procedure. A traumatic tap was defined as the presence of visible blood or more than 500 red blood cells on microscopic examination. Microscopic trauma was defined as CSF without visible blood but with numerous RBCs. The institutional ethics committee approved the study protocol. Written informed consent was obtained from the parents and the health care providers involved. Neonates were followed up till discharge/death, and the final diagnosis was assigned together with the treating team. Results: A total of 362 (21%) neonates out of 1726 born at the hospital were admitted during the study period (July 2016 to January 2017). Among these neonates, 97 (26.7%) were suspected of sepsis. A total of 54 neonates who met the eligibility criteria and whose parents consented to participate were enrolled. The mean (SD) birthweight was 1536 (732) grams and gestational age 32.0 (4.0) weeks. All LPs were indicated for late onset sepsis at the median (IQR) age of 12 (5-39) days. A traumatic LP occurred in 19 neonates (35.1%; 95% C.I. 22.6% to 49.3%). Frank blood was observed in 7 (36.8%), and in the remaining 12 (63.1%) the CSF was found to have microscopic trauma. The preliminary risk factor analysis, including birth weight, gestational age, operator/assistant and other characteristics, did not demonstrate clinically relevant predictors. Conclusion: A significant number of neonates requiring lumbar puncture in our study had a traumatic tap. We were not able to identify modifiable risk factors. There is a need to understand the reasons and further reduce this problem to improve management in NICUs.
Keywords: incidence, newborn, traumatic, lumbar puncture
Procedia PDF Downloads 297
3055 The Influence of Leader’s Sources of Power on Organizational Citizenship Behaviour
Authors: Noor Azlina Mohamed Yunus, Noorlaila Yunus, Kadulliah Ghazali
Abstract:
In this era of intense competition, Malaysia aspires to be a fully developed country by 2020 and desires its citizens to perform and execute excellent work behaviors. For that reason, organizations are focusing on employees’ positive and constructive behaviors such as organizational citizenship behavior (OCB). They expect employees not only to complete their required duties by providing excellent performance but also to keenly go beyond roles that are not specified in their formal job descriptions, to ensure organizational success. The role and duty of getting employees to engage in OCB is the responsibility of a leader. Thus, leaders can utilize their sources of power to enable subordinates to accomplish organizational objectives, including OCB. Therefore, this paper formulates a framework postulating a leader’s sources of power as an antecedent of organizational citizenship behavior (OCB). Implications for future theory development are discussed.
Keywords: organizational citizenship behaviour (OCB), leader’s sources of power, call centre industry, conceptual paper
Procedia PDF Downloads 321
3054 In-Flight Radiometric Performances Analysis of an Airborne Optical Payload
Authors: Caixia Gao, Chuanrong Li, Lingli Tang, Lingling Ma, Yaokai Liu, Xinhong Wang, Yongsheng Zhou
Abstract:
Performance analysis of a remote sensing sensor is required to pursue a range of scientific research and application objectives. Laboratory analysis of any remote sensing instrument is essential, but not sufficient to establish a valid in-flight one. In this study, with the aid of in situ measurements and the corresponding image of a three-gray-scale permanent artificial target, the in-flight radiometric performance analyses (in-flight radiometric calibration, dynamic range and response linearity, signal-to-noise ratio (SNR), radiometric resolution) of a self-developed short-wave infrared (SWIR) camera are performed. To acquire the in-flight calibration coefficients of the SWIR camera, the at-sensor radiances (Li) for the artificial targets are first simulated with in situ measurements (atmosphere parameters and spectral reflectance of the target) and viewing geometries using the MODTRAN model. With these radiances and the corresponding digital numbers (DN) in the image, a straight line with the formulation L = G × DN + B is fitted by a minimization regression method, and the fitted coefficients, G and B, are the in-flight calibration coefficients. The high point (LH) and the low point (LL) of the dynamic range can then be described as LH = G × DNH + B and LL = B, respectively, where DNH is equal to 2ⁿ − 1 (n is the quantization number of the payload). Meanwhile, the sensor’s response linearity (δ) is described as the correlation coefficient of the regressed line. The results show that the calibration coefficients (G and B) are 0.0083 W·sr⁻¹m⁻²µm⁻¹ and −3.5 W·sr⁻¹m⁻²µm⁻¹; the low point of the dynamic range is −3.5 W·sr⁻¹m⁻²µm⁻¹ and the high point is 30.5 W·sr⁻¹m⁻²µm⁻¹; the response linearity is approximately 99%. Furthermore, an SNR normalization method is used to assess the sensor’s SNR, and the normalized SNR is about 59.6 when the mean value of the radiance is equal to 11.0 W·sr⁻¹m⁻²µm⁻¹; subsequently, the radiometric resolution is calculated to be about 0.1845 W·sr⁻¹m⁻²µm⁻¹. Moreover, in order to validate the result, a comparison of the measured radiance with that predicted by a radiative transfer code over four portable artificial targets with reflectances of 20%, 30%, 40%, and 50%, respectively, is performed. It is noted that the relative error of the calibration is within 6.6%.
Keywords: calibration and validation site, SWIR camera, in-flight radiometric calibration, dynamic range, response linearity
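A minimal numeric sketch of this calibration fit is shown below; the DN/radiance pairs and the assumed 12-bit quantization depth are illustrative stand-ins, not the measurements reported above.

```python
# Sketch: fit L = G * DN + B over target measurements, then derive dynamic range
# and response linearity. Sample values and quantization depth are assumptions.
import numpy as np

# DN extracted over the three-gray-scale target and simulated radiances (W·sr⁻¹m⁻²µm⁻¹)
dn = np.array([410.0, 1520.0, 2890.0])
radiance = np.array([0.02, 9.1, 20.4])

G, B = np.polyfit(dn, radiance, deg=1)          # L = G * DN + B
n_bits = 12                                     # assumed quantization depth
dn_high = 2 ** n_bits - 1
low_point, high_point = B, G * dn_high + B
linearity = np.corrcoef(dn, radiance)[0, 1]     # correlation coefficient of the fit

print(f"G = {G:.4f}, B = {B:.2f}")
print(f"dynamic range: {low_point:.2f} to {high_point:.2f} W·sr⁻¹m⁻²µm⁻¹")
print(f"response linearity ≈ {linearity:.3f}")
```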
Procedia PDF Downloads 270
3053 Optimization of Cloud Classification Using Particle Swarm Algorithm
Authors: Riffi Mohammed Amine
Abstract:
A cloud is made up of small particles of liquid water or ice suspended in the atmosphere, which generally do not reach the ground. Various methods are used to classify clouds. This article focuses specifically on a technique known as particle swarm optimization (PSO), an AI approach inspired by the collective behaviors of animals living in groups, such as schools of fish and flocks of birds, and used to solve complex classification and optimization problems with approximate solutions. The proposed technique was evaluated using a series of second-generation Meteosat images taken by the MSG satellite. The acquired results indicate that the proposed method gave acceptable results.
Keywords: remote sensing, particle swarm optimization, clouds, meteorological image
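The sketch below shows the core PSO loop applied to a simple pixel-clustering objective, one plausible way of using the swarm to place class centres; the swarm size, acceleration coefficients, and synthetic data are assumptions, not the article's configuration.

```python
# Sketch: PSO where each particle encodes a set of class centres and the swarm
# minimises total pixel-to-nearest-centre distance. Parameters are assumptions.
import numpy as np

def pso_cluster(pixels, n_classes=3, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    dim = pixels.shape[1]
    pos = rng.uniform(pixels.min(), pixels.max(), (n_particles, n_classes, dim))
    vel = np.zeros_like(pos)

    def cost(centres):
        d = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
        return d.min(axis=1).sum()          # total distance to nearest class centre

    pbest = pos.copy()
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest                             # optimised class centres

# Synthetic "pixels" drawn around three cluster centres
pixels = np.vstack([np.random.normal(m, 0.05, (100, 2)) for m in (0.2, 0.5, 0.8)])
print(pso_cluster(pixels))
```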
Procedia PDF Downloads 15
3052 Study and Experimental Analysis of a Photovoltaic Pumping System under Three Operating Modes
Authors: Rekioua D., Mohammedi A., Rekioua T., Mehleb Z.
Abstract:
Photovoltaic water pumping is considered one of the most promising areas in photovoltaic applications; the economy and reliability of solar electric power make it an excellent choice for remote water pumping. Two conventional techniques are currently in use: the first is the directly coupled technique and the second is the battery-buffered photovoltaic pumping system. In this paper, we present the performances of three operating modes of a photovoltaic pumping system. The aim of this work is to determine the effect of different parameters influencing the photovoltaic pumping system performances, such as pumping head, system configuration, and climatic conditions. The obtained results are presented and discussed.
Keywords: batteries charge mode, photovoltaic pumping system, pumping head, submersible pump
Procedia PDF Downloads 509
3051 The Challenge of Graduate Unemployment in Nigeria: The Role of Entrepreneurship Education
Authors: Sunday Ose Ugadu
Abstract:
Unemployment, especially graduate unemployment, is, for now, the greatest problem facing Nigeria as a nation. It is responsible for most of the other ills of the country, including kidnapping, armed robbery, youth restiveness, and thuggery, to mention but a few. More and more people in Nigeria are now losing confidence in the prospect of tertiary education as an instrument par excellence for effecting national development. This paper, therefore, critically examined the problem of graduate unemployment in Nigeria. It briefly traced the history of university education in Nigeria. The rate and causes of graduate unemployment in Nigeria were also discussed. Previous attempts made by the government to solve the problem of unemployment were highlighted. The paper also harped on the prospect of entrepreneurship education as an instrument for fighting graduate unemployment, identifying obstacles to entrepreneurship education in Nigeria. The paper drew conclusions, and the major recommendation made was a call for converting the National Youth Service Corps Scheme in Nigeria to an entrepreneurship and skills acquisition scheme as soon as possible.
Keywords: graduate, unemployment, entrepreneurship education, national development
Procedia PDF Downloads 189
3050 Transcending or Going beyond the Concept of Race
Authors: Ovett Nwosimiri
Abstract:
Historically, the concept of race has played a significant part in the existence of African philosophy. Race, as part of historical events, has been used as a reason for colonization. In recent years, there has been a great deal of work on the concept of race. Some philosophers have devoted their time to the discourse of race and to understanding the ascription of race. These philosophers have dedicated their time and energy to the concept of race. Philosophers like Joshua Glasgow, W. E. B. Du Bois, Lucius Outlaw, Kwame Anthony Appiah, Naomi Zack, Emmanuel C. Eze, and many others took up the task of explaining the concept of race, and also of explaining, in their view, whether the concept of race should be conserved or eliminated. According to the eliminativists, the concept of race should be eliminated. According to the conservationists, the concept of race should be conserved. The aim of this paper is to look at the possibility of transcending the concept of race. In order to do this, the paper will briefly explain Joshua Glasgow's theory of 'racial reconstructionism', and it will propose a theory of 'racial transcendentalism' as a way of transcending the concept of race. The paper will argue that we should see the concept of race as a concept that has a future beyond the mere meanings and ideas that call for its elimination or conservation.
Keywords: conservationists, eliminativists, race, transcending
Procedia PDF Downloads 353
3049 The Omani Learner of English Corpus: Source and Tools
Authors: Anood Al-Shibli
Abstract:
Designing a learner corpus is not an easy task to accomplish because dealing with learners’ language involves many variables which might affect the results of any study based on learners’ language production (spoken and written). Also, it is essential to design a learner corpus systematically, especially when it is intended to be a reference for language research. Therefore, the design of the Omani Learner Corpus (OLEC) has undergone many explicit and systematic considerations. These criteria can be regarded as the foundation for designing any learner corpus to be exploited effectively in language use and language learning studies. In addition, OLEC is a manually error-annotated corpus. Error annotation in learner corpora is essential; however, it is time-consuming and prone to errors. Consequently, a navigating tool was designed to help the annotators insert error codes in order to make the error-annotation process more efficient and consistent. To assure accuracy, an error annotation procedure was followed to annotate OLEC, and some preliminary findings are noted. One of the main results of this procedure is the creation of an error-annotation system based on the language production of Omani learners of English. Because OLEC is still in its first stages, the primary findings relate to only one level of proficiency and one error type, namely verb-related errors. It was found that the Omani learners in OLEC have a tendency to make more errors in forming the verb, followed by problems in verb agreement. Comparing the results to other error-based studies indicates that Omani learners tend to make basic verb errors of the kind found at lower levels of proficiency. To this end, it is essential to note that examining learners’ errors can give insights into language acquisition and language learning, and that most errors do not happen randomly but occur systematically among language learners.
Keywords: error-annotation system, error-annotation manual, learner corpora, verbs related errors
Procedia PDF Downloads 141
3048 High Fidelity Interactive Video Segmentation Using Tensor Decomposition, Boundary Loss, Convolutional Tessellations, and Context-Aware Skip Connections
Authors: Anthony D. Rhodes, Manan Goel
Abstract:
We provide a high fidelity deep learning algorithm (HyperSeg) for interactive video segmentation tasks using a dense convolutional network with context-aware skip connections and compressed, 'hypercolumn' image features combined with a convolutional tessellation procedure. In order to maintain high output fidelity, our model crucially processes and renders all image features in high resolution, without utilizing downsampling or pooling procedures. We maintain this consistent, high-grade fidelity efficiently in our model chiefly through two means: (1) we use a statistically-principled tensor decomposition procedure to modulate the number of hypercolumn features, and (2) we render these features in their native resolution using a convolutional tessellation technique. For improved pixel-level segmentation results, we introduce a boundary loss function; for improved temporal coherence in video data, we include temporal image information in our model. Through experiments, we demonstrate the improved accuracy of our model against baseline models for interactive segmentation tasks using high resolution video data. We also introduce a benchmark video segmentation dataset, the VFX Segmentation Dataset, which contains over 27,046 high resolution video frames, including green screen and various composited scenes with corresponding, hand-crafted, pixel-level segmentations. Our work improves state-of-the-art segmentation fidelity with high resolution data and can be used across a broad range of application domains, including VFX pipelines and medical imaging disciplines.
Keywords: computer vision, object segmentation, interactive segmentation, model compression
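One of the two fidelity-preserving ideas above, modulating the number of hypercolumn features with a statistically principled decomposition, can be illustrated with plain truncated SVD standing in for the paper's tensor decomposition; the synthetic feature tensor and the number of retained components are assumptions, not the authors' implementation.

```python
# Sketch: compressing per-pixel hypercolumn features with truncated SVD.
# The feature tensor and n_keep are illustrative assumptions.
import numpy as np

def compress_hypercolumns(features, n_keep=64):
    """features: (H, W, C) hypercolumn tensor; returns (H, W, n_keep)."""
    h, w, c = features.shape
    flat = features.reshape(-1, c)
    flat = flat - flat.mean(axis=0)
    # Truncated SVD of the pixel-by-channel matrix keeps the dominant feature modes
    _, _, vt = np.linalg.svd(flat, full_matrices=False)
    return (flat @ vt[:n_keep].T).reshape(h, w, n_keep)

hypercols = np.random.rand(32, 32, 256)          # e.g. stacked multi-layer CNN features
print(compress_hypercolumns(hypercols, n_keep=64).shape)   # (32, 32, 64)
```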
Procedia PDF Downloads 120
3047 Acute Cartilage Defects of the Knee Treated With Chondral Restoration Procedures and Patellofemoral Stabilisation
Authors: John Scanlon, Antony Raymond, Randeep Aujla, Peter D’Alessandro, Satyen Gohil
Abstract:
Background: The incidence of significant acute chondral injuries with patella dislocation is around 10-15%. It is accepted that chondral procedures should only be performed in the presence of joint stability. Methods: Patients were identified from surgeon/hospital logs. Patient demographics, lesion size and location, surgical procedure, patient reported outcome measures, post-operative MR imaging, and complications were recorded. PROMs and patient satisfaction were obtained. Results: 20 knees (18 patients) were included. Mean age was 18.6 years (range 11-39), and the mean follow-up was 16.6 months (range 2-70). The defect locations were the lateral femoral condyle (9/20; 45%), patella (9/20; 45%), medial femoral condyle (1/20; 5%), and the trochlea (1/20; 5%). The mean defect size was 2.6 cm². Twelve knees were treated with cartilage fixation, 5 with microfracture, and 3 with OATS. At follow-up, the overall mean Lysholm score was 77.4 (± 17.1), with no chondral regenerative procedure being statistically superior. There was no difference in Lysholm scores between patients having acute medial patellofemoral ligament reconstruction and those having medial soft tissue plication (p=0.59). Five (25%) knees required re-operation (one arthroscopic arthrolysis; one patella chondroplasty; two removals of loose bodies; one implant adjustment). Overall, 90% responded as being satisfied with surgery. Conclusion: Our aggressive pathway to identify and treat acute cartilage defects with early operative intervention and patella stabilisation has shown high rates of satisfaction and high Lysholm scores. The full range of chondral restoration options should be considered by surgeons managing these patients.
Keywords: patella dislocation, chondral restoration, knee, patella stabilisation
Procedia PDF Downloads 128
3046 Change Detection of Water Bodies in Dhaka City: An Analysis Using Geographic Information System and Remote Sensing
Authors: M. Humayun Kabir, Mahamuda Afroze, K. Maudood Elahi
Abstract:
Since the late 1900s, unplanned and rapid urbanization processes have drastically altered the land, reduced water bodies, and decreased vegetation cover in the capital city of Bangladesh, Dhaka. The capitalist modes of urbanization result in the encroachment of the surface water bodies in this city. The main goal of this study is to investigate the change detection of water bodies in Dhaka city, analyzing the spatial distribution of water bodies and calculating their rate of change. This effort aims to influence public policy for environmental justice initiatives around protecting water bodies in order to ensure the proper functioning of the urban ecosystem. This study accomplishes its research goal by compiling satellite imagery in GIS software to understand the changes of water bodies in Dhaka city. The work focuses on the late 20th century to the early 21st century to analyze the city before and after major infrastructural changes occurred in an unplanned manner. The land use of the study area has been classified into four categories, and the areas of the different land use classes have been calculated using MS Excel and SPSS. The results reveal that urbanization expanded from the central to the northern part, and major encroachment occurred in the western and eastern parts of the city. It has also been found that, in 1988, the total area of water bodies was 8935.38 hectares; it then gradually decreased, and in 1998, 2008, and 2017, the total areas of water bodies reached 6065.73, 4853.32, and 2077.56 hectares, respectively. Rapid population growth, unplanned urbanization, and industrialization have generated pressure to change the land use pattern in Dhaka city. These expansion processes are engulfing wetlands, water bodies, and vegetation cover without considering the environmental impact. In order to regain the wetlands and surface water bodies, the concerned authorities must implement laws and acts as legal instruments in this regard and take action against violators. This research is a synthesis of time series data that provides a complete picture of the status of water bodies in Dhaka city, which might help in making plans and policies for water body conservation.
Keywords: ecosystem, GIS, industrialization, land use, remote sensing, urbanization
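A short worked example of the rate-of-change calculation implied by these figures is given below, using the reported water-body areas; treating the loss as a simple average annual rate between survey years is an assumption made purely for illustration.

```python
# Worked example: average annual water-body loss between the reported survey years.
years = [1988, 1998, 2008, 2017]
area_ha = [8935.38, 6065.73, 4853.32, 2077.56]

for (y0, a0), (y1, a1) in zip(zip(years, area_ha), zip(years[1:], area_ha[1:])):
    rate = (a0 - a1) / (y1 - y0)
    print(f"{y0}-{y1}: lost {a0 - a1:.2f} ha, about {rate:.1f} ha per year")

total_loss_pct = 100 * (area_ha[0] - area_ha[-1]) / area_ha[0]
print(f"total loss 1988-2017: {total_loss_pct:.1f}%")
```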
Procedia PDF Downloads 152
3045 Applying Laser Scanning and Digital Photogrammetry for Developing an Archaeological Model Structure for Old Castle in Germany
Authors: Bara' Al-Mistarehi
Abstract:
Documentation and assessment of the conservation state of an archaeological structure is a significant procedure in any management plan. However, it has always been a challenge to apply this with a low-cost and safe methodology; it is also a time-demanding procedure. Therefore, a low-cost, efficient methodology for documenting the state of a structure is needed. Within the scope of this research, this paper applies digital photogrammetry and laser scanning to one of the highly significant structures in Germany, the Old Castle (German: Altes Schloss). The site is well known for its unique features. However, the castle suffers from serious deterioration threats because of the environmental conditions and the absence of continuous monitoring, maintenance, and repair plans. Digital photogrammetry is a generally accepted technique for the collection of 3D representations of the environment. For this reason, this image-based technique has been extensively used to produce high-quality 3D models of heritage sites and historical buildings for documentation and presentation purposes. Additionally, terrestrial laser scanners are used, which directly measure 3D surface coordinates based on the run-time of reflected light pulses. These systems feature high data acquisition rates, good accuracy, and high spatial data density. Despite the potential of each single approach, in this research work the maximum benefit is expected from a combination of data from both digital cameras and terrestrial laser scanners. Within the paper, the usage, application, and advantages of the technique are investigated in terms of building a highly realistic 3D textured model for some parts of the Old Castle. The model will be used as a diagnostic tool for the conservation state of the castle and as a means of monitoring future changes.
Keywords: Digital photogrammetry, Terrestrial laser scanners, 3D textured model, archaeological structure
Procedia PDF Downloads 178
3044 Land Cover Mapping Using Sentinel-2, Landsat-8 Satellite Images, and Google Earth Engine: A Study Case of the Beterou Catchment
Authors: Ella Sèdé Maforikan
Abstract:
Accurate land cover mapping is essential for effective environmental monitoring and natural resources management. This study focuses on assessing the classification performance of two satellite datasets and evaluating the impact of different input feature combinations on classification accuracy in the Beterou catchment, situated in the northern part of Benin. Landsat-8 and Sentinel-2 images from June 1, 2020, to March 31, 2021, were utilized. Employing the Random Forest (RF) algorithm on Google Earth Engine (GEE), a supervised classification categorized the land into five classes: forest, savannas, cropland, settlement, and water bodies. GEE was chosen due to its high-performance computing capabilities, mitigating the computational burdens associated with traditional land cover classification methods. By eliminating the need for individual satellite image downloads and providing access to an extensive archive of remote sensing data, GEE facilitated efficient model training. The study achieved commendable overall accuracy (OA), ranging from 84% to 85%, even without incorporating spectral indices and terrain metrics into the model. Notably, the inclusion of additional input sources, specifically terrain features like slope and elevation, enhanced classification accuracy. The highest accuracy was achieved with Sentinel-2 (OA = 91%, Kappa = 0.88), slightly surpassing Landsat-8 (OA = 90%, Kappa = 0.87). This underscores the significance of combining diverse input sources for optimal accuracy in land cover mapping. The methodology presented herein not only enables the creation of precise, expeditious land cover maps but also demonstrates the prowess of cloud computing through GEE for large-scale land cover mapping with remarkable accuracy. The study emphasizes the synergy of different input sources to achieve superior accuracy. As a future recommendation, the application of Light Detection and Ranging (LiDAR) technology is proposed to enhance vegetation type differentiation in the Beterou catchment. Additionally, a cross-comparison between Sentinel-2 and Landsat-8 for assessing long-term land cover changes is suggested.
Keywords: land cover mapping, Google Earth Engine, random forest, Beterou catchment
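A minimal sketch of this GEE workflow using the Earth Engine Python API is given below; the training-point asset ID, band list, region geometry, and class property name are placeholders and assumptions, not the study's actual inputs.

```python
# Sketch: Sentinel-2 composite + Random Forest classification in Google Earth Engine.
# Asset IDs, bands, region, and the 'landcover' property are assumed placeholders.
import ee
ee.Initialize()

region = ee.Geometry.Rectangle([2.0, 9.0, 2.8, 10.0])        # rough stand-in for the Beterou catchment
bands = ['B2', 'B3', 'B4', 'B8', 'B11', 'B12']

# Median composite of Sentinel-2 surface reflectance for the study period
composite = (ee.ImageCollection('COPERNICUS/S2_SR')
             .filterDate('2020-06-01', '2021-03-31')
             .filterBounds(region)
             .median()
             .select(bands)
             .clip(region))

# Hypothetical labelled points with a 'landcover' property (0-4 for the five classes)
points = ee.FeatureCollection('users/example/beterou_training_points')
training = composite.sampleRegions(collection=points, properties=['landcover'], scale=10)

classifier = ee.Classifier.smileRandomForest(100).train(
    features=training, classProperty='landcover', inputProperties=bands)

classified = composite.classify(classifier)
print(classifier.confusionMatrix().accuracy().getInfo())       # training (resubstitution) accuracy
```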
Procedia PDF Downloads 63