Search results for: data reduction
28154 Porosity Characterization and Its Destruction by Authigenic Minerals: Reservoir Sandstones, Mamuniyat Formation, Murzuq Basin, SW Libya
Authors: Mohamrd Ali Alrabib
Abstract:
Sandstone samples were selected from cores of seven wells ranging in depth from 5040 to 7181.4 ft. The dominant authigenic cement phase is quartz overgrowth cement (up to 13% by volume), and this is the major mechanism for porosity reduction. Late-stage carbonate cements (siderite and dolomite/ferroan dolomite) are present; these minerals infill intergranular porosity and therefore further reduce porosity and probably permeability. Authigenic clay minerals are represented by kaolinite, illite, and grain-coating clay minerals. Kaolinite occurs in booklet and vermicular forms. Minor amounts of illite were noted in the studied samples; illite commonly blocks pore throats, thereby reducing permeability. Primary porosity of up to 26.5% is present. Secondary porosity (up to 17%) is also present as a result of feldspar dissolution. The high intergranular volume (IGV) of the sandstones indicates that mechanical and chemical compaction played a more important role than cementation in porosity loss.
Keywords: authigenic minerals, porosity types, porosity reduction, Mamuniyat sandstone reservoir
Procedia PDF Downloads 378
28153 Iterative Reconstruction Techniques as a Dose Reduction Tool in Pediatric Computed Tomography Imaging: A Phantom Study
Authors: Ajit Brindhaban
Abstract:
Background and Purpose: Computed Tomography (CT) scans have become the largest source of radiation in radiological imaging. The purpose of this study was to compare the quality of pediatric CT images reconstructed using Filtered Back Projection (FBP) with images reconstructed using different strengths of the Iterative Reconstruction (IR) technique, and to perform a feasibility study to assess the use of IR techniques as a dose reduction tool. Materials and Methods: An anthropomorphic phantom representing a 5-year-old child was scanned, in two stages, using a Siemens Somatom CT unit. In stage one, scans of the head, chest, and abdomen were performed using standard protocols recommended by the scanner manufacturer. Images were reconstructed using FBP and 5 different strengths of IR. Contrast-to-Noise Ratios (CNR) were calculated from the average CT number and its standard deviation measured in regions of interest created in the lung, bone, and soft tissue regions of the phantom. The paired t-test and one-way ANOVA were used to compare the CNR of FBP images with that of IR images, at the p = 0.05 level. The lowest IR strength that produced the highest CNR was identified. In the second stage, scans of the head were performed with mA(s) values decreased in proportion to the increase in CNR relative to the standard FBP protocol. CNR values were compared in this stage using the paired t-test at the p = 0.05 level. Results: Images reconstructed using the IR technique had higher CNR values (p < 0.01) in all regions compared to the FBP images, at all strengths of IR. The CNR increased with IR strength up to strength 3 in the head and chest images; increases beyond this strength were insignificant. In abdomen images, CNR continued to increase up to strength 5. The results also indicated that IR techniques improve CNR by up to a factor of 1.5. Based on the CNR values of IR images at strength 3 and the CNR values of FBP images, a reduction in mA(s) of about 20% was identified. The images of the head acquired at 20% reduced mA(s) and reconstructed using IR at strength 3 had CNR similar to that of FBP images at standard mA(s). In the head scans of the phantom used in this study, it was demonstrated that similar CNR can be achieved even when the mA(s) is reduced by about 20% if the IR technique at strength 3 is used for reconstruction. Conclusions: The IR technique produced better image quality at all strengths of IR in comparison to FBP, and it can provide approximately 20% dose reduction in pediatric head CT while maintaining the same image quality as the FBP technique.
Keywords: filtered back projection, image quality, iterative reconstruction, pediatric computed tomography imaging
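The abstract does not state the exact CNR formula applied to the ROI measurements, so the following Python sketch assumes a common definition (ROI contrast divided by background noise) purely to illustrate how CNR could be computed from mean CT numbers and their standard deviations; the HU values and region names are hypothetical.

```python
import numpy as np

def cnr(roi: np.ndarray, background: np.ndarray) -> float:
    """Contrast-to-noise ratio between a tissue ROI and a background ROI.

    Uses a common definition, CNR = |mean_roi - mean_bg| / sd_bg; the paper
    does not state its exact formula, so this is an illustrative assumption.
    """
    return abs(roi.mean() - background.mean()) / background.std(ddof=1)

# Hypothetical HU samples from two regions of interest in one reconstruction.
rng = np.random.default_rng(0)
soft_tissue = rng.normal(40.0, 12.0, size=500)   # mean CT number ~40 HU
lung = rng.normal(-850.0, 15.0, size=500)        # mean CT number ~-850 HU
print(f"CNR = {cnr(soft_tissue, lung):.1f}")
```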
Procedia PDF Downloads 148
28152 Biological Treatment of Tannery Wastewater Using Pseudomonas Strains
Authors: A. Benhadji, R. Maachi
Abstract:
Environmental protection has become a major economic development issue. Indeed, the environment has become both a market growth factor and an element of competition. It is now an integral part of all industrial strategies. Ecosystem protection is based on reducing the pollution load through the treatment of liquid waste. Physicochemical techniques are commonly used, but a transfer of pollution is generally observed with them. An alternative to physicochemical methods is the use of microorganisms to clean up wastewater. The objective of this research is the evaluation of the effects of exogenously added Pseudomonas strains on pollutant biodegradation. The influence of critical parameters such as inoculum concentration and treatment duration is studied. The results show that Pseudomonas putida gives a maximum reduction in chemical oxygen demand (COD) after 4 days of incubation. However, to protect the environment from biological pollution, the treatment is completed by an electrocoagulation process using aluminium electrodes. The results indicate that this process disinfects the water and improves the quality of the electrocoagulated sludge.
Keywords: tannery, pseudomonas, biological treatment, electrocoagulation process, sludge quality
Procedia PDF Downloads 369
28151 Healthcare Big Data Analytics Using Hadoop
Authors: Chellammal Surianarayanan
Abstract:
The healthcare industry is generating large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory, pharmacy, etc. Healthcare data is so big and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data is accumulated, and its different varieties, namely structured, semi-structured, and unstructured data. Despite this complexity, if the trends and patterns that exist within the big data are uncovered and analyzed, higher quality healthcare can be provided at lower cost. Hadoop is an open source software framework for distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System, which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. In this paper, an analysis has been done of how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare
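As an illustration of the MapReduce component described above, here is a minimal Hadoop Streaming sketch in Python. The CSV layout (patient_id, diagnosis_code, ...) and the counting task are assumptions for demonstration only, not the analysis performed in the paper; the script would typically be launched with the hadoop-streaming jar, passing it as both the -mapper and the -reducer over files stored in HDFS.

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming sketch: count records per diagnosis code.

Illustrative only; assumes a hypothetical CSV layout of
patient_id,diagnosis_code,... on standard input.
"""
import sys
from itertools import groupby

def mapper(lines):
    for line in lines:
        fields = line.rstrip("\n").split(",")
        if len(fields) >= 2:
            yield f"{fields[1]}\t1"          # key = diagnosis code, value = 1

def reducer(lines):
    # Hadoop's shuffle phase delivers the mapper output sorted by key,
    # so consecutive lines with the same key can be grouped and summed.
    parsed = (line.rstrip("\n").split("\t") for line in lines)
    for code, group in groupby(parsed, key=lambda kv: kv[0]):
        yield f"{code}\t{sum(int(v) for _, v in group)}"

if __name__ == "__main__":
    stage = sys.argv[1] if len(sys.argv) > 1 else "map"
    stream = mapper(sys.stdin) if stage == "map" else reducer(sys.stdin)
    for out in stream:
        print(out)
```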
Procedia PDF Downloads 415
28150 Design of Semi-Automatic Vent and Flash Remover
Authors: Inba Blesso P., Senthil Kumar P.
Abstract:
The main consideration in any tire manufacturing process is wear resistance. One of the factors that cause tire wear is improper removal of the vent and flash from the tire surface. The contact point between the tire surface and the vent is highly prone to wear. When the vehicle runs at higher speed with a heavy load, the tire vent and flash wear first and cause some of the tire surface material to wear along with them. Hence, provision must be made for efficient removal of the vent and flash, thereby reducing tire wear. Manual trimming of tire vents is time-consuming and produces inaccurate output, which leads to a reduction in production rate and profit. Thus, the development of an automated system can help attain minimum time consumption and provide a possible way to achieve profitable production. A semi-automated system that employs pneumatic actuators and sequencing circuits is the focus of this study. By implementing it, one can achieve accurate results with reduced time and profitable output.
Keywords: tire manufacturing, pneumatic system, vent and flash removal, engineering and technology
Procedia PDF Downloads 381
28149 LTF Expression Profiling Which is Essential for Cancer Cell Proliferation and Metastasis, Correlating with Clinical Features, as Well as Early Stages of Breast Cancer
Authors: Azar Heidarizadi, Mahdieh Salimi, Hossein Mozdarani
Abstract:
Introduction: As a complex disease, breast cancer results from several genetic and epigenetic changes. Lactoferrin, a member of the transferrin family, is reported to have a number of biological functions, including DNA synthesis, immune responses, iron transport, etc., any of which could play a role in tumor progression. The aim of this study was to combine bioinformatics data and experimental assays to find the pattern of promoter methylation and gene expression of LTF in breast cancer in order to study its potential role in cancer management. Material and Methods: In order to evaluate the methylation status of the LTF promoter, we performed MS-PCR and real-time PCR on samples from patients with breast cancer and normal cases. 67 patient samples were included in this study, comprising tumoral, plasma, and adjacent normal tissue samples, as well as 30 plasma samples from normal cases and 10 breast reduction tissue cases. Subsequently, bioinformatics analyses using the cBioPortal database, STRING, and Genomatix were conducted to disclose the prognostic value of LTF in breast cancer progression. Results: The analysis of LTF expression showed an inverse relationship between the expression level of LTF and the tumor stage of breast cancer patients (p < 0.01). In fact, stages 1 and 2 showed high LTF expression, while in stages 3 and 4 a significant reduction was observed (p < 0.0001). LTF expression frequently alters, with decreased expression in ER⁺, PR⁺, and HER2⁺ patients (p < 0.01) and increased expression in TNBC, LN¯, ER¯, and PR¯ patients (p < 0.001). Also, LTF expression is significantly associated with metastasis and lymph node involvement (p < 0.0001). The sensitivity and specificity of LTF were also determined. A negative correlation was detected between LTF expression levels and methylation of the LTF promoter. Conclusions: The altered expression of LTF observed in breast cancer patients could be considered to promote cell proliferation and metastasis even in the early stages of cancer.
Keywords: LTF, expression, methylation, breast cancer
Procedia PDF Downloads 71
28148 Data Disorders in Healthcare Organizations: Symptoms, Diagnoses, and Treatments
Authors: Zakieh Piri, Shahla Damanabi, Peyman Rezaii Hachesoo
Abstract:
Introduction: Healthcare organizations, like other organizations, suffer from a number of disorders such as Business Sponsor Disorder, Business Acceptance Disorder, Cultural/Political Disorder, Data Disorder, etc. As quality in healthcare mostly depends on the quality of data, we aimed to identify data disorders and their symptoms in two teaching hospitals. Methods: Using a self-constructed questionnaire, we asked 20 questions related to the quality and usability of patient data stored in patient records. The research population consisted of 150 managers, physicians, nurses, and medical record staff who were working at the time of the study. We also asked their views about the symptoms and treatments for any data disorders they mentioned in the questionnaire. The answers were analyzed using qualitative methods. Results: After classifying the answers, we found six main data disorders: incomplete data, missed data, late data, blurred data, manipulated data, and illegible data. The majority of participants believed in their own important roles in the treatment of data disorders, while others attributed the disorders to health system problems. Discussion: As clinicians play important roles in producing data, they can easily identify symptoms and disorders of patient data. Health information managers can also play important roles in the early detection of data disorders by proactive monitoring and periodic check-ups of data.
Keywords: data disorders, quality, healthcare, treatment
Procedia PDF Downloads 434
28147 Effect of Reservoir Fluctuations on an Active Landslide in the Xiangjiaba Reservoir Area, Southwest China
Authors: Javed Iqbal
Abstract:
Filling of the Xiangjiaba reservoir lake in Southwest China triggered and re-activated numerous landslides due to water-level fluctuation. In order to understand the relationship between reservoirs and slope instability, a typical reservoir landslide (the Dasha landslide) on the right bank of the Jinsha River was selected as a case study for in-depth investigation. Detailed field investigations were carried out in order to characterize the landslide with respect to its surroundings and to locate the slip surface. Boreholes were drilled to determine the subsurface lithology and the failure depth of the Dasha landslide. In-situ geotechnical tests were performed, and soil samples from the exposed slip surface were retrieved for geotechnical laboratory analysis. Finally, stability analysis was done using the 3D strength reduction method under different conditions of reservoir water-level fluctuation and rainfall. The in-depth investigations show that the Dasha landslide is a bedding rockslide which was once activated in 1986. The topography of the Dasha landslide is relatively flat, while the back scarp and local terrain are relatively steep. The landslide area is about 29 × 10⁴ m², the maximum thickness of the landslide deposits revealed by drilling is about 40 m with an average thickness of about 20 m, and the volume is thus estimated at about 580 × 10⁴ m³. Bedrock in the landslide area is composed of the Suining Formation of Jurassic age. The main rock type is silty mudstone with sandstone, and the bedding orientation is 300~310° ∠ 7~22°. The factor of safety (FOS) of the Dasha landslide obtained by 3D strength reduction cannot meet the minimum safety requirement under the designed working condition of reservoir level fluctuation combined with the effects of rainfall and rapid drawdown.
Keywords: Dasha landslide, Xiangjiaba reservoir, strength reduction method, bedding rockslide
Procedia PDF Downloads 162
28146 Big Data and Analytics in Higher Education: An Assessment of Its Status, Relevance and Future in the Republic of the Philippines
Authors: Byron Joseph A. Hallar, Annjeannette Alain D. Galang, Maria Visitacion N. Gumabay
Abstract:
One of the unique challenges that the twenty-first century presents to Philippine higher education is the utilization of Big Data. The higher education system in the Philippines is generating burgeoning amounts of data that can be used to generate the information and knowledge needed for accurate data-driven decision making. This study examines the status, relevance, and future of Big Data and Analytics in Philippine higher education. The insights gained from the study may be relevant to other developing nations similarly situated as the Philippines.
Keywords: big data, data analytics, higher education, republic of the philippines, assessment
Procedia PDF Downloads 349
28145 Active Vibration Reduction for a Flexible Structure Bonded with Sensor/Actuator Pairs on Efficient Locations Using a Developed Methodology
Authors: Ali H. Daraji, Jack M. Hale, Ye Jianqiao
Abstract:
With the extensive use of high-specific-strength structures to optimise loading capacity and material cost in aerospace and most engineering applications, much effort has been expended to develop intelligent structures for active vibration reduction and structural health monitoring. These structures are highly flexible, have inherently low internal damping, and are associated with large vibrations and long decay times. Modifying such structures by adding lightweight piezoelectric sensors and actuators at efficient locations, integrated with an optimal control scheme, is considered an effective solution for monitoring and controlling structural vibration. The size and location of sensors and actuators are important research topics, since they affect the level of vibration detection and reduction and the amount of energy required by a controller. Several methodologies have been presented to determine the optimal location of a limited number of sensors and actuators for small-scale structures. However, these studies have tackled the problem directly, measuring a fitness function based on eigenvalues and eigenvectors for numerous combinations of sensor/actuator pair locations and converging on an optimal set using heuristic optimisation techniques such as genetic algorithms. This is computationally expensive for small- and large-scale structures when a number of s/a pairs must be optimised to suppress multiple vibration modes. This paper proposes an efficient method to determine optimal locations for a limited number of sensor/actuator pairs for active vibration reduction of a flexible structure, based on the finite element method and Hamilton's principle. The current work takes the simplified approach of modelling a structure with sensors at all locations, subjecting it to an external force to excite the various modes of interest, and noting the locations of the sensors giving the largest average percentage sensor effectiveness, measured by dividing each sensor's output voltage by the maximum for each mode. The methodology was implemented for a cantilever plate under external force excitation to find the optimal distribution of six sensor/actuator pairs to suppress the first six modes of vibration. It is shown that the resulting optimal sensor locations agree well with published optimal locations, but with very much reduced computational effort and higher effectiveness. Furthermore, it is shown that collocated sensor/actuator pairs placed at these locations give very effective active vibration reduction using an optimal linear quadratic control scheme.
Keywords: optimisation, plate, sensor effectiveness, vibration control
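A minimal numerical sketch of the sensor-effectiveness measure described above (each sensor's output voltage divided by the maximum for each mode, averaged over the modes) is given below; the voltage matrix, the number of candidate locations, and the choice of six best locations are illustrative assumptions, not the paper's finite element results.

```python
import numpy as np

# Hypothetical matrix of sensor output voltages: rows = candidate locations,
# columns = vibration modes excited by the external force (values assumed).
rng = np.random.default_rng(1)
voltages = np.abs(rng.normal(size=(40, 6)))      # 40 locations, 6 modes

# Percentage effectiveness of each location for each mode: the sensor output
# divided by the maximum output observed for that mode.
effectiveness = 100.0 * voltages / voltages.max(axis=0, keepdims=True)

# Average over the modes of interest and keep the best locations for the
# six sensor/actuator pairs.
average_eff = effectiveness.mean(axis=1)
best_locations = np.argsort(average_eff)[::-1][:6]
print("Candidate s/a pair locations:", best_locations)
print("Average effectiveness (%):", np.round(average_eff[best_locations], 1))
```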
Procedia PDF Downloads 234
28144 A Neural Network for the Prediction of Contraction after Burn Injuries
Authors: Ginger Egberts, Marianne Schaaphok, Fred Vermolen, Paul van Zuijlen
Abstract:
A few years ago, a promising morphoelastic model was developed for the simulation of contraction formation after burn injuries. Contraction can lead to a serious reduction in physical mobility, such as a reduction in the range of motion of joints. If this is the case in a healing burn wound, it is referred to as a contracture that needs medical intervention. The morphoelastic model consists of a set of partial differential equations describing both a chemical part and a mechanical part of dermal wound healing. These equations are solved with the numerical finite element method (FEM). In this method, many calculations are required on each of the chosen elements. In general, the more elements, the more accurate the solution. However, the number of elements increases rapidly if simulations are performed in 2D and 3D. In that case, not only does it take longer before a prediction is available, but the computation also becomes more expensive. It is therefore important to investigate alternatives that generate the same results based on the input parameters only. In this study, a surrogate neural network has been designed to mimic the results of the one-dimensional morphoelastic model. The neural network generates predictions quickly, is easy to implement, and offers freedom in the choice of input and output. Because a neural network requires extensive training and a data set, it is ideal that the one-dimensional FEM code generates output quickly. The results of this feed-forward-type neural network are very promising. Not only can the network give faster predictions, it also has a performance of over 99%. It reports on the relative surface area of the wound/scar, the total strain energy density, and the evolution of the densities of the chemicals and mechanical quantities. It is, therefore, interesting to investigate the applicability of a neural network to the two- and three-dimensional morphoelastic models for contraction after burn injuries.
Keywords: biomechanics, burns, feasibility, feed-forward NN, morphoelasticity, neural network, relative surface area wound
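The sketch below illustrates the surrogate idea with a generic feed-forward regressor: model input parameters in, a quantity such as the relative wound surface area out. The synthetic data, network size, and use of scikit-learn's MLPRegressor are assumptions for illustration; the paper's own network, training set, and reported outputs come from the one-dimensional morphoelastic FEM code.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Hypothetical training set: each row holds model input parameters and the
# target is the relative wound surface area that the 1D finite-element code
# would predict. Synthetic values stand in for the real FEM output here.
rng = np.random.default_rng(42)
X = rng.uniform(0.5, 1.5, size=(2000, 5))
y = 0.4 + 0.3 * X[:, 0] - 0.2 * X[:, 1] * X[:, 2] + 0.05 * rng.normal(size=2000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                         random_state=0)
surrogate.fit(X_train, y_train)          # train the feed-forward surrogate
print(f"R^2 on held-out samples: {surrogate.score(X_test, y_test):.3f}")
```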
Procedia PDF Downloads 56
28143 Stereo Camera Based Speed-Hump Detection Process for Real Time Driving Assistance System in the Daytime
Authors: Hyun-Koo Kim, Yong-Hun Kim, Soo-Young Suk, Ju H. Park, Ho-Youl Jung
Abstract:
This paper presents an effective speed hump detection process for the daytime. We focus only on round types of speed humps in the dynamic daytime road environment. The proposed scheme consists mainly of two processes, stereo matching and speed hump detection, and our work focuses on the speed hump detection process. The speed hump detection process consists of a noise reduction step, a data fusion step, and a speed hump detection step. The proposed system was tested on an Intel Core CPU with 2.80 GHz and 4 GB RAM in urban road environments. The frame rate of the test videos is 30 frames per second, and the size of each frame of the grabbed image sequences is 1280 pixels by 670 pixels. Using object-marked sequences acquired with an on-vehicle camera, we recorded speed hump and non-speed hump samples. The test results show that the proposed method can be applied in real-time systems, with a computation time of 13 ms, and its detection performance reaches 96.1%.
Keywords: data fusion, round types speed hump, speed hump detection, surface filter
Procedia PDF Downloads 513
28142 Mechanical Properties and Microstructural Analysis of Al6061-Red Mud Composites
Authors: M. Gangadharappa, M. Ravi Kumar, H. N. Reddappa
Abstract:
The mechanical properties and morphology of Al6061-red mud particulate composites were investigated. The composites consist of an Al6061 matrix reinforced with red mud particles of 53-75 micron size, with reinforcement ranging from 0% to 12% at intervals of 2%. The stir casting technique was used to fabricate the Al6061-red mud composites. Density, percentage porosity, tensile properties, fracture toughness, hardness, impact energy, percentage elongation, and percentage reduction in area were measured. Further, the microstructures were examined by SEM to characterize the composites produced. The results show a uniform dispersion of the red mud particles along the grain boundaries of the Al6061 alloy. The tensile strength and hardness values increase with the addition of red mud particles, but there is a slight decrease in the impact energy, percentage elongation, and percentage reduction in area as the reinforcement increases. From these results, we conclude that red mud, an industrial waste, can be used to enhance the properties of Al6061 alloy for engineering applications.
Keywords: Al6061, red mud, tensile strength, hardness and microstructures
Procedia PDF Downloads 564
28141 The Acute Effects of Higher Versus Lower Load Duration and Intensity on Morphological and Mechanical Properties of the Healthy Achilles Tendon: A Randomized Crossover Trial
Authors: Eman Merza, Stephen Pearson, Glen Lichtwark, Peter Malliaras
Abstract:
The Achilles tendon (AT) exhibits volume changes related to fluid flow under acute load, which may be linked to changes in stiffness. Fluid flow provides a mechanical signal for cellular activity and may be one mechanism that facilitates tendon adaptation. This study aimed to investigate whether an isometric intervention involving a high level of load duration and intensity could maximize the immediate reduction in AT volume and stiffness compared with interventions involving lower levels of load duration and intensity. Sixteen healthy participants (12 males, 4 females; age = 24.4 ± 9.4 years; body mass = 70.9 ± 16.1 kg; height = 1.7 ± 0.1 m) performed three isometric interventions of varying load duration (2 s and 8 s) and intensity (35% and 75% of maximal voluntary isometric contraction) over a 3-week period. Freehand 3D ultrasound was used to measure free AT volume (at rest) and length (at 35%, 55%, and 75% of maximum plantarflexion force) pre- and post-intervention. The slope of the force-elongation curve over these force levels represented individual stiffness (N/mm). Large reductions in free AT volume and stiffness occurred in response to long-duration, high-intensity loading, whilst less reduction was produced with a lower load intensity. In contrast, no change in free AT volume and a small increase in AT stiffness occurred with the lower load duration. These findings suggest that the applied load on the AT must be heavy and sustained for a long duration to maximize the immediate volume reduction, which might be an acute response that enables optimal long-term tendon adaptation via mechanotransduction pathways.
Keywords: Achilles tendon, volume, stiffness, free tendon, 3D ultrasound
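Since stiffness is defined here as the slope of the force-elongation curve between 35% and 75% of maximum plantarflexion force, a straight-line fit over those measurement points yields the N/mm value; the sketch below uses hypothetical force and elongation numbers, not the study's data.

```python
import numpy as np

# Hypothetical force-elongation samples at 35%, 55% and 75% of maximum
# plantarflexion force for one participant (assumed values, in N and mm).
force = np.array([700.0, 1100.0, 1500.0])      # N
elongation = np.array([11.2, 13.0, 14.6])      # mm, from freehand 3D ultrasound

# Stiffness is taken as the slope of the force-elongation curve over this
# force range, i.e. a straight-line fit in N/mm.
slope, _ = np.polyfit(elongation, force, deg=1)
print(f"Achilles tendon stiffness ≈ {slope:.0f} N/mm")
```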
Procedia PDF Downloads 102
28140 A Qualitative Study Examining the Process of EFL Course Design from the Perspectives of Teachers
Authors: Iman Al Khalidi
Abstract:
Recently, English has become the language of globalization and technology. In turn, this has resulted in a seemingly bewildering array of influences and trends in the domain of TESOL curriculum. In light of these changes, higher education has to provide a new and more powerful kind of education. It should prepare students to be more engaged citizens, more capable of solving complex problems at work, and well prepared to lead a meaningful life. In response to this, universities, colleges, schools, and departments have to work in light of the requirements and challenges of the global and technological era. Consequently, they have to focus on the adoption of a contemporary curriculum that goes in line with the pedagogical shift from a teaching-centered approach to a learning-centered approach. Ideally, there has been noticeable emphasis on the crucial importance of developing and professionalizing teachers in order to engage them in the process of curriculum development and action research. This is a qualitative study that aims at understanding and exploring the process of designing EFL courses by teachers at the tertiary level, from the perspectives of the participants, in a professional context in TESOL: the Department of English at a private college in Oman. It is a case study that stands on the philosophy of the qualitative approach. It employs multiple methods for collecting qualitative data: semi-structured interviews with teachers, focus group discussions with students, and document analysis. The collected data have been analyzed qualitatively by adopting Miles and Huberman's approach, using procedures of reduction, coding, displaying, and conclusion drawing and verification.
Keywords: course design, components of course design, case study, data analysis
Procedia PDF Downloads 546
28139 A Qualitative Study Examining the Process of Course Design from the Perspectives of Teachers
Authors: Iman Al Khalidi
Abstract:
Recently, English has become the language of globalization and technology. In turn, this has resulted in a seemingly bewildering array of influences and trends in the domain of TESOL curriculum. In light of these changes, higher education has to provide a new and more powerful kind of education. It should prepare students to be more engaged citizens, more capable of solving complex problems at work, and well prepared to lead a meaningful life. In response to this, universities, colleges, schools, and departments have to work in light of the requirements and challenges of the global and technological era. Consequently, they have to focus on the adoption of a contemporary curriculum that goes in line with the pedagogical shift from a teaching-centered approach to a learning-centered approach. Ideally, there has been noticeable emphasis on the crucial importance of developing and professionalizing teachers in order to engage them in the process of curriculum development and action research. This is a qualitative study that aims at understanding and exploring the process of designing EFL courses by teachers at the tertiary level, from the perspectives of the participants, in a professional context in TESOL: the Department of English at a private college in Oman. It is a case study that stands on the philosophy of the qualitative approach. It employs multiple methods for collecting qualitative data: semi-structured interviews with teachers, focus group discussions with students, and document analysis. The collected data have been analyzed qualitatively by adopting Miles and Huberman's approach, using procedures of reduction, coding, displaying, and conclusion drawing and verification.
Keywords: course design, components of course design, case study, data analysis
Procedia PDF Downloads 442
28138 Data Management and Analytics for Intelligent Grid
Authors: G. Julius P. Roy, Prateek Saxena, Sanjeev Singh
Abstract:
Two decades ago, power distribution utilities would collect data from their customers at intervals of no less than one month. The advent of SmartGrid and AMI has subsequently increased the sampling frequency, leading to a 1000- to 10000-fold increase in data quantity. This increase is notable, and it has led to the coining of the term Big Data in utilities. The power distribution industry is one of the largest handlers of huge and complex data, both for keeping history and for turning the data into insight. As the majority of utilities around the globe adopt SmartGrid technologies as a mass implementation and focus primarily on the strategic interdependence and synergies of the big data coming from new information sources like AMI and intelligent SCADA, there is a rising need for new models of data management and a renewed focus on analytics to dissect data into descriptive, predictive, and prescriptive subsets. The goal of this paper is to bring load disaggregation into the smart energy toolkit for commercial usage.
Keywords: data management, analytics, energy data analytics, smart grid, smart utilities
Procedia PDF Downloads 780
28137 Effect of SCN5A Gene Mutation in Endocardial Cell
Authors: Helan Satish, M. Ramasubba Reddy
Abstract:
The simulation of an endocardial cell with a gene mutation in the cardiac sodium ion channel NaV1.5, encoded by the SCN5A gene, is discussed. The characterization of Brugada syndrome by the loss-of-function effect of the SCN5A mutation L812Q, located in the DII-S4 transmembrane region of the NaV1.5 channel protein, and its effect in an endocardial cell are studied. The Ten Tusscher model of the human ventricular action potential is modified to incorporate the changes contributed by the L812Q mutant in endocardial cells. Results show that the BrS-associated SCN5A mutation causes a reduction in the inward sodium current through modifications of the channel gating dynamics, such as delayed activation, enhanced inactivation, and slowed recovery from inactivation, in the endocardial cell. The reduced inward sodium current affects the depolarization phase (Phase 0), which leads to a reduction in the spike amplitude of the cardiac action potential.
Keywords: SCN5A gene mutation, sodium channel, Brugada syndrome, cardiac arrhythmia, action potential
Procedia PDF Downloads 126
28136 Progressive Loading Effect of Co Over SiO2/Al2O3 Catalyst for Cox Free Hydrogen and Carbon Nanotubes Production via Catalytic Decomposition of Methane
Authors: Sushil Kumar Saraswat, K. K. Pant
Abstract:
Co metal supported on SiO2 and Al2O3 catalysts, with metal loadings varied from 30 to 70 wt.%, were evaluated for the decomposition of methane to CO/CO2-free hydrogen and carbon nanomaterials. The catalytic runs were carried out at 550-800 °C under atmospheric pressure using a fixed-bed vertical flow reactor. The fresh and spent catalysts were characterized by BET surface area analysis, TPR, XRD, SEM, TEM, and TG analysis. The data showed that the 50% Co/Al2O3 catalyst exhibited remarkably higher activity and stability, up to 10 h time-on-stream at 750 °C, with respect to H2 production compared to the rest of the catalysts. However, the catalytic activity and durability declined greatly at higher temperatures. The main reason for the catalytic inhibition of the Co-containing SiO2 catalysts is the higher reduction temperature of Co2SiO4. TEM images illustrate that carbon materials with various morphologies, including carbon nanofibers (CNFs), helical-shaped CNFs, and branched CNFs, were obtained depending on the catalyst composition and reaction temperature. The TG data showed that a higher yield of MWCNTs was achieved over the 50% Co/Al2O3 catalyst compared to the other catalysts.
Keywords: carbon nanotubes, cobalt, hydrogen production, methane decomposition
Procedia PDF Downloads 323
28135 Optimizing Irrigation Scheduling for Sustainable Agriculture: A Case Study of a Farm in Onitsha, Anambra State, Nigeria
Authors: Ejoh Nonso Francis
Abstract:
Irrigation scheduling is a critical aspect of sustainable agriculture as it ensures optimal use of water resources, reduces water waste, and enhances crop yields. This paper presents a case study of a farm in Onitsha, Anambra State, Nigeria, where irrigation scheduling was optimized using a combination of soil moisture sensors and weather data. The study aimed to evaluate the effectiveness of this approach in improving water use efficiency and crop productivity. The results showed that the optimized irrigation scheduling approach led to a 30% reduction in water use while increasing crop yield by 20%. The study demonstrates the potential of technology-based irrigation scheduling to enhance sustainable agriculture in Nigeria and beyond.
Keywords: irrigation scheduling, sustainable agriculture, soil moisture sensors, weather data, water use efficiency, crop productivity, nigeria, onitsha, anambra state, technology-based irrigation scheduling, water resources, environmental degradation, crop water requirements, overwatering, water waste, farming systems, scalability
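A minimal sketch of how sensor readings and weather data might be combined into an irrigation decision is shown below; the moisture threshold, rainfall cutoff, field names, and plot values are illustrative assumptions rather than the rules used on the Onitsha farm.

```python
# A minimal sketch of sensor- and weather-driven irrigation scheduling.
# The thresholds, field names and forecast source are illustrative
# assumptions, not values reported by the study.

def irrigation_needed(soil_moisture_pct: float,
                      forecast_rain_mm: float,
                      dryness_threshold_pct: float = 25.0,
                      rain_skip_mm: float = 5.0) -> bool:
    """Irrigate only if the soil is drier than the threshold and no
    significant rainfall is expected, to avoid overwatering."""
    if soil_moisture_pct >= dryness_threshold_pct:
        return False
    return forecast_rain_mm < rain_skip_mm

readings = [
    {"plot": "A", "soil_moisture_pct": 21.0, "forecast_rain_mm": 1.2},
    {"plot": "B", "soil_moisture_pct": 32.5, "forecast_rain_mm": 0.0},
    {"plot": "C", "soil_moisture_pct": 18.4, "forecast_rain_mm": 12.0},
]
for r in readings:
    decision = irrigation_needed(r["soil_moisture_pct"], r["forecast_rain_mm"])
    print(f"Plot {r['plot']}: irrigate={decision}")
```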
Procedia PDF Downloads 78
28134 Privacy Preserving Data Publishing Based on Sensitivity in Context of Big Data Using Hive
Authors: P. Srinivasa Rao, K. Venkatesh Sharma, G. Sadhya Devi, V. Nagesh
Abstract:
Privacy-preserving data publication is a major concern at present because the amount of data being published through the internet has been increasing day by day. Because of its size, this huge amount of data is referred to as Big Data. This project deals with privacy preservation in the context of Big Data using a data warehousing solution called Hive. We implemented Nearest Similarity Based Clustering (NSB) with bottom-up generalization to achieve (v,l)-anonymity. (v,l)-anonymity deals with sensitivity vulnerabilities and ensures individual privacy. We also calculate the sensitivity levels by a simple comparison method using index values, classifying the different levels of sensitivity. The experiments were carried out in the Hive environment to verify the efficiency of the algorithms with Big Data. This framework also supports the execution of existing algorithms without any changes. The model in the paper outperforms existing models.
Keywords: sensitivity, sensitive level, clustering, Privacy Preserving Data Publication (PPDP), bottom-up generalization, Big Data
Procedia PDF Downloads 297
28133 A Subband BSS Structure with Reduced Complexity and Fast Convergence
Authors: Salah Al-Din I. Badran, Samad Ahmadi, Ismail Shahin
Abstract:
A blind source separation method is proposed; in this method, we use a non-uniform filter bank and a novel normalisation. This method provides reduced computational complexity and increased convergence speed compared to the full-band algorithm. Recently, adaptive subband schemes have been recommended to solve two problems: reducing the computational complexity and increasing the convergence speed of the adaptive algorithm for correlated input signals. In this work, the reduction in computational complexity is achieved with the use of adaptive filters of lower order than the full-band adaptive filters, operating at a sampling rate lower than the sampling rate of the input signal. The signals decomposed by the analysis filter bank are less correlated in each subband than the input signal at full bandwidth, and can promote better rates of convergence.
Keywords: blind source separation, computational complexity, subband, convergence speed, mixture
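The complexity argument can be illustrated with a rough back-of-the-envelope calculation: splitting a full-band adaptive filter of N taps into M subbands gives M filters of roughly N/M taps, each running at 1/M of the input rate. The figures below (filter order, sampling rate, number of subbands) are assumed for illustration and ignore the cost of the analysis/synthesis filter bank itself.

```python
# Rough complexity comparison between a full-band adaptive filter and an
# M-band subband implementation (illustrative numbers, not from the paper).
full_band_order = 1024          # taps of the full-band adaptive filter
fs = 16_000                     # input sampling rate, Hz
M = 8                           # number of subbands (decimation factor)

# Full-band: N multiply-accumulates per input sample, at the full rate.
full_macs_per_sec = full_band_order * fs

# Subband: each of the M filters has roughly N/M taps and runs at fs/M,
# so the total work drops by about a factor of M.
sub_macs_per_sec = M * (full_band_order // M) * (fs // M)

print(f"Full-band: {full_macs_per_sec / 1e6:.1f} MMAC/s")
print(f"{M}-band  : {sub_macs_per_sec / 1e6:.1f} MMAC/s")
print(f"Reduction: x{full_macs_per_sec / sub_macs_per_sec:.1f}")
```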
Procedia PDF Downloads 580
28132 A Fuzzy Kernel K-Medoids Algorithm for Clustering Uncertain Data Objects
Authors: Behnam Tavakkol
Abstract:
Uncertain data mining algorithms use different ways to consider uncertainty in data, such as representing a data object as a sample of points or a probability distribution. Fuzzy methods have long been used for clustering traditional (certain) data objects. They are used to produce non-crisp cluster labels. For uncertain data, however, besides some uncertain fuzzy k-medoids algorithms, not many other fuzzy clustering methods have been developed. In this work, we develop a fuzzy kernel k-medoids algorithm for clustering uncertain data objects. The developed fuzzy kernel k-medoids algorithm is superior to existing fuzzy k-medoids algorithms in clustering data sets with non-linearly separable clusters.
Keywords: clustering algorithm, fuzzy methods, kernel k-medoids, uncertain data
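To make the idea concrete, the sketch below implements a simplified fuzzy kernel k-medoids on ordinary (certain) points: an RBF kernel supplies feature-space distances, memberships follow the standard fuzzy update, and each medoid is re-chosen to minimise the membership-weighted distance. The handling of uncertain objects, the kernel choice, and the parameter values are the paper's contribution and are not reproduced here; everything in the code is an illustrative assumption.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def fuzzy_kernel_kmedoids(X, k=2, m=2.0, gamma=1.0, iters=30, seed=0):
    """Simplified fuzzy kernel k-medoids on certain points; the paper's
    method additionally models each object's uncertainty."""
    n = len(X)
    K = rbf_kernel(X, gamma)
    diag = K.diagonal()
    D2 = np.maximum(diag[:, None] - 2.0 * K + diag[None, :], 1e-12)  # all pairs
    rng = np.random.default_rng(seed)
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(iters):
        d2 = D2[:, medoids]                                  # point-to-medoid
        ratio = (d2[:, :, None] / d2[:, None, :]) ** (1.0 / (m - 1.0))
        U = 1.0 / ratio.sum(axis=2)                          # fuzzy memberships
        cost = (U ** m).T @ D2                               # k x n medoid costs
        new_medoids = cost.argmin(axis=1)
        if np.array_equal(np.sort(new_medoids), np.sort(medoids)):
            break
        medoids = new_medoids
    return U, medoids

# Two synthetic groups of 2D points, for demonstration only.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, size=(60, 2)),
               rng.normal(3.0, 0.3, size=(60, 2))])
U, medoids = fuzzy_kernel_kmedoids(X, k=2, gamma=1.0)
print("medoid indices:", medoids)
print("membership of first point:", np.round(U[0], 3))
```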
Procedia PDF Downloads 216
28131 A Design of Elliptic Curve Cryptography Processor based on SM2 over GF(p)
Authors: Shiji Hu, Lei Li, Wanting Zhou, DaoHong Yang
Abstract:
Data encryption is the foundation of today's communication. On this basis, how to improve the speed of data encryption and decryption is a problem that scholars continually work on. In this paper, we propose an elliptic curve crypto processor architecture based on the SM2 prime field. In terms of hardware implementation, we optimized the algorithms at different stages of the structure. For finite-field modular operations, we propose an optimized improvement of the Karatsuba-Ofman multiplication algorithm and shorten the critical path through a pipeline structure in the algorithm implementation. Based on the SM2 recommended prime field, a fast modular reduction algorithm is used to reduce the 512-bit-wide data obtained from the multiplication unit. The radix-4 extended Euclidean algorithm was used to realize the conversion between the affine coordinate system and the Jacobian projective coordinate system. In the parallel scheduling of point operations on elliptic curves, we propose a three-level parallel structure for point addition and point doubling based on the Jacobian projective coordinate system. Combined with the scalar multiplication algorithm, we added mutual pre-operations to the point addition and point doubling operations to improve the efficiency of scalar point multiplication. The proposed ECC hardware architecture was verified and implemented on Xilinx Virtex-7 and ZYNQ-7 platforms, and each 256-bit scalar multiplication operation took 0.275 ms. The performance for handling scalar multiplication is 32 times that of a CPU (dual-core ARM Cortex-A9).
Keywords: elliptic curve cryptosystems, SM2, modular multiplication, point multiplication
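The scalar multiplication that the processor parallelises can be sketched, at toy scale, as a double-and-add loop. The curve below is a small illustrative curve over GF(97) in affine coordinates, standing in for the 256-bit SM2 recommended prime field and the Jacobian projective arithmetic used in the hardware; it shows only the structure of the point-addition/point-doubling schedule.

```python
# Toy-scale sketch of double-and-add scalar multiplication; the curve
# y^2 = x^3 + 2x + 3 over GF(97) is an illustrative stand-in, not SM2's curve.
P_MOD, A, B = 97, 2, 3

def inv_mod(x, p=P_MOD):
    return pow(x, p - 2, p)     # Fermat inverse; the hardware uses ext. Euclid

def point_add(P, Q):
    if P is None: return Q      # None represents the point at infinity
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if P == Q:                  # point doubling
        lam = (3 * x1 * x1 + A) * inv_mod(2 * y1) % P_MOD
    else:                       # point addition
        lam = (y2 - y1) * inv_mod(x2 - x1) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    y3 = (lam * (x1 - x3) - y1) % P_MOD
    return (x3, y3)

def scalar_mult(k, P):
    """Left-to-right double-and-add: one doubling per bit, one addition per 1-bit."""
    R = None
    for bit in bin(k)[2:]:
        R = point_add(R, R)
        if bit == "1":
            R = point_add(R, P)
    return R

G = (3, 6)                       # a point on the toy curve
assert (G[1] ** 2 - (G[0] ** 3 + A * G[0] + B)) % P_MOD == 0
print(scalar_mult(20, G))
```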
Procedia PDF Downloads 100
28130 An Experimental Study on the Temperature Reduction of Exhaust Gas at a Snorkeling of Submarine
Authors: Seok-Tae Yoon, Jae-Yeong Choi, Gyu-Mok Jeon, Yong-Jin Cho, Jong-Chun Park
Abstract:
Conventional submarines obtain propulsive force by using an electric propulsion system consisting of a diesel generator, battery, motor, and propeller. When underwater, the submarine uses the electric power stored in the battery. After a certain amount of electric power is consumed, the submarine floats near the sea surface and recharges by using the diesel generator. The voyage carried out while charging is called snorkeling, and the high-temperature exhaust gas from the diesel generator forms a heat distribution on the sea surface. This heat distribution is detected by weapon systems equipped with thermo-detectors, and this is the main cause of reduced submarine survivability. In this paper, an experimental study was carried out to establish optimal operating conditions of a submarine for the reduction of the infrared signature radiated from the sea surface. For this, a hot gas generating system and a round acrylic water tank with adjustable water level were made. The control variables of the experiment were set as the mass flow rate, the temperature difference between the water in the tank and the hot gas, and the water level difference between the air outlet and the water surface. The experimental instrumentation used a T-type thermocouple to measure the released air temperature at the surface of the water, and a thermography system to measure the thermal energy distribution on the water surface. As a result of the experimental study, we analyzed the correlation between the final released temperature at the exhaust pipe exit of a submarine and the depth of the snorkel, and presented reasonable operating conditions for infrared signature reduction of the submarine.
Keywords: experiment study, flow rate, infrared signature, snorkeling, thermography
Procedia PDF Downloads 352
28129 Uncertainty and Optimization Analysis Using PETREL RE
Authors: Ankur Sachan
Abstract:
The ability to make quick yet intelligent and value-added decisions when developing new fields has always been of great significance. In situations where the capital expenses and subsurface risk are high, carefully analyzing the inherent uncertainties in the reservoir and how they impact the predicted hydrocarbon accumulation and production becomes a daunting task. The problem is compounded in offshore environments, especially in the presence of heavy oils and disconnected sands, where the margin for error is small. Uncertainty refers to the degree to which the data set may be in error or stray from the predicted values. Understanding and quantifying the uncertainties in the reservoir model is important when estimating reserves. Uncertainty parameters can be geophysical, geological, petrophysical, etc., and identification of these parameters is necessary to carry out the uncertainty analysis. With so many uncertainties working at different scales, it becomes essential to have a consistent and efficient way of incorporating them into the analysis. Ranking the uncertainties based on their impact on reserves helps to prioritize and guide future data gathering and uncertainty reduction efforts. Assigning probabilistic ranges to key uncertainties also enables the computation of probabilistic reserves. With this in mind, this paper, with the help of the uncertainty and optimization process in Petrel RE, shows how the most influential uncertainties can be determined efficiently and how much impact they have on the reservoir model, thus helping to determine a cost-effective and accurate model of the reservoir.
Keywords: uncertainty, reservoir model, parameters, optimization analysis
Procedia PDF Downloads 661
28128 Democracy Bytes: Interrogating the Exploitation of Data Democracy by Radical Terrorist Organizations
Authors: Nirmala Gopal, Sheetal Bhoola, Audecious Mugwagwa
Abstract:
This paper discusses the continued infringement and exploitation of data by non-state actors for destructive purposes, with an emphasis on radical terrorist organizations. It discusses how terrorist organizations access and use data to foster their nefarious agendas. It further examines how cybersecurity, designed as a tool to curb data exploitation, has been ineffective in addressing global citizens' concerns about how their data can be kept safe and used for its intended purpose. The study interrogates several policies and data protection instruments, such as the Data Protection Act, cyber security policies, Protection of Personal Information (PPI), and the General Data Protection Regulation (GDPR), to understand data use and storage in democratic states. The study outcomes point to the fact that international cybersecurity and cybercrime legislation, policies, and conventions have not curbed violations of data access and use by radical terrorist groups. The study recommends ways to enhance cybersecurity and reduce cyber risks using democratic principles.
Keywords: cybersecurity, data exploitation, terrorist organizations, data democracy
Procedia PDF Downloads 205
28127 Healthcare Data Mining Innovations
Authors: Eugenia Jilinguirian
Abstract:
In the healthcare industry, data mining is essential since it transforms the field by extracting useful information from large datasets. Data mining is the process of applying advanced analytical methods to large patient records and medical histories in order to identify patterns, correlations, and trends. Healthcare professionals can improve diagnostic accuracy, uncover hidden linkages, and predict disease outcomes by carefully examining these statistics. Additionally, data mining supports personalized medicine by tailoring treatment to the unique attributes of each patient. This proactive strategy helps allocate resources more efficiently, enhances patient care, and streamlines operations. To apply data mining effectively and ensure the responsible use of private healthcare information, however, issues like data privacy and security must be carefully considered. Data mining continues to be vital in the search for more effective, efficient, and individualized healthcare solutions as technology evolves.
Keywords: data mining, healthcare, big data, individualised healthcare, healthcare solutions, database
Procedia PDF Downloads 68
28126 Iranian Sexual Health Needs in Viewpoint of Policy Makers: A Qualitative Study
Authors: Mahnaz Motamedi, Mohammad Shahbazi, Shahrzad Rahimi-Naghani, Mehrdad Salehi
Abstract:
Introduction: Identifying sexual health needs, developing appropriate plans, and delivering services to meet those needs are essential components of health programs for women, men, and children all over the world, especially in poor countries. Main Subject: The aim of this study was to describe sexual health needs from the viewpoint of health policymakers in Iran. Methods: A qualitative study using thematic content analysis was designed and conducted. Data gathering was conducted through semi-structured, in-depth interviews with 25 key informants within the healthcare system. Key informants were selected through both purposive and snowball sampling. MAXQDA software (version 10) was used to facilitate transcription, classification of codes, and conversion of data into meaningful units through the process of reduction and compression. Results: The analysis of narratives and information categorized sexual health needs into five categories: culturalization of sexual health discourse, sexual health care services, sexual health educational needs, sexual health research needs, and organizational needs. Conclusion: Identifying and explaining sexual health needs is an important factor in determining the priority of sexual health programs and in identifying barriers to meeting these needs. This can help other policymakers and health planners to develop appropriate programs to promote sexual and reproductive health.
Keywords: sexual health, sexual health needs, policy makers, health system, qualitative study
Procedia PDF Downloads 223
28125 Summarizing Data Sets for Data Mining by Using Statistical Methods in Coastal Engineering
Authors: Yunus Doğan, Ahmet Durap
Abstract:
Coastal regions are among the most heavily used areas, shaped by both the natural balance and a growing population. In coastal engineering, the most valuable data concern wave behavior. The amount of such data becomes very large because of observations that take place over periods of hours, days, and months. In this study, statistical methods such as wave spectrum analysis methods and standard statistical methods have been used. The goal of this study is to discover profiles of different coastal areas by using these statistical methods, and thus to obtain an instance-based data set from the big data for analysis with data mining algorithms. In the experimental studies, six sample data sets about wave behavior, obtained from 20-minute observations in Mersin Bay, Turkey, were converted to an instance-based form, and different clustering techniques from data mining were used to discover similar coastal places. Moreover, this study discusses how this summarization approach can be used in other branches that collect big data, such as medicine.
Keywords: clustering algorithms, coastal engineering, data mining, data summarization, statistical methods
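The summarization-then-clustering workflow can be sketched as follows: each 20-minute record is reduced to a short feature vector (an instance), and the instances are clustered to group similar coastal places. The wave-height series, the chosen statistics, and the use of k-means are assumptions for illustration only; they are not the Mersin Bay data or the exact techniques reported in the study.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical 20-minute wave-height records (metres) from six coastal
# stations; synthetic values stand in for the Mersin Bay observations.
rng = np.random.default_rng(3)
records = [rng.normal(loc=m, scale=s, size=1200)
           for m, s in [(0.6, 0.2), (0.7, 0.25), (1.8, 0.5),
                        (1.7, 0.45), (0.65, 0.22), (1.9, 0.55)]]

def summarize(series):
    """Reduce one record to an instance of simple statistical features."""
    return [series.mean(), series.std(),
            np.percentile(series, 90), series.max() - series.min()]

instances = np.array([summarize(r) for r in records])   # instance-based set
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(instances)
print("Station cluster labels:", labels)
```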
Procedia PDF Downloads 361