Search results for: powder processing
2591 Tele-Monitoring and Logging of Patient Health Parameters Using Zigbee
Authors: Kirubasankar, Sanjeevkumar, Aravindh Nagappan
Abstract:
This paper presents a system for monitoring patients using biomedical sensors and displaying the data at a remote location. The main challenges with present health-monitoring devices are the lack of remote monitoring and of logging for future evaluation. Typical instruments used for health parameter measurement provide only basic information about health status. This paper identifies a set of design principles to address these challenges. The system includes continuous measurement of health parameters such as heart rate, electrocardiogram, SpO2 level, and body temperature. The accumulated sensor data are relayed to a processing device using a transceiver and viewed through cloud services.
Keywords: bio-medical sensors, monitoring, logging, cloud service
Procedia PDF Downloads 520
2590 Apatite Flotation Using Fruits' Oil as Collector and Sorghum as Depressant
Authors: Elenice Maria Schons Silva, Andre Carlos Silva
Abstract:
The growing demand for raw materials has increased mining activities. The mineral industry faces the challenge of processing more complex ores, with very small particles and low grades, together with constant pressure to reduce production costs and environmental impacts. Froth flotation deserves special attention among the concentration methods for mineral processing. Besides its great selectivity for different minerals, flotation is a highly efficient method for processing fine particles. The process is based on the surficial physicochemical properties of the minerals, and the separation is only possible with the aid of chemicals such as collectors, frothers, modifiers, and depressants. In order to use sustainable and eco-friendly reagents, oils extracted from three different vegetable species (pequi's pulp, macauba's nut and pulp, and Jatropha curcas) were studied and tested as apatite collectors. Since the oils are not soluble in water, an alkaline hydrolysis (saponification) was necessary before their contact with the minerals. The saponification was performed at room temperature. The tests with the new collectors were carried out at pH 9, and Flotigam 5806, a synthetic mix of fatty acids manufactured by Clariant and industrially adopted as an apatite collector, was used as benchmark. In order to find a feasible replacement for cornstarch, the flour and starch of a graniferous variety of sorghum were tested as depressants. Apatite samples were used in the flotation tests. XRF (X-ray fluorescence), XRD (X-ray diffraction), and SEM/EDS (scanning electron microscopy with energy-dispersive spectroscopy) were used to characterize the apatite samples. Zeta potential measurements were performed in the pH range from 3.5 to 12.5. A commercial cornstarch was used as the depressant benchmark. Four depressant dosages and pH values were tested, and a statistical test was used to verify the influence of pH, dosage, and starch type on the mineral recoveries. For dosages equal to or higher than 7.5 mg/L, pequi oil recovered almost all apatite particles. On one hand, macauba's pulp oil showed excellent results for all dosages, with more than 90% apatite recovery; on the other hand, with the nut oil, the highest recovery found was around 84%. Jatropha curcas oil was the second best oil tested, and more than 90% of the apatite particles were recovered at a dosage of 7.5 mg/L. Regarding the depressants, the lowest apatite recovery with sorghum starch was found at a dosage of 1,200 g/t and pH 11, resulting in a recovery of 1.99%. The apatite recovery under the same conditions was 1.40% for sorghum flour (approximately 30% lower). Compared with cornstarch under the same conditions, sorghum flour produced an apatite recovery 91% lower.
Keywords: collectors, depressants, flotation, mineral processing
Procedia PDF Downloads 152
2589 A Low-Area Fully-Reconfigurable Hardware Design of Fast Fourier Transform System for 3GPP-LTE Standard
Authors: Xin-Yu Shih, Yue-Qu Liu, Hong-Ru Chou
Abstract:
This paper presents a low-area, fully reconfigurable Fast Fourier Transform (FFT) hardware design for the 3GPP-LTE communication standard. It fully supports 32 different FFT sizes, up to 2048 FFT points. In addition, a special processing element is developed to enable reconfigurable computing, and a first-in first-out (FIFO) scheduling scheme is proposed for hardware-friendly FIFO resource arrangement. In a chip realization synthesized with TSMC 40 nm CMOS technology, the hardware circuit occupies a core area of only 0.2325 mm² and dissipates 233.5 mW at a maximum operating frequency of 250 MHz.
Keywords: reconfigurable, fast Fourier transform (FFT), single-path delay feedback (SDF), 3GPP-LTE
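For readers who want a software reference for the kind of computation the hardware performs, a minimal radix-2 Cooley-Tukey FFT in Python is sketched below, checked against NumPy for a 2048-point transform. This is only an illustrative model, not the paper's single-path delay feedback hardware architecture, and it handles power-of-two sizes only (the design itself supports 32 LTE sizes).

```python
import numpy as np

def fft_radix2(x):
    """Recursive radix-2 Cooley-Tukey FFT (software reference, power-of-two lengths only)."""
    x = np.asarray(x, dtype=complex)
    n = len(x)
    if n == 1:
        return x
    if n % 2:
        raise ValueError("length must be a power of two for this sketch")
    even = fft_radix2(x[0::2])
    odd = fft_radix2(x[1::2])
    twiddle = np.exp(-2j * np.pi * np.arange(n // 2) / n)
    return np.concatenate([even + twiddle * odd, even - twiddle * odd])

if __name__ == "__main__":
    n = 2048  # largest FFT size supported by the paper's design
    x = np.random.randn(n) + 1j * np.random.randn(n)
    assert np.allclose(fft_radix2(x), np.fft.fft(x))
    print("radix-2 FFT matches numpy.fft.fft for N =", n)
```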
Procedia PDF Downloads 278
2588 Eco-Efficient Cementitious Materials for Construction Applications in Ireland
Authors: Eva Ujaczki, Rama Krishna Chinnam, Ronan Courtney, Syed A. M. Tofail, Lisa O'Donoghue
Abstract:
Concrete is the second most widely used material in the world and is made of cement, sand, and aggregates. Cement is a hydraulic binder that reacts with water to form a solid material. In the cement manufacturing process, the right mix of minerals from mined natural rocks, e.g., limestone, is fired in a kiln at 1450 °C to form a new compound, clinker. In the final stage, the clinker is milled into a fine cement powder. The principal cement types manufactured in Ireland are: 1) CEM I – Portland cement; 2) CEM II/A – Portland-fly ash cement; 3) CEM II/A – Portland-limestone cement; and 4) CEM III/A – Portland-ground granulated blast furnace slag (GGBS) cement. The production of eco-efficient, blended cements (CEM II, CEM III) reduces CO₂ emissions and improves energy efficiency compared to traditional cements. Blended cements are produced locally in Ireland, and more than 80% of the cement produced is blended. These eco-efficient, blended cements are a relatively new class of construction materials and a kind of geopolymer binder. From a terminological point of view, geopolymer cement is a binding system that is able to harden at room temperature. Geopolymers do not require calcium-silicate-hydrate gel but utilize the polycondensation of SiO₂ and Al₂O₃ precursors to achieve a superior strength level. Geopolymer materials are usually synthesized using an aluminosilicate raw material and an activating solution mainly composed of NaOH or KOH and Na₂SiO₃. Cement is the essential ingredient in concrete, which is vital for the economic growth of countries. The challenge for the global cement industry is to meet increasing demand while recognizing the need for sustainable use of resources. Therefore, in this research, we investigated the potential for Irish wastes to be used in geopolymer-cement-type applications through a national stakeholder workshop with the Irish construction sector and relevant stakeholders. This paper summarizes the Irish stakeholders' perspective on introducing new secondary raw materials, e.g., bauxite residue, or increasing the fly ash addition in cement for eco-efficient cement production.
Keywords: eco-efficient, cement, geopolymer, blending
Procedia PDF Downloads 166
2587 Parallel 2-Opt Local Search on GPU
Authors: Wen-Bao Qiao, Jean-Charles Créput
Abstract:
To accelerate the solution of large-scale traveling salesman problems (TSP), a parallel 2-opt local search algorithm with a simple GPU-based implementation is presented and tested in this paper. The parallel scheme is based on data decomposition: multiple (K) processors are dynamically assigned along the integral tour to perform the 2-opt local optimization of K edges simultaneously on independent sub-tours, where K can be user-defined or a function of the input size N. We implement this algorithm with a doubly linked list on the GPU, and the implementation requires only O(N) memory. We compare this parallel 2-opt local optimization against a sequential exhaustive 2-opt search along the integral tour on TSP instances from TSPLIB with more than 10,000 cities.
Keywords: parallel 2-opt, double links, large scale TSP, GPU
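For context, the sequential exhaustive 2-opt search used as the baseline can be sketched in Python as below; this illustrative version works on a random instance with array reversal rather than the paper's GPU doubly linked list, and the instance size and coordinates are invented for the demo.

```python
import random
import math

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    """Sequential exhaustive 2-opt: keep applying improving edge exchanges until none remain."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[(j + 1) % len(tour)]
                delta = (math.dist(pts[a], pts[c]) + math.dist(pts[b], pts[d])
                         - math.dist(pts[a], pts[b]) - math.dist(pts[c], pts[d]))
                if delta < -1e-10:
                    tour[i:j + 1] = reversed(tour[i:j + 1])  # apply the 2-opt move
                    improved = True
    return tour

if __name__ == "__main__":
    random.seed(0)
    pts = [(random.random(), random.random()) for _ in range(200)]
    tour = list(range(len(pts)))
    print("initial length:", round(tour_length(tour, pts), 3))
    tour = two_opt(tour, pts)
    print("after 2-opt:   ", round(tour_length(tour, pts), 3))
```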
Procedia PDF Downloads 625
2586 Teachers' Perceptions of Physical Education and Sports Calendar and Conducted in the Light of the Objective of the Lesson Approach Competencies
Authors: Chelali Mohammed
Abstract:
In the context of applying the competency-based approach in the Algerian educational system, the physical education and sport lesson must privilege the acquisition of learning approaches, and especially the scientific approach, which, starting from problem situations, develops information research and processing and the application of knowledge and know-how in new situations; in the words of John Dewey, 'learning by practice'. To achieve these goals and make physical education teaching more motivating, consistent, and concrete, it is appropriate to adopt a pedagogical approach freed from constraints, open to creativity, and student-centered, in light of the competency approach adopted in the formal curriculum. This approach is not unusual, but we think it is highly professional in nature and requires the competence of the teacher.
Keywords: competency approach, physical education, teachers
Procedia PDF Downloads 603
2585 Biofilm Text Classifiers Developed Using Natural Language Processing and Unsupervised Learning Approach
Authors: Kanika Gupta, Ashok Kumar
Abstract:
Biofilms are dense, highly hydrated cell clusters that are irreversibly attached to a substratum, to an interface, or to each other, and are embedded in a self-produced gelatinous matrix composed of extracellular polymeric substances. Research in the biofilm field has become very significant, as biofilms show high mechanical resilience and resistance to antibiotic treatment and constitute a significant problem both in healthcare and in other industries affected by microorganisms. The massive amount of information, both stated and hidden, in the biofilm literature is growing exponentially; therefore it is not currently possible for researchers and practitioners to automatically extract and relate information from the different written resources. The current work therefore proposes and discusses the use of text mining techniques for the extraction of information from a biofilm literature corpus containing 34,306 documents. It is very difficult and expensive to obtain annotated material for biomedical literature, as the literature is unstructured, i.e., free text. Therefore, we considered an unsupervised approach, where no annotated training data are necessary, and using this approach we developed a system that classifies the text on the basis of growth and development, drug effects, radiation effects, classification, and physiology of biofilms. For this, a two-step structure was used: the first step is to extract keywords from the biofilm literature using a metathesaurus and standard natural language processing tools such as Rapid Miner_v5.3, and the second step is to discover relations between the genes extracted from the whole set of biofilm literature using pubmed.mineR_v1.0.11. We applied the unsupervised approach, which is the machine learning task of inferring a function to describe hidden structure from 'unlabeled' data, to the above-extracted datasets to develop classifiers, using the WinPython-64 bit_v3.5.4.0Qt5 and R studio_v0.99.467 packages, which automatically classify the text into the mentioned sets. The developed classifiers were tested on a large data set of biofilm literature, which showed that the proposed unsupervised approach is promising as well as suited for a semi-automatic labeling of the extracted relations. All of the information was stored in a relational database hosted locally on the server. The generated biofilm vocabulary and gene relations will be significant for researchers dealing with biofilm research, making their searches easy and efficient, as the keywords and genes can be directly mapped to the documents used for database development.
Keywords: biofilms literature, classifiers development, text mining, unsupervised learning approach, unstructured data, relational database
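A minimal sketch of an unsupervised keyword-and-clustering pipeline of the kind described above is given below using scikit-learn; it is a hypothetical stand-in for the authors' Rapid Miner_v5.3 / pubmed.mineR_v1.0.11 workflow, with a handful of toy sentences in place of the 34,306-document corpus and generic clusters in place of the topic sets listed in the abstract.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy stand-ins for biofilm abstracts; the real corpus held 34,306 documents.
docs = [
    "biofilm growth and development on a catheter surface",
    "antibiotic drug effects on mature biofilm matrix",
    "uv radiation effects on biofilm viability",
    "physiology of biofilm cells in the extracellular matrix",
    "drug resistance and antibiotic treatment of biofilms",
    "development of biofilm structure during early growth",
]

# Step 1: turn free text into TF-IDF keyword features (no annotation needed).
vectorizer = TfidfVectorizer(stop_words="english")
features = vectorizer.fit_transform(docs)

# Step 2: group documents into topical clusters without labels.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(features)

terms = vectorizer.get_feature_names_out()
for cluster in range(3):
    centroid = kmeans.cluster_centers_[cluster]
    top = [terms[i] for i in centroid.argsort()[::-1][:3]]
    members = [i for i, lab in enumerate(kmeans.labels_) if lab == cluster]
    print(f"cluster {cluster}: top terms {top}, documents {members}")
```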
Procedia PDF Downloads 170
2584 Bioavailability Enhancement of Ficus religiosa Extract by Solid Lipid Nanoparticles
Authors: Sanjay Singh, Karunanithi Priyanka, Ramoji Kosuru, Raju Prasad Sharma
Abstract:
Herbal drugs are well known for their mixed pharmacological activities with the benefit of no harmful side effects. The use of herbal drugs is limited because of their higher dose requirements, frequent administration, poor bioavailability of phytochemicals, and delayed onset of action. Ficus religiosa, a potent anti-oxidant plant useful in the treatment of diabetes and cancer, was selected for the study. Solid lipid nanoparticles (SLN) of Ficus religiosa extract were developed to enhance the oral bioavailability of stigmasterol and β-sitosterol-d-glucoside, the principal components present in the extract. Hot homogenization followed by ultrasonication was used to prepare the extract-loaded SLN. The developed extract-loaded SLN were characterized for particle size, PDI, zeta potential, entrapment efficiency, in vitro drug release and release kinetics, Fourier-transform infrared spectroscopy, differential scanning calorimetry, powder X-ray diffractometry, and stability. The entrapment efficiency of the optimized extract-loaded SLN was found to be 68.46% (56.13% for stigmasterol and 12.33% for β-sitosteryl-d-glucoside). An RP-HPLC method was developed for the simultaneous estimation of stigmasterol and β-sitosterol-d-glucoside from Ficus religiosa extract in rat plasma. Bioavailability studies were carried out for the extract in suspension form and for the optimized extract-loaded SLN. The AUC of stigmasterol and β-sitosterol-d-glucoside increased by 6.7-fold and 9.2-fold, respectively, in rats treated with extract-loaded SLN compared to the extract suspension. Also, the Cmax of stigmasterol and β-sitosterol-d-glucoside increased by 4.3-fold and 3.9-fold, respectively, in rats treated with extract-loaded SLN compared to the extract suspension. The mean residence times (MRT) for stigmasterol were 12.3 ± 0.67 hours from the extract and 7.4 ± 2.1 hours from the SLN, and for β-sitosterol-d-glucoside, 10.49 ± 2.9 hours from the extract and 6.4 ± 0.3 hours from the SLN. Hence, it was concluded that the SLN enhanced the bioavailability and reduced the MRT of stigmasterol and β-sitosterol-d-glucoside from Ficus religiosa extract, which in turn may lead to a reduction in the dose of Ficus religiosa extract, a prolonged duration of action, and enhanced therapeutic efficacy.
Keywords: Ficus religiosa, phytosterolins, bioavailability, solid lipid nanoparticles, stigmasterol and β-sitosteryl-d-glucoside
Procedia PDF Downloads 473
2583 Structural Evolution of Na₆Mn(SO₄)₄ from High-Pressure Synchrotron Powder X-ray Diffraction
Authors: Monalisa Pradhan, Ajana Dutta, Irshad Kariyattuparamb Abbas, Boby Joseph, T. N. Guru Row, Diptikanta Swain, Gopal K. Pradhan
Abstract:
Compounds with the vanthoffite crystal structure, with the general formula Na₆M(SO₄)₄ (M = Mg, Mn, Ni, Co, Fe, Cu, and Zn), display a variety of intriguing physical properties intimately related to their structural arrangements. The compound Na₆Mn(SO₄)₄ shows antiferromagnetic ordering at low temperature, where the in-plane Mn-O•••O-Mn interactions facilitate the antiferromagnetic ordering via a super-exchange interaction between the Mn atoms through the oxygen atoms. The inter-atomic bond distances and angles can easily be tuned by applying external pressure and can be probed using high-resolution X-ray diffraction. Moreover, because the magnetic interaction among the Mn atoms is of the super-exchange type via the Mn-O•••O-Mn path, the variation of the Mn-O•••O-Mn dihedral angle and Mn-O bond distances under high pressure inevitably affects the magnetic properties. Therefore, high-pressure studies on these magnetically ordered materials would shed light on the interplay between their structural properties and magnetic ordering and confirm the role of buckling of the Mn-O polyhedra in understanding the origin of antiferromagnetism. In this context, we carried out pressure-dependent X-ray diffraction measurements in a diamond anvil cell (DAC) up to a maximum pressure of 17 GPa to study the phase transition and determine the equation of state from the volume compression data. Upon increasing the pressure, we did not observe any new diffraction peaks or sudden discontinuities in the pressure dependence of the d values up to the maximum achieved pressure of ~17 GPa. However, beyond 12 GPa the a and b lattice parameters become identical, while there is a discontinuity in the β value around the same pressure. This indicates a subtle transition to a pseudo-monoclinic phase. Using the third-order Birch-Murnaghan equation of state (EOS) to fit the volume compression data over the entire range, we found the bulk modulus (B₀) to be 44 GPa. Considering the subtle transition at 12 GPa, we also fitted a separate equation of state to the volume data beyond 12 GPa using the second-order Birch-Murnaghan EOS, which gives a bulk modulus of ~34 GPa for this phase.
Keywords: mineral, structural phase transition, high pressure XRD, spectroscopy
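The bulk moduli quoted above come from fitting Birch-Murnaghan equations of state to the volume compression data. A minimal Python sketch of such a fit is shown below, using synthetic P-V points rather than the authors' measurements; the third-order form used here reduces to the second-order form when B0' is fixed at 4.

```python
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan_3rd(v, v0, b0, b0_prime):
    """Third-order Birch-Murnaghan equation of state, P(V) in the same units as B0."""
    eta = (v0 / v) ** (2.0 / 3.0)
    return 1.5 * b0 * (eta ** 3.5 - eta ** 2.5) * (1.0 + 0.75 * (b0_prime - 4.0) * (eta - 1.0))

rng = np.random.default_rng(1)
# Synthetic normalized volumes and pressures standing in for the measured compression curve.
v_true, b0_true, b0p_true = 1.0, 44.0, 4.5
volumes = np.linspace(1.0, 0.80, 20)
pressures = birch_murnaghan_3rd(volumes, v_true, b0_true, b0p_true)
pressures += rng.normal(scale=0.1, size=pressures.size)  # simulated measurement noise, GPa

popt, pcov = curve_fit(birch_murnaghan_3rd, volumes, pressures, p0=[1.0, 50.0, 4.0])
v0_fit, b0_fit, b0p_fit = popt
print(f"fitted V0 = {v0_fit:.3f}, B0 = {b0_fit:.1f} GPa, B0' = {b0p_fit:.2f}")
```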
Procedia PDF Downloads 87
2582 Advanced Techniques in Semiconductor Defect Detection: An Overview of Current Technologies and Future Trends
Authors: Zheng Yuxun
Abstract:
This review critically assesses the advancements and prospective developments in defect detection methodologies within the semiconductor industry, an essential domain that significantly affects the operational efficiency and reliability of electronic components. As semiconductor devices continue to decrease in size and increase in complexity, the precision and efficacy of defect detection strategies become increasingly critical. Tracing the evolution from traditional manual inspections to the adoption of advanced technologies employing automated vision systems, artificial intelligence (AI), and machine learning (ML), the paper highlights the significance of precise defect detection in semiconductor manufacturing. It discusses various defect types, such as crystallographic errors, surface anomalies, and chemical impurities, which profoundly influence the functionality and durability of semiconductor devices, underscoring the necessity for their precise identification. The narrative then turns to the technological evolution in defect detection, depicting a shift from rudimentary methods like optical microscopy and basic electrical tests to more sophisticated techniques including electron microscopy, X-ray imaging, and infrared spectroscopy. The incorporation of AI and ML marks a pivotal advancement towards more adaptive, accurate, and expedited defect detection mechanisms. The paper addresses current challenges, particularly the constraints imposed by the diminutive scale of contemporary semiconductor devices, the elevated costs associated with advanced imaging technologies, and the demand for rapid processing that aligns with mass-production standards. A critical gap is identified between the capabilities of existing technologies and the industry's requirements, especially concerning scalability and processing speed. Future research directions are proposed to bridge these gaps, suggesting enhancements in the computational efficiency of AI algorithms, the development of novel materials to improve imaging contrast in defect detection, and the seamless integration of these systems into semiconductor production lines. By offering a synthesis of existing technologies and forecasting upcoming trends, this review aims to foster the dialogue around, and the development of, more effective defect detection methods, thereby facilitating the production of more dependable and robust semiconductor devices. This analysis not only elucidates the current technological landscape but also paves the way for forthcoming innovations in semiconductor defect detection.
Keywords: semiconductor defect detection, artificial intelligence in semiconductor manufacturing, machine learning applications, technological evolution in defect analysis
Procedia PDF Downloads 51
2581 Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features
Authors: Vesna Kirandziska, Nevena Ackovska, Ana Madevska Bogdanova
Abstract:
The problem of emotion recognition is a challenging one; it is still an open problem from the perspective of both intelligent systems and psychology. In this paper, both voice features and facial features are used to build an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are given, and a discussion about the validity and the expressiveness of different emotions is presented. A comparison is made between the classifiers built from facial data only, from voice data only, and from the combination of both. The need for a better combination of the information from facial expressions and voice data is argued.
Keywords: emotion recognition, facial recognition, signal processing, machine learning
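A minimal sketch of the comparison described above, with Support Vector Machine classifiers trained on voice features only, facial features only, and the two concatenated, is given below using scikit-learn; the random feature matrices and six-class labels are placeholders for the features actually extracted from the video recordings.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_samples, n_voice, n_face = 300, 20, 40
labels = rng.integers(0, 6, size=n_samples)        # e.g., six emotion classes

# Placeholder features; class-dependent shifts stand in for real voice/facial descriptors.
voice = rng.normal(size=(n_samples, n_voice)) + labels[:, None] * 0.3
face = rng.normal(size=(n_samples, n_face)) + labels[:, None] * 0.2
combined = np.hstack([voice, face])

for name, features in [("voice only", voice), ("face only", face), ("voice + face", combined)]:
    x_tr, x_te, y_tr, y_te = train_test_split(features, labels, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf", C=1.0).fit(x_tr, y_tr)
    print(f"{name:12s} accuracy: {accuracy_score(y_te, clf.predict(x_te)):.2f}")
```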
Procedia PDF Downloads 316
2580 Thoughts on the Informatization Technology Innovation of Cores and Samples in China
Authors: Honggang Qu, Rongmei Liu, Bin Wang, Yong Xu, Zhenji Gao
Abstract:
There is a large gap in the capability and level of informatization technology innovation for cores and samples compared with developed countries. Against the current background of promoting technological innovation, how to strengthen the informatization technology innovation of cores and samples at the National Cores and Samples Archives, which is a national innovation research center, is an important research topic. This paper summarizes the development status of cores and samples informatization technology, identifies the gaps and deficiencies, and proposes innovation research directions and content, including data extraction, recognition, processing, integration, and application, so as to provide reference and guidance for future innovation research at the archives and to better support geological technology innovation in China.
Keywords: cores and samples, informatization technology, innovation, suggestion
Procedia PDF Downloads 126
2579 Adaptation of Hough Transform Algorithm for Text Document Skew Angle Detection
Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye
Abstract:
Skew detection and correction form an important part of digital document analysis, because uncompensated skew can deteriorate document features and complicate further document image processing steps. Efficient text document analysis and digitization can rarely be achieved when a document is skewed even at a small angle. Once a document has been digitized through the scanning system and binarized, skew correction is required before further image analysis. Research efforts have been devoted to this area, with algorithms developed to eliminate document skew. Skew angle correction algorithms can be compared based on performance criteria; the most important are the accuracy of skew angle detection, the range of detectable skew angles, the processing speed, the computational complexity, and consequently the memory space used. The standard Hough transform has been successfully applied to text document skew angle estimation. However, the accuracy of the standard Hough transform algorithm depends largely on how fine the angular step size is; finer steps consume more time and memory space, especially where the number of pixels is considerably large. Whenever the Hough transform is used, there is always a tradeoff between accuracy and speed, so a more efficient solution is needed that optimizes space as well as time. In this paper, an improved Hough transform (HT) technique that optimizes space as well as time to robustly detect document skew is presented. The modified Hough transform algorithm resolves the conflict between memory space, running time, and accuracy. Our algorithm starts by estimating the angle, accurate to zero decimal places, using the standard Hough transform, achieving minimal running time and memory use but limited accuracy. Then, to increase accuracy, if the estimated angle found using the basic Hough algorithm is x degrees, the basic algorithm is run again over a narrow range around x degrees with an accuracy of one decimal place. The same process is iterated until the desired level of accuracy is achieved. The skew estimation and correction procedure for text images is implemented using MATLAB. The memory space and processing time are also tabulated, with the skew angle assumed to lie between 0° and 45°. The simulation results, demonstrated in MATLAB, show the high performance of our algorithm, with less computational time and memory space used in detecting document skew for a variety of documents with different levels of complexity.
Keywords: hough-transform, skew-detection, skew-angle, skew-correction, text-document
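To illustrate the coarse-then-fine angle search (the paper's own implementation is in MATLAB), a Python sketch using scikit-image's Hough transform is given below; the synthetic 'text baseline' image, the step sizes, and the sign convention for the recovered skew are illustrative assumptions, not the paper's algorithm verbatim.

```python
import numpy as np
from skimage.transform import hough_line

def hough_peak_angle(image, angles_deg):
    """Return the Hough angle (degrees) with the strongest accumulator response."""
    h, thetas, _ = hough_line(image, theta=np.deg2rad(angles_deg))
    _, angle_idx = np.unravel_index(np.argmax(h), h.shape)
    return np.rad2deg(thetas[angle_idx])

# Synthetic binary page: a few "text baselines" drawn with a known skew.
true_skew = 3.7                                   # degrees
img = np.zeros((400, 400), dtype=bool)
cols = np.arange(400)
for row0 in range(40, 360, 40):
    rows = (row0 + np.tan(np.deg2rad(true_skew)) * cols).astype(int)
    keep = (rows >= 0) & (rows < 400)
    img[rows[keep], cols[keep]] = True

# Coarse pass: whole angular range at 1-degree resolution.
coarse = hough_peak_angle(img, np.arange(-90.0, 90.0, 1.0))
# Fine pass: 0.05-degree resolution in a narrow band around the coarse estimate.
fine = hough_peak_angle(img, np.arange(coarse - 1.0, coarse + 1.0, 0.05))

# Text baselines are near-horizontal, so the line normal sits near +/-90 degrees;
# the sign mapping depends on the image coordinate convention.
skew = fine + 90.0 if fine < 0 else fine - 90.0
print(f"true skew {true_skew:.2f} deg, estimated {skew:.2f} deg")
```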
Procedia PDF Downloads 159
2578 Development of Methotrexate Nanostructured Lipid Carriers for Topical Treatment of Psoriasis: Optimization, Evaluation, and in vitro Studies
Authors: Yogeeta O. Agrawal, Hitendra S. Mahajan, Sanjay J. Surana
Abstract:
Methotrexate is effective in controlling recalcitrant psoriasis when administered by the oral or parenteral route long-term. However, the systemic use of this drug may provoke any of a number of side effects, notably hepatotoxic effects. To reduce these effects, clinical studies have been done with topical MTx, which is useful in treating a number of cutaneous conditions, including psoriasis. A major problem with the topical administration of the MTx formulations currently available on the market is that the drug is hydrosoluble and is mostly in the dissociated form at physiological pH; its capacity for passive diffusion is thus limited. Localization of MTx in the affected layers of skin is likely to improve the role of the topical dosage form of the drug as a supplement to oral therapy for the treatment of psoriasis. One of the possibilities for increasing the penetration of drugs through the skin is the use of nanostructured lipid carriers (NLCs). The objective of the present study was to formulate and characterize methotrexate-loaded nanostructured lipid carriers (MtxNLCs), to understand the in vitro drug release, and to evaluate the role of the developed gel in the topical treatment of psoriasis. MtxNLCs were prepared by a solvent diffusion technique using a 3² full factorial design. The mean diameter and surface morphology of the MtxNLCs were evaluated. The MtxNLCs were lyophilized, and the crystallinity of the NLCs was characterized by differential scanning calorimetry (DSC) and powder X-ray diffraction (XRD). The NLCs were incorporated into a 1% w/w Carbopol 934P gel base, and in vitro skin deposition studies in human cadaver skin were conducted. The optimized MtxNLCs were spherical in shape, with an average particle size of 253 (±9.92) nm, a zeta potential of -30.4 (±0.86) mV, and an EE of 53.12 (±1.54)%. DSC and XRD data confirmed the formation of the NLCs. Significantly higher deposition of methotrexate was found in human cadaver skin from the MtxNLC gel (71.52 ±1.23%) as compared to the plain MTx gel (54.28 ±1.02%). The findings of the study suggest that there is a significant improvement in the therapeutic index in the treatment of psoriasis with the MtxNLC-incorporated gel base developed in this investigation over the plain drug gel currently available on the market.
Keywords: methotrexate, psoriasis, NLCs, hepatotoxic effects
Procedia PDF Downloads 430
2577 Challenges in Video Based Object Detection in Maritime Scenario Using Computer Vision
Authors: Dilip K. Prasad, C. Krishna Prasath, Deepu Rajan, Lily Rachmawati, Eshan Rajabally, Chai Quek
Abstract:
This paper discusses the technical challenges in maritime image processing and machine vision problems for video streams generated by cameras. Even well-documented problems such as horizon detection and registration of frames in a video are very challenging in maritime scenarios, and the more advanced problems of background subtraction and object detection in video streams are even more so. Challenges arising from the dynamic nature of the background, the unavailability of static cues, the presence of small objects against distant backgrounds, and illumination effects all contribute to the difficulties discussed here.
Keywords: autonomous maritime vehicle, object detection, situation awareness, tracking
Procedia PDF Downloads 458
2576 Study of Error Analysis and Sources of Uncertainty in the Measurement of Residual Stresses by the X-Ray Diffraction
Authors: E. T. Carvalho Filho, J. T. N. Medeiros, L. G. Martinez
Abstract:
Residual stresses are self-equilibrating stresses in a rigid body that act on the microstructure of the material without the application of an external load. They are elastic stresses and can be induced by mechanical, thermal, and chemical processes, causing a deformation gradient in the crystal lattice and favoring premature failure in mechanical components. The search for measurements with good reliability has been of great importance for the manufacturing industries. Several methods are able to quantify these stresses according to physical principles and the mechanical response of the material. The X-ray diffraction technique is one of the most sensitive techniques to small variations of the crystal lattice, since the X-ray beam interacts with the interplanar distance. Being very sensitive, the technique is also susceptible to variations in the measurements, requiring a study of the factors that influence the final result. Instrumental and operational factors, form deviations of the samples, and the analysis geometry are some variables that need to be considered and analyzed in order to obtain a true measurement. The aim of this work is to analyze the sources of error inherent to the residual stress measurement process by the X-ray diffraction technique, making an interlaboratory comparison to verify the reproducibility of the measurements. In this work, two specimens were machined, differing from each other by the surface finishing: grinding and polishing. Additionally, iron powder with a particle size of less than 45 µm was selected as a reference for the tests (as recommended by the ASTM E915 standard). To verify the deviations caused by the equipment, the specimens were positioned and, under the same analysis conditions, seven measurements were carried out at 11 Ψ tilts. To verify sample positioning errors, seven measurements were performed, repositioning the sample for each measurement. To check geometry errors, the measurements were repeated for the Bragg-Brentano and parallel-beam geometries. In order to verify the reproducibility of the method, the measurements were performed in two different laboratories and with different equipment. The results were statistically analyzed and the errors quantified.
Keywords: residual stress, x-ray diffraction, repeatability, reproducibility, error analysis
Procedia PDF Downloads 181
2575 Practical Guide To Design Dynamic Block-Type Shallow Foundation Supporting Vibrating Machine
Authors: Dodi Ikhsanshaleh
Abstract:
When subjected to a dynamic load, a foundation oscillates in a way that depends on the soil behaviour, the geometry and inertia of the foundation, and the dynamic excitation. A practical guideline for the analysis of block-type foundations excited by dynamic loads from vibrating machines is presented. The analysis uses the lumped mass parameter method to express dynamic properties such as the stiffness and damping of the soil. Numerical examples are performed on the design of a block-type foundation supporting a gas turbine compressor, which is an important equipment package in a gas processing plant.
Keywords: block foundation, dynamic load, lumped mass parameter
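In the lumped mass parameter idealization, each vibration mode of the machine-foundation-soil system reduces to a single-degree-of-freedom mass-spring-damper. The sketch below evaluates the vertical steady-state response under a harmonic machine force with standard SDOF formulas; all numerical values (mass, soil stiffness, damping, force, operating speed) are invented for illustration and are not taken from the paper.

```python
import math

# Illustrative lumped parameters for the vertical mode (the soil stiffness and damping
# would normally come from elastic half-space formulas for the actual block and soil).
m = 250e3        # vibrating mass: machine + block foundation, kg
k = 2.0e9        # equivalent vertical soil spring stiffness, N/m
c = 2.5e7        # equivalent vertical damping coefficient, N*s/m
f0 = 50e3        # amplitude of the harmonic unbalanced machine force, N
f_op = 25.0      # machine operating frequency, Hz

omega_n = math.sqrt(k / m)                  # natural circular frequency, rad/s
zeta = c / (2.0 * math.sqrt(k * m))         # damping ratio
r = (2.0 * math.pi * f_op) / omega_n        # frequency ratio

# Steady-state displacement amplitude of the SDOF system under harmonic excitation.
x_static = f0 / k
amplitude = x_static / math.sqrt((1.0 - r**2) ** 2 + (2.0 * zeta * r) ** 2)

print(f"natural frequency  : {omega_n / (2 * math.pi):.1f} Hz")
print(f"damping ratio      : {zeta:.2f}")
print(f"frequency ratio    : {r:.2f}")
print(f"vibration amplitude: {amplitude * 1e6:.1f} micron")
```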
Procedia PDF Downloads 490
2574 An Eigen-Approach for Estimating the Direction-of-Arrival of Unknown Number of Signals
Authors: Dia I. Abu-Al-Nadi, M. J. Mismar, T. H. Ismail
Abstract:
A technique for estimating the direction-of-arrival (DOA) of an unknown number of source signals is presented using the eigen-approach. The eigenvector corresponding to the minimum eigenvalue of the autocorrelation matrix yields the minimum output power of the array. Also, the array polynomial formed with this eigenvector possesses roots on the unit circle. Therefore, the pseudo-spectrum is found by perturbing the phases of the roots one by one and calculating the corresponding array output power. The results indicate that the DOAs and the number of source signals are estimated accurately over a wide range of input noise levels.
Keywords: array signal processing, direction-of-arrival, antenna arrays, eigenvalues, eigenvectors, Lagrange multiplier
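A simplified numerical illustration of the eigen-approach is given below: it forms the sample autocorrelation matrix for a uniform linear array, takes the eigenvector of the minimum eigenvalue, and scans a pseudo-spectrum over a grid of candidate angles instead of perturbing the polynomial roots as the paper does; the array geometry, source angles, and noise level are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n_snap, d = 8, 2000, 0.5            # sensors, snapshots, spacing in wavelengths
true_doas = np.deg2rad([-20.0, 35.0])  # two sources (their number is not given to the estimator)

def steering(theta):
    return np.exp(-2j * np.pi * d * np.arange(m) * np.sin(theta))

# Simulate two uncorrelated sources plus white noise at the array.
a = np.stack([steering(t) for t in true_doas], axis=1)
s = (rng.normal(size=(2, n_snap)) + 1j * rng.normal(size=(2, n_snap))) / np.sqrt(2)
noise = 0.1 * (rng.normal(size=(m, n_snap)) + 1j * rng.normal(size=(m, n_snap))) / np.sqrt(2)
x = a @ s + noise

r = x @ x.conj().T / n_snap            # sample autocorrelation matrix
eigvals, eigvecs = np.linalg.eigh(r)   # eigenvalues in ascending order
v_min = eigvecs[:, 0]                  # eigenvector of the minimum eigenvalue

# Steering vectors of the true sources are (nearly) orthogonal to v_min, so the
# pseudo-spectrum 1 / |a(theta)^H v_min|^2 peaks near the true DOAs.
grid = np.deg2rad(np.linspace(-90.0, 90.0, 1801))
proj = np.array([abs(steering(t).conj() @ v_min) for t in grid])
print("grid angle with largest pseudo-spectrum:",
      round(float(np.rad2deg(grid[np.argmin(proj)])), 1), "deg")
print(f"median |a^H v_min| over grid: {np.median(proj):.3f}")
for doa in true_doas:
    print(f"|a^H v_min| at {np.rad2deg(doa):+.0f} deg: {abs(steering(doa).conj() @ v_min):.4f}")
```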
Procedia PDF Downloads 334
2573 A Case Study of Limited Dynamic Voltage Frequency Scaling in Low-Power Processors
Authors: Hwan Su Jung, Ahn Jun Gil, Jong Tae Kim
Abstract:
Power management techniques are necessary to save power in microprocessors. By changing the frequency and/or operating voltage of a processor, dynamic voltage and frequency scaling (DVFS) can control power consumption. In this paper, we perform a case study to find the optimal power state transition for DVFS. We propose an equation to find the optimal ratio between the execution times in each state while taking into account the processing-time deadline and the power state transition delay overhead. The experiment is performed on the Cortex-M4 processor, and an average power saving of 6.5% is observed when DVFS is applied under the deadline condition.
Keywords: deadline, dynamic voltage frequency scaling, power state transition
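As a simplified illustration of the kind of trade-off such an equation captures, the sketch below splits a fixed cycle budget between a low-power state and a high-performance state so that the deadline is met, including the transition delay, while minimizing energy; the frequencies, power figures, transition delay, and workload are invented numbers, not the paper's Cortex-M4 measurements.

```python
def split_two_states(work_cycles, deadline, f_low, p_low, f_high, p_high, t_switch):
    """Time in (low, high) state that meets the deadline with minimum energy,
    assuming the low state is the more energy-efficient per cycle."""
    # Case 1: everything fits in the low-power state, no transition needed.
    if work_cycles / f_low <= deadline:
        t_low = work_cycles / f_low
        return t_low, 0.0, p_low * t_low
    # Case 2: split the time budget; one state transition is paid out of the deadline.
    budget = deadline - t_switch
    t_high = (work_cycles - f_low * budget) / (f_high - f_low)
    t_high = min(max(t_high, 0.0), budget)          # clamp to the feasible range
    t_low = budget - t_high
    if t_low * f_low + t_high * f_high + 1e-9 < work_cycles:
        raise ValueError("deadline cannot be met even at the high frequency")
    energy = p_low * t_low + p_high * t_high
    return t_low, t_high, energy

# Invented example numbers (not the paper's measurements).
t_low, t_high, energy = split_two_states(
    work_cycles=120e6, deadline=1.0,
    f_low=80e6, p_low=12e-3,            # 80 MHz at 12 mW
    f_high=168e6, p_high=40e-3,         # 168 MHz at 40 mW
    t_switch=0.5e-3)
print(f"run {t_low*1e3:.1f} ms low + {t_high*1e3:.1f} ms high, energy = {energy*1e3:.2f} mJ")
```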
Procedia PDF Downloads 456
2572 Fabrication of ZnO Nanorods Based Biosensor via Hydrothermal Method
Authors: Muhammad Tariq, Jafar Khan Kasi, Samiullah, Ajab Khan Kasi
Abstract:
Biosensors are playing a vital role in industrial, clinical, and chemical analysis applications. Among other techniques, ZnO-based biosensors are an easy approach due to the exceptional chemical and electrical properties of ZnO. ZnO nanorods have a high isoelectric point and a positively charged surface at physiological pH, which helps immobilize the negatively charged glucose oxidase (GOx). Here, we report ZnO nanorod-based biosensors for the immobilization of GOx. The ZnO nanorods were grown by the hydrothermal method on an indium tin oxide (ITO) substrate. The fabrication of the biosensors was carried out through batch processing using conventional photolithography. The GOx buffer solutions were prepared in phosphate at a pH value of around 7.3. The biosensors effectively immobilized the GOx, and the result was analyzed by calculating the voltage and current on the nanostructures.
Keywords: hydrothermal growth, sol-gel, zinc oxide, biosensors
Procedia PDF Downloads 301
2571 Thermal Stability and Electrical Conductivity of Ca₅Mg₄₋ₓMₓ(VO₄)₆ (0 ≤ x ≤ 4) where M = Zn, Ni Measured by Impedance Spectroscopy
Authors: Anna S. Tolkacheva, Sergey N. Shkerin, Kirill G. Zemlyanoi, Olga G. Reznitskikh, Pavel D. Khavlyuk
Abstract:
Calcium oxovanadates with a garnet-related structure are multifunctional oxides used in various fields such as photoluminescence, microwave dielectrics, and magneto-dielectrics. For example, vanadate garnets are self-luminescent compounds. They attract attention as RE-free broadband excitation and emission phosphors and are candidate materials for UV-based white light-emitting diodes (WLEDs). Ca₅M₄(VO₄)₆ (M = Mg, Zn, Co, Ni, Mn) compounds are also considered promising for application in microwave devices as substrate materials. However, the relation between their structure, composition, and physical/chemical properties remains unclear. Given the above observations, the goals of this study are to synthesize Ca₅M₄(VO₄)₆ (M = Mg, Zn, Ni) and to study their thermal and electrical properties. Solid solutions Ca₅Mg₄₋ₓMₓ(VO₄)₆ (0 ≤ x ≤ 4), where M is Zn or Ni, have been synthesized by the sol-gel method. The single-phase character of the final products was checked by powder X-ray diffraction on a Rigaku D/MAX-2200 X-ray diffractometer using Cu Kα radiation in the 2θ range from 15° to 70°. The dependence of the thermal properties on the chemical composition of the solid solutions was studied using simultaneous thermal analyses (DSC and TG). The thermal analyses were conducted in a Netzsch STA 449C Jupiter simultaneous analyser, in an Ar atmosphere, in the temperature range from 25 to 1100 °C; the heating rate was 10 K·min⁻¹. Coefficients of thermal expansion (CTE) were obtained by dilatometry measurements in air up to 800 °C using a Netzsch 402PC dilatometer; the heating rate was 1 K·min⁻¹. Impedance spectra were obtained via the two-probe technique with a Parstat 2273 impedance meter in air up to 700 °C, with pH₂O varied from 0.04 to 3.35 kPa. Cation deficiency in the Ca and Mg sublattices under the substitution of MgO with ZnO up to 1/6 was observed using Rietveld refinement of the crystal structure. The melting point was found to decrease as x changes from 0 to 4 in Ca₅Mg₄₋ₓMₓ(VO₄)₆, where M is Zn or Ni. It was observed that the electrical conductivity does not depend on the air humidity. The reported study was funded by RFBR Grant No. 17–03–01280. Sample attestation was carried out in the Shared Access Centers at the IHTE UB RAS.
Keywords: garnet structure, electrical conductivity, thermal expansion, thermal properties
Procedia PDF Downloads 155
2570 Application of Optical Method Based on Laser Device as Non-Destructive Testing for Calculation of Mechanical Deformation
Authors: R. Daïra, V. Chalvidan
Abstract:
We present the speckle interferometry method for determining the deformation of a workpiece. This holographic imaging method uses a CCD camera for the simultaneous digital recording of two states, object and reference, and the reconstruction is obtained numerically. This method has the advantage of being simpler than the methods currently available, and it does not suffer from the faults of an online holographic configuration. Furthermore, it is entirely digital and avoids heavy analysis after recording the hologram. This work was carried out in the HOLO 3 laboratory (an optical metrology laboratory in Saint-Louis, France) and consists of qualitatively and quantitatively controlling the deformation of an object by using a CCD camera connected to a computer equipped with fringe analysis software.
Keywords: speckle, nondestructive testing, interferometry, image processing
Procedia PDF Downloads 497
2569 Joint Simulation and Estimation for Geometallurgical Modeling of Crushing Consumption Energy in the Mineral Processing Plants
Authors: Farzaneh Khorram, Xavier Emery
Abstract:
This paper aims to create a crushing consumption energy (CCE) block model and to determine the blocks with the potential for the maximum grinding process energy consumption in the study area. For this purpose, joint estimation (co-kriging) and joint simulation (the turning bands and plurigaussian methods) were used to predict the CCE based on its correlation with the SAG power index (SPI), A×B, and the ball mill Bond work index (BWI). The analysis shows that TBCOSIM and the plurigaussian method give more realistic results than cokriging. This seems logical due to the geometallurgical nature of the data, the linearity of the kriging method, and the smoothing effect of kriging.
Keywords: plurigaussian, turning band, cokriging, geometallurgy
Procedia PDF Downloads 70
2568 Interactive, Topic-Oriented Search Support by a Centroid-Based Text Categorisation
Authors: Mario Kubek, Herwig Unger
Abstract:
Centroid terms are single words that semantically and topically characterise text documents and so may serve as their very compact representation in automatic text processing. In the present paper, centroids are used to measure the relevance of text documents with respect to a given search query. Thus, a new graph-based paradigm for searching texts in large corpora is proposed and evaluated against keyword-based methods. The first promising experimental results demonstrate the usefulness of the centroid-based search procedure. It is shown that the routing of search queries in interactive and decentralised search systems, in particular, can be greatly improved by applying this approach. A detailed discussion of further fields of application completes this contribution.
Keywords: search algorithm, centroid, query, keyword, co-occurrence, categorisation
Procedia PDF Downloads 282
2567 Influence of Chemical Treatment on Elastic Properties of the Band Cotton Crepe 100%
Authors: Bachir Chemani, Rachid Halfaoui, Madani Maalem
Abstract:
The manufacturing technology of the cotton band is very delicate and depends on the choice of certain parameters, such as the torsion of the warp yarn. The fabric's elasticity is achieved without the use of any elastic material or chemical, artificial, or synthetic expansion, and it is capable of creating pressures useful for therapeutic treatments. Before use, the band is subjected to specific preparation treatments to obtain a certain elasticity; however, during this treatment, some parameters regress. The dependence of the manufacturing parameters on the quality of the chemical treatment was confirmed. The aim of this work is to improve the properties of the fabric through the appropriate development of the manufacturing technology. Finally, a treatment method is recommended for the 100% cotton crepe band.
Keywords: elastic, cotton, processing, torsion
Procedia PDF Downloads 387
2566 Effect of Carbon Nanotubes on Nanocomposite from Nanofibrillated Cellulose
Authors: M. Z. Shazana, R. Rosazley, M. A. Izzati, A. W. Fareezal, I. Rushdan, A. B. Suriani, S. Zakaria
Abstract:
There is increasing interest in the development of flexible energy storage applications of carbon nanotubes and nanofibrillated cellulose (NFC). In this study, the nanocomposite consists of carbon nanotubes (CNT) mixed with a suspension of nanofibrillated cellulose (NFC) from oil palm empty fruit bunch (OPEFB). The use of CNT as an additive improved the conductivity and mechanical properties of the nanocomposite made from nanofibrillated cellulose (NFC). The nanocomposites were characterized for electrical conductivity and for mechanical properties in uniaxial tension, the tensile tests being used to measure the bonding of the fibers in the nanocomposite. The processing route is environmentally friendly, leads to well-mixed structures, and gives good results as well.
Keywords: carbon nanotube (CNT), nanofibrillated cellulose (NFC), mechanical properties, electrical conductivity
Procedia PDF Downloads 334
2565 Slow Pyrolysis of Bio-Wastes: Environmental, Exergetic, and Energetic (3E) Assessment
Authors: Daniela Zalazar-Garcia, Erick Torres, German Mazza
Abstract:
Slow pyrolysis of a pellet of pistachio waste was studied using a lab-scale stainless-steel reactor. Experiments were conducted at different heating rates (5, 10, and 15 K/min). A 3-E (environmental, exergetic, and energetic) analysis for the processing of 20 kg/h of bio-waste was carried out. Experimental results showed that biochar and gas yields decreased with an increase in the heating rate (43 to 36% and 28 to 24%, respectively), while the bio-oil yield increased (29 to 40%). Finally, from the 3-E analysis and the experimental results, it can be suggested that an increase in the heating rate resulted in a higher pyrolysis exergetic efficiency (70%) due to an increase of the bio-oil yield with high energy content.
Keywords: 3E assessment, bio-waste pellet, life cycle assessment, slow pyrolysis
Procedia PDF Downloads 221
2564 Synergistic Anti-Proliferation Effect of PLK-1 Inhibitor and Livistona Chinensis Fruit Extracts on Lung Adenocarcinoma A549 Cells
Authors: Min-Chien Su, Tzu-Hsuan Hsu, Guan-Xuan Wu, Shyh-Ming Kuo
Abstract:
Lung cancer is one of the most clinically challenging malignant diseases worldwide. For efficient cancer therapeutics, combination therapy has been developed to achieve better outcomes. PLK-1 is one of the major factors affecting mitosis in cancer cells, and its inhibitor Bi6727 has been proven effective in treating several different cancers, namely oral cancer, colon cancer, and lung cancer. Despite its low toxicity toward normal cells compared to traditional chemotherapy, it has yet to be evaluated in detail. Livistona Chinensis (LC) is a Chinese herb used as a traditional prescription to treat lung cancer. Due to the uncertainty about the efficacy of LC, we utilized a water extraction method to extract Livistona Chinensis and then lyophilized the extract into a powder for further study. In this study, we investigated the anti-proliferation activities of Bi6727 and LC extract (LCE) on A549 non-small cell lung cancer cells. The IC50 values of Bi6727 and LCE on A549 were 60 nM and 0.8 mg/mL, respectively. The fluorescent staining images showed nucleolus damage in cells treated with Bi6727 and mitochondrial damage in cells treated with LCE. A549 cells treated with Bi6727 and LCE showed increased expression of Bax, Caspase-3, and Caspase-9 proteins in the Western blot assay. LCE also inhibited A549 cell growth, keeping cells at the G2-M phase in the cell cycle assay. The apoptosis assay results showed that LCE induced late apoptosis of A549 cells, and the JC-1 assay showed that the mitochondria were damaged at an LCE concentration of 0.4 mg/mL. In our preliminary anti-proliferation test of combined LCE and Bi6727 on A549 cells, we found a dramatic decrease in proliferation after treating the cells with LCE first for 24 h and then with Bi6727 for an extra 24 h. This is an important finding regarding the synergistic anti-proliferation effect of these drugs; however, the usage, the application sequence of LCE and Bi6727 on A549 cells, and their related mechanisms still need to be evaluated. In summary, the drugs exerted an anti-proliferation effect on A549 cells independently. We hope that combining the usage of these two drugs will bring a different and promising outcome in treating lung cancer.
Keywords: anti-proliferation, A549, Livistona Chinensis fruit extracts, PLK-1 inhibitor
Procedia PDF Downloads 141
2563 Knowledge Graph Development to Connect Earth Metadata and Standard English Queries
Authors: Gabriel Montague, Max Vilgalys, Catherine H. Crawford, Jorge Ortiz, Dava Newman
Abstract:
There has never been so much publicly accessible atmospheric and environmental data. The possibilities of these data are exciting, but the sheer volume of available datasets represents a new challenge for researchers. The task of identifying and working with a new dataset has become more difficult with the amount and variety of available data. Datasets are often documented in ways that differ substantially from the common English used to describe the same topics. This presents a barrier not only for new scientists, but also for researchers looking for comparisons across multiple datasets and for specialists from other disciplines hoping to collaborate. This paper proposes a method for addressing this obstacle: creating a knowledge graph to bridge the gap between everyday English language and the technical language surrounding these datasets. Knowledge graph generation is already a well-established field, although working with Earth data poses some unique challenges. One is the sheer size of the databases: it would be infeasible to replicate or analyze all the data stored by an organization like the National Aeronautics and Space Administration (NASA) or the European Space Agency. Instead, this approach identifies topics from the metadata available for datasets in NASA's Earthdata database, which can then be used to directly request and access the raw data from NASA. By starting with a single metadata standard, this paper establishes an approach that can be generalized to different databases, but leaves the challenge of metadata harmonization for future work. Topics generated from the metadata are then linked to topics from a collection of English queries through a variety of standard and custom natural language processing (NLP) methods. The results from this method are then compared to a baseline of elastic search applied to the metadata. This comparison shows the benefits of the proposed knowledge graph system over existing methods, particularly in interpreting natural language queries and interpreting topics in metadata. For the research community, this work introduces an application of NLP to the ecological and environmental sciences, expanding the possibilities of how machine learning can be applied in this discipline. Perhaps more importantly, it establishes the foundation for a platform that can enable common English to access knowledge that previously required considerable effort and experience. By making these public data accessible to the broader public, this work has the potential to transform environmental understanding, engagement, and action.
Keywords: earth metadata, knowledge graphs, natural language processing, question-answer systems
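A toy sketch of the linking step, mapping everyday English query words to dataset metadata topics through graph edges and ranking datasets by topic overlap, is given below; the vocabulary, edges, and dataset tags are entirely made up for illustration and bear no relation to the actual Earthdata metadata or to the authors' NLP pipeline.

```python
# Toy "knowledge graph": everyday English terms linked to metadata topic labels.
edges = {
    "rain": {"precipitation_rate", "precipitation"},
    "temperature": {"air_temperature", "sea_surface_temperature"},
    "ice": {"sea_ice_concentration"},
    "wind": {"wind_speed"},
    "ocean": {"sea_surface_temperature", "sea_ice_concentration"},
}

# Toy dataset records with metadata topic tags (stand-ins for real metadata records).
datasets = {
    "DS-A": {"precipitation_rate", "air_temperature"},
    "DS-B": {"sea_surface_temperature", "sea_ice_concentration"},
    "DS-C": {"wind_speed", "precipitation"},
}

def search(query):
    """Expand plain-English query words through the graph, then rank datasets by topic overlap."""
    expanded = set()
    for word in query.lower().split():
        expanded |= edges.get(word, set())
    scores = {name: len(expanded & tags) for name, tags in datasets.items()}
    return sorted((score, name) for name, score in scores.items() if score > 0)[::-1]

print(search("rain and temperature over the ocean"))
# -> DS-A and DS-B score ahead of DS-C for this query
```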
Procedia PDF Downloads 148
2562 Landslide Hazard Zonation Using Satellite Remote Sensing and GIS Technology
Authors: Ankit Tyagi, Reet Kamal Tiwari, Naveen James
Abstract:
Landslides are the major geo-environmental problem of the Himalaya because of its high ridges, steep slopes, deep valleys, and complex system of streams. They are mainly triggered by rainfall and earthquakes and cause severe damage to life and property. In Uttarakhand, the Tehri reservoir rim area, which is situated in the Lesser Himalaya of the Garhwal hills, was selected for landslide hazard zonation (LHZ). The study utilized different types of data, including geological maps, topographic maps from the Survey of India, Landsat 8 imagery, and Cartosat DEM data. This paper presents the use of a weighted overlay method for LHZ using fourteen causative factors. The data layers generated and co-registered were slope, aspect, relative relief, soil cover, intensity of rainfall, seismic ground shaking, seismic amplification at surface level, lithology, land use/land cover (LULC), normalized difference vegetation index (NDVI), topographic wetness index (TWI), stream power index (SPI), drainage buffer, and reservoir buffer. Seismic analysis is performed using peak horizontal acceleration (PHA) intensity and amplification factors in the evaluation of the landslide hazard index (LHI). Several digital image processing techniques, such as topographic correction, NDVI, and supervised classification, were widely used in the terrain factor extraction. Lithological features, LULC, drainage pattern, lineaments, and structural features are extracted using digital image processing techniques. Colour, tone, topography, and stream drainage patterns from the imagery are used to analyse geological features. The slope, aspect, and relative relief maps are created using the Cartosat DEM data. The DEM data are also used for the detailed drainage analysis, which includes TWI, SPI, drainage buffer, and reservoir buffer. In the weighted overlay method, the comparative importance of the causative factors is obtained from experience. In this method, the rating of each class is multiplied by the influence factor of the corresponding layer, the result is reclassified, and the LHZ map is prepared, as sketched in the example below. Further, based on the land-use map developed from remote sensing images, a landslide vulnerability study for the study area is carried out and presented in this paper.
Keywords: weighted overlay method, GIS, landslide hazard zonation, remote sensing
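A minimal numerical sketch of the weighted overlay computation referenced above is given below: each factor raster holds class ratings, each factor carries an influence weight, and the weighted sum is reclassified into hazard classes. The three toy factors, their weights, and the class breaks are illustrative only; the actual study combined fourteen factors with weights assigned from experience.

```python
import numpy as np

# Toy 4x4 rasters of class ratings (1 = low susceptibility ... 5 = high) for three factors;
# the real study combined fourteen such layers.
slope     = np.array([[1, 2, 3, 4], [2, 3, 4, 5], [1, 1, 2, 3], [3, 4, 5, 5]])
rainfall  = np.array([[2, 2, 3, 3], [3, 3, 4, 4], [1, 2, 2, 3], [2, 3, 4, 5]])
lithology = np.array([[1, 1, 2, 2], [2, 2, 3, 3], [1, 1, 1, 2], [2, 2, 3, 4]])

# Influence weights (fraction of total importance, assigned from experience); they sum to 1.
layers = [(slope, 0.45), (rainfall, 0.35), (lithology, 0.20)]

# Landslide hazard index: weighted sum of rating x influence for every cell.
lhi = sum(weight * raster for raster, weight in layers)

# Reclassify the index into hazard zones (class breaks chosen for the toy example).
breaks = [2.0, 3.0, 4.0]                      # low / moderate / high / very high
lhz = np.digitize(lhi, breaks)

print("landslide hazard index:\n", np.round(lhi, 2))
print("hazard zone (0=low ... 3=very high):\n", lhz)
```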
Procedia PDF Downloads 133