Search results for: extraction tool
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6727

5737 Cardiovascular Modeling Software Tools in Medicine

Authors: J. Fernandez, R. Fernandez de Canete, J. Perea-Paizal, J. C. Ramos-Diaz

Abstract:

The high prevalence of cardiovascular diseases has provoked a rising interest in the development of mathematical models to evaluate cardiovascular function under both physiological and pathological conditions. In this paper, a physical model of the cardiovascular system with intrinsic regulation is presented and implemented using the object-oriented Modelica simulation software tools. For this task, a multi-compartmental system previously validated with physiological data has been built, based on the interconnection of cardiovascular elements such as resistances, capacitances, and pumps, following an electrohydraulic analogy. The results obtained under both physiological and pathological scenarios provide an easy interpretative key to analyze the hemodynamic behavior of the patient. The described approach represents a valuable tool in the teaching of physiology for graduate medical and nursing students, among others.
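The electrohydraulic analogy described above can be illustrated with a minimal two-element Windkessel model, where arterial compliance plays the role of a capacitance and peripheral resistance that of a resistance. This Python sketch is not the authors' validated Modelica model; the parameter values and the half-sine inflow profile are illustrative assumptions.

```python
# Minimal two-element Windkessel sketch of the electrohydraulic analogy:
# compliance C acts as a capacitance, peripheral resistance R as a
# resistance. Parameter values are illustrative, not the paper's.
import math

def simulate_windkessel(R=1.0, C=1.5, dt=0.001, cycles=5, period=0.8):
    """Integrate C * dP/dt = Q_in(t) - P/R with forward Euler."""
    P = 80.0                      # initial arterial pressure (mmHg)
    history = []
    t = 0.0
    steps = round(cycles * period / dt)
    for _ in range(steps):
        phase = (t % period) / period
        # half-sine inflow during systole (first 40% of the cycle)
        Q_in = 400.0 * math.sin(math.pi * phase / 0.4) if phase < 0.4 else 0.0
        P += (Q_in - P / R) / C * dt
        history.append(P)
        t += dt
    return history

pressures = simulate_windkessel()
print(f"pressure range: {min(pressures):.1f} - {max(pressures):.1f} mmHg")
```

Pathological scenarios can then be mimicked by varying R or C, which is the interpretative key the abstract refers to.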

Keywords: cardiovascular system, MODELICA simulation software, physical modelling, teaching tool

Procedia PDF Downloads 296
5736 Effects of Different Mechanical Treatments on the Physical and Chemical Properties of Turmeric

Authors: Serpa A. M., Gómez Hoyos C., Velásquez-Cock J. A., Ruiz L. F., Vélez Acosta L. M., Gañan P., Zuluaga R.

Abstract:

Turmeric (Curcuma longa L.) is an Indian rhizome known for its biological properties, derived from active compounds such as curcuminoids. Curcumin, the main polyphenol in turmeric, represents only around 3.5% of the dehydrated rhizome, and extraction yields between 41 and 90% have been reported. Therefore, for every 1000 tons of turmeric powder used for the extraction of curcumin, around 970 tons of residues are generated. The present study evaluates the effect of different mechanical treatments (Waring blender, grinder, and high-pressure homogenization) on the physical and chemical properties of turmeric, as an alternative for the transformation of the entire rhizome. Suspensions of turmeric (10, 20, and 30%) were processed in a Waring blender for 3 min at 12000 rpm, while the samples treated by grinder were processed at two different gaps (-1 and -1.5). Finally, high-pressure homogenization was carried out at 500 bar. According to the results, the luminosity of the samples increases with the severity of the mechanical treatment, due to the stabilization of the color associated with the inactivation of oxidative enzymes. Additionally, according to the microstructure of the samples, the grinder process (gap -1.5) and high-pressure homogenization allowed the largest size reduction, reaching sizes down to 3 μm (measured by optical microscopy). These processes disrupt the cells and break their fragments into small suspended particles. The infrared spectra obtained from the samples using an attenuated total reflectance accessory indicate changes in the 800-1200 cm⁻¹ region, related mainly to changes in the starch structure. Finally, thermogravimetric analysis shows the presence of starch, curcumin, and some minerals in the suspensions.

Keywords: characterization, mechanical treatments, suspensions, turmeric rhizome

Procedia PDF Downloads 159
5735 Railway Transport as a Potential Source of Polychlorinated Biphenyls in Soil

Authors: Nataša Stojić, Mira Pucarević, Nebojša Ralević, Vojislava Bursić, Gordan Stojić

Abstract:

Surface soil (0-10 cm) samples from 52 sampling sites along the railway tracks on the territory of Srem (the western part of the Autonomous Province of Vojvodina, itself part of Serbia) were collected and analyzed for 7 polychlorinated biphenyls (PCBs) in order to see how the distance from the railroad on the one hand, and from landfills on the other, affects the concentration of PCBs (CPCB) in the soil. Samples were taken at distances of 0.03 to 4.19 km from the railway and 0.43 to 3.35 km from the landfills. Soxhlet extraction (USEPA 3540S) was used for soil extraction. The extracts were purified on a silica-gel column (USEPA 3630C) and analyzed by gas chromatography with tandem mass spectrometry. PCBs were undetected at only two locations. The mean total concentration of PCBs for all other sampling locations was 0.0043 ppm dry weight (dw), with a range of 0.0005 to 0.0227 ppm dw. Principal component analysis (PCA) was applied to the relevant part of the data to isolate the factors that affect the concentration of PCBs. The data were also analyzed using Pearson's chi-squared test, which showed that the hypothesis of independence between CPCB and the distance from the railway can be rejected. The hypothesis of independence between CPCB and the percentage of humus in the soil can also be rejected, in contrast to CPCB and the distance from the landfill, where the hypothesis of independence cannot be rejected. Based on these results, it can be said that railway transport is a potential source of PCBs. The next step in this research is to establish the positions of transformers located near the sampling sites, as another important factor affecting the concentration of PCBs in the soil.
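The independence test used above can be sketched as follows. This is a generic Pearson chi-squared computation on a hypothetical 2x2 table (PCB level vs. distance from the railway); the counts are invented for illustration and are not the study's data.

```python
# Sketch of Pearson's chi-squared test of independence, as used in the
# study; the 2x2 contingency table below is illustrative, not real data.

def chi_squared(table):
    """Return the chi-squared statistic for a 2D contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# rows: near / far from railway; columns: CPCB above / below the median
observed = [[20, 5],
            [8, 19]]
stat = chi_squared(observed)
# critical value for df = 1, alpha = 0.05 is 3.841
print(f"chi-squared = {stat:.2f}, reject independence: {stat > 3.841}")
```

With these illustrative counts the statistic exceeds the critical value, mirroring the paper's conclusion that CPCB and railway distance are not independent.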

Keywords: GC/MS, landfill, PCB, railway, soil

Procedia PDF Downloads 328
5734 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules

Authors: Mohsen Maraoui

Abstract:

With the aim of improving natural language processing applications such as information retrieval, machine translation, and lexical disambiguation, we focus on a statistical approach to semantic indexing for multilingual text documents based on conceptual network formalism. We propose to use this formalism as an indexing language to represent the descriptive concepts and their weighting; these concepts represent the content of the document. Our contribution is based on two steps. In the first step, we propose the extraction of index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on the association rules model (in an attempt to discover the non-taxonomic, or contextual, relations between the concepts of a document). These relations are latent relations buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. Our proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to the language through a multilingual thesaurus. Next, we apply the same statistical process regardless of the language in order to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.
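The association-rule step over weighted concepts can be sketched in a few lines. This is a generic fuzzy support/confidence formulation (min for the intersection, mean over documents), not necessarily the exact one used by the author; the documents and concept weights are made-up examples.

```python
# Sketch of mining a contextual relation between concepts via fuzzy
# association rules; one common formulation, with invented example data.

# each document maps concepts to fuzzy membership degrees in [0, 1]
docs = [
    {"bank": 0.9, "finance": 0.8, "loan": 0.6},
    {"bank": 0.7, "river": 0.9},
    {"bank": 0.8, "finance": 0.9, "interest": 0.5},
    {"finance": 0.6, "loan": 0.7},
]

def fuzzy_support(concepts, docs):
    """Mean over documents of the min membership of the concept set."""
    total = sum(min(d.get(c, 0.0) for c in concepts) for d in docs)
    return total / len(docs)

def fuzzy_confidence(antecedent, consequent, docs):
    """Confidence of the rule antecedent -> consequent."""
    denom = fuzzy_support(antecedent, docs)
    return fuzzy_support(antecedent + consequent, docs) / denom if denom else 0.0

conf = fuzzy_confidence(["finance"], ["bank"], docs)
print(f"finance -> bank confidence: {conf:.2f}")
```

Rules whose confidence passes a threshold become the non-taxonomic edges added to the conceptual network.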

Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing

Procedia PDF Downloads 135
5733 Application of Aquatic Plants for the Remediation of Organochlorine Pesticides from Keenjhar Lake

Authors: Soomal Hamza, Uzma Imran

Abstract:

Organochlorine pesticides bioaccumulate in the fat of fish, birds, and animals, through which they enter the human food cycle. Due to their persistence and stability in the environment, many health impacts are associated with them, most of which are carcinogenic in nature. In this study, the levels of organochlorine pesticides in Keenjhar Lake were detected and remediated using a rhizoremediation technique. 14 OC pesticides, namely Aldrin, Dieldrin, Heptachlor, Heptachlor epoxide, Endrin, Endosulfan I and II, DDT, DDE, DDD, and Alpha-, Beta-, Gamma- and Delta-BHC, and two plants, namely Water Hyacinth and Salvinia molesta, were used in the system in a pot experiment which ran for 11 days. A consortium was inoculated in both plants to increase their efficiency. Water samples were processed using liquid-liquid extraction. Sediment and root samples were processed using the Soxhlet method, followed by clean-up and gas chromatography. Delta-BHC was the congener predominantly found in all samples, with mean concentrations (ppb) and standard deviations of 0.02 ± 0.14, 0.52 ± 0.68, and 0.61 ± 0.06 in the water, sediment, and root samples, respectively. The highest levels were of Endosulfan II in the water, sediment, and root samples. Water Hyacinth proved to be a better bioaccumulator than Salvinia molesta. The pattern of compound reduction rates by the end of the experiment was Delta-BHC > DDD > Alpha-BHC > DDT > Heptachlor > Heptachlor epoxide > Dieldrin > Aldrin > Endrin > DDE > Endosulfan I > Endosulfan II. No significant difference was observed between the pots with and without the consortium addition. Phytoremediation is a promising technique, but more studies are required to assess the bioremediation potential of different aquatic plants and the plant-endophyte relationship.

Keywords: aquatic plant, bio remediation, gas chromatography, liquid liquid extraction

Procedia PDF Downloads 140
5732 The Study of Spray Drying Process for Skimmed Coconut Milk

Authors: Jaruwan Duangchuen, Siwalak Pathaveerat

Abstract:

Coconut (Cocos nucifera) belongs to the family Arecaceae. Coconut juice and meat are consumed as food and dessert in several regions of the world. Coconut juice contains low protein, with arginine as the main amino acid. Coconut meat is the endosperm of the coconut and has nutritional value; it is composed of carbohydrate, protein, and fat. The objective of this study is the utilization of by-products from the virgin coconut oil extraction process by converting the skimmed coconut milk into a powder. The skimmed coconut milk, separated from the coconut milk in the virgin coconut oil extraction process, consists of approximately 6.4% protein, 7.2% carbohydrate, 0.27% dietary fiber, 6.27% sugar, 3.6% fat, and 86.93% moisture. This skimmed coconut milk can be made into a powder as a value-added product by spray drying. The factors affecting the yield and properties of dry skimmed coconut milk in the spraying process are the inlet and outlet air temperatures and the maltodextrin concentration. Maltodextrin contents (15 and 20%), outlet air temperatures (80 ºC, 85 ºC, 90 ºC), and inlet air temperatures (190 ºC, 200 ºC, 210 ºC) were tested in the skimmed coconut milk spray drying process. The spray dryer air flow rate was kept at 0.2698 m³/s. Moisture content (2.22-3.23%), bulk density (0.4-0.67 g/mL), wettability (4.04-19.25 min), solubility in water, color, and particle size were analyzed for the powder samples. The maximum yield (18.00%) of spray-dried coconut milk powder was obtained at an inlet temperature of 210 °C, an outlet temperature of 80 °C, and 20% maltodextrin, with a drying time of 27.27 seconds. Amino acid analysis by HPLC (UV detector) showed that the most abundant amino acids are glutamine (16.28%), arginine (10.32%), and glycine (9.59%).

Keywords: skimmed coconut milk, spray drying, virgin coconut oil process (VCO), maltodextrin

Procedia PDF Downloads 325
5731 Application of Quality Function Deployment (QFD) Tool in Design of Aero Pumps Based on System Engineering

Authors: Z. Soleymani, M. Amirzadeh

Abstract:

Quality Function Deployment (QFD) was developed in the 1960s in Japan and introduced in 1983 in America and Europe. This paper presents a real application of this technique, considering the method of applying QFD in the design and production of aero fuel pumps. When designing a product under a systems engineering process, the first step is to identify customer needs and then translate them into engineering parameters. Since each design change after the production process begins leads to extra labor costs and an increased risk to product quality, QFD can benefit sales by meeting customer expectations. Once the needs are well identified, the use of the QFD tool leads to better communication and less deviation in the design and production phases, and finally to products with well-defined technical attributes.
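The core QFD step of translating weighted customer needs into engineering-parameter priorities is a weighted matrix product. The sketch below uses the common 1/3/9 relationship scoring; the needs, weights, and parameter names are illustrative, not taken from the pump project.

```python
# Sketch of the QFD relationship-matrix step: customer-need weights times
# relationship scores give technical importance ratings. Example values
# are invented (not the aero pump project's actual matrix).

customer_weights = [5, 3, 4]          # e.g. reliability, low noise, efficiency
# rows: customer needs; columns: engineering parameters
# (e.g. gear tolerance, bearing type, flow rate), scored 0/1/3/9
relationships = [
    [9, 3, 1],
    [3, 9, 0],
    [1, 0, 9],
]

def technical_importance(weights, matrix):
    """Weighted column sums: importance of each engineering parameter."""
    cols = len(matrix[0])
    return [sum(w * row[j] for w, row in zip(weights, matrix))
            for j in range(cols)]

scores = technical_importance(customer_weights, relationships)
print(scores)
```

The highest-scoring columns indicate which engineering parameters deserve the most design attention.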

Keywords: customer voice, engineering parameters, gear pump, QFD

Procedia PDF Downloads 244
5730 An Automatic Speech Recognition Tool for the Filipino Language Using the HTK System

Authors: John Lorenzo Bautista, Yoon-Joong Kim

Abstract:

This paper presents the development of a Filipino speech recognition tool using the HTK System. The system was trained on a subset of the Filipino Speech Corpus developed by the DSP Laboratory of the University of the Philippines-Diliman. The speech corpus was used both in training and in testing the system, by estimating the parameters for phonetic HMM-based (Hidden Markov Model) acoustic models. Experiments on different mixture weights were incorporated in the study. The phoneme-level word-based recognition of a 5-state HMM resulted in an average accuracy rate of 80.13% for a single-Gaussian mixture model, 81.13% after implementing a phoneme alignment, and 87.19% for the increased Gaussian-mixture-weight model. The highest accuracy rate of 88.70% was obtained from a 5-state model with 6 Gaussian mixtures.
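The accuracy rates quoted above follow the standard HTK-style metric, Accuracy = (N - D - S - I) / N, computed from an edit-distance alignment of reference and recognized words. The sketch below implements that generic metric; the example sentences are invented, not corpus material.

```python
# Sketch of the HTK-style word accuracy metric: percent of reference
# words minus deletions, substitutions, and insertions, found via a
# Levenshtein alignment. Example strings are invented.

def word_accuracy(reference, hypothesis):
    """Return HTK-style percent accuracy from an edit-distance alignment."""
    ref, hyp = reference.split(), hypothesis.split()
    n, m = len(ref), len(hyp)
    # dp[i][j] = minimal edit cost aligning ref[:i] to hyp[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        dp[i][0] = i              # deletions
    for j in range(m + 1):
        dp[0][j] = j              # insertions
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    errors = dp[n][m]             # D + S + I at minimal total cost
    return 100.0 * (n - errors) / n

acc = word_accuracy("kumusta ka na ngayon", "kumusta ka ngayon")
print(f"{acc:.2f}%")
```

Here one reference word is deleted by the recognizer, so accuracy is 75% on a 4-word reference.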

Keywords: Filipino language, Hidden Markov Model, HTK system, speech recognition

Procedia PDF Downloads 476
5729 A Questionnaire-Based Survey: Therapists' Response towards Upper Limb Disorder Learning Tool

Authors: Noor Ayuni Che Zakaria, Takashi Komeda, Cheng Yee Low, Kaoru Inoue, Fazah Akhtar Hanapiah

Abstract:

Previous studies have shown that there are arguments regarding the reliability and validity of the Ashworth and Modified Ashworth Scales in evaluating patients diagnosed with upper limb disorders: these evaluations depend on the raters' experience. This motivated us to develop an upper limb disorder part-task trainer that can simulate consistent upper limb disorder signs, such as spasticity and rigidity, based on the Modified Ashworth Scale, to reduce the variability occurring between raters and within raters themselves. By providing consistent signs, novice therapists would be able to increase training frequency and exposure to various levels of signs. A total of 22 physiotherapists and occupational therapists participated in the study. The majority of the therapists agreed that, with current therapy education, they still face problems with inter-rater and intra-rater variability (strongly agree 54%, n = 12/22; agree 27%, n = 6/22) in evaluating patients' conditions. The therapists strongly agreed (72%; n = 16/22) that therapy trainees need to increase their frequency of training, and therefore believe that our initiative to develop an upper limb disorder training tool will help improve the clinical education field (strongly agree and agree 63%; n = 14/22).

Keywords: upper limb disorder, clinical education tool, inter/intra-raters variability, spasticity, modified Ashworth scale

Procedia PDF Downloads 307
5728 The Implementation of a Nurse-Driven Palliative Care Trigger Tool

Authors: Sawyer Spurry

Abstract:

Problem: Palliative care providers at an academic medical center in Maryland stated that medical intensive care unit (MICU) patients are often referred late in their hospital stay. The MICU has performed well below the hospital quality performance metric that 80% of patients who expire with expected outcomes should have received a palliative care consult within 48 hours of admission. Purpose: The purpose of this quality improvement (QI) project is to increase palliative care utilization in the MICU through the implementation of a Nurse-Driven Palliative Trigger Tool to prompt the need for a specialty palliative care consult. Methods: MICU nursing staff and providers received education concerning the implications of underused palliative care services and the literature supporting the use of nurse-driven palliative care tools as a means of increasing utilization of palliative care. MICU population-specific criteria of palliative triggers (the Palliative Care Trigger Tool) were formulated by the QI implementation team, the palliative care team, and the patient care services department. Nursing staff were asked to assess patients daily for the presence of palliative triggers using the Palliative Care Trigger Tool and to present findings during bedside rounds. MICU providers were asked to consult palliative medicine, given the presence of palliative triggers, following interdisciplinary rounds. Rates of palliative consult, given the presence of triggers, were collected via an electronic medical record data pull, de-identified, and recorded in the data collection tool. Preliminary Results: Over 140 MICU registered nurses were educated on the palliative trigger initiative, along with 8 nurse practitioners, 4 intensivists, 2 pulmonary critical care fellows, and 2 palliative medicine physicians. Over 200 patients were admitted to the MICU and screened for palliative triggers during the 15-week implementation period. Primary outcomes showed an increase in palliative care consult rates for patients presenting with triggers, a decreased mean time from admission to palliative consult, and increased recognition of unmet palliative care needs by MICU nurses and providers. Conclusions: The anticipated findings of this QI project suggest a positive correlation between utilizing palliative care trigger criteria and decreased time to palliative care consult. The direct outcomes of effective palliative care are decreased length of stay, healthcare costs, and moral distress, as well as improved symptom management and quality of life (QOL).

Keywords: palliative care, nursing, quality improvement, trigger tool

Procedia PDF Downloads 187
5727 Code – Switching in a Flipped Classroom for Foreign Students

Authors: E. Tutova, Y. Ebzeeva, L. Gishkaeva, Y. Smirnova, N. Dubinina

Abstract:

We have been working with students from different countries and found it crucial to switch languages when explaining something. Whether it is Russian or Chinese, explaining in a different language plays an important role in students' cognitive processing. In this work we explore how code-switching may impact students' perception of information. Code-switching is defined by linguists as a switch from one language to another for convenience, for the explanation of terms unavailable in the initial language, or sometimes for prestige. In our case, we consider code-switching in its function of convenience. As a rule, students who come to study Russian in a language environment lack many skills in speaking the language. This makes it harder to explain to them the rules of another language, which is English. That is why switching between English, Russian, and Mandarin is crucial for their better understanding. In this work we explore code-switching as a tool which can help a teacher in a flipped classroom.

Keywords: bilingualism, psychological linguistics, code-switching, social linguistics

Procedia PDF Downloads 75
5726 Effect of Solvents in the Extraction and Stability of Anthocyanin from the Petals of Caesalpinia pulcherrima for Natural Dye-Sensitized Solar Cell

Authors: N. Prabavathy, R. Balasundaraprabhu, S. Shalini, Dhayalan Velauthapillai, S. Prasanna, N. Muthukumarasamy

Abstract:

Dye-sensitized solar cells (DSSC) have become a significant research area due to their fundamental and scientific importance in the area of energy conversion. Synthetic dyes as sensitizers in DSSC are efficient and durable, but they are costly, toxic, and have a tendency to degrade. Natural sensitizers contain plant pigments such as anthocyanins, carotenoids, flavonoids, and chlorophyll, which promote light absorption as well as injection of charges into the conduction band of TiO₂ through the sensitizer. However, the efficiency of natural dyes is not up to the mark, mainly due to the instability of pigments such as anthocyanin. The stability issues in vitro are mainly due to the effect of the solvents used for the extraction of anthocyanins and their respective pH. Taking this factor into consideration, in the present work, anthocyanins were extracted from the flower Caesalpinia pulcherrima (C. pulcherrima) with various solvents, and their respective stability and pH values are discussed. The use of citric acid as a solvent to extract anthocyanin has shown better stability than other solvents. It also helps in enhancing the sensitization properties of anthocyanins with titanium dioxide (TiO₂) nanorods. The IPCE spectra show higher photovoltaic performance for dye-sensitized TiO₂ nanorods using citric acid as the solvent. The natural DSSC using citric acid as the solvent shows a higher efficiency than those using other solvents. Hence, citric acid appears to be a safe solvent for natural DSSC, boosting photovoltaic performance while maintaining the stability of the anthocyanins.

Keywords: Caesalpinia pulcherrima, citric acid, dye sensitized solar cells, TiO₂ nanorods

Procedia PDF Downloads 283
5725 Numerical Analysis of NOₓ Emission in Staged Combustion for the Optimization of Once-Through-Steam-Generators

Authors: Adrien Chatel, Ehsan Askari Mahvelati, Laurent Fitschy

Abstract:

Once-Through-Steam-Generators (OTSG) are commonly used in the oil-sand industry in the heavy fuel oil extraction process. They are composed of three main parts: the burner and the radiant and convective sections. Natural gas is burned through staged diffusive flames stabilized by the burner. The heat generated by the combustion is transferred to the water flowing through the piping system in the radiant and convective sections. The steam produced within the pipes is then directed into the ground to reduce the oil viscosity and allow its pumping. With the rapid development of the oil-sand industry, the number of OTSG in operation has increased, as have the associated emissions of environmental pollutants, especially the nitrogen oxides (NOₓ). To limit environmental degradation, various international environmental agencies have established regulations on pollutant discharge and pushed to reduce NOₓ release. To meet these constraints, OTSG constructors have to rely on increasingly advanced tools to study and predict NOₓ emission. With the increase in computational resources, Computational Fluid Dynamics (CFD) has emerged as a flexible tool to analyze the combustion and pollutant formation process. Moreover, to optimize the burner operating condition with regard to NOₓ emission, field characterization and measurements are usually carried out. However, such experimental campaigns are particularly time-consuming and sometimes even impossible for industrial plants with strict operation schedule constraints. Therefore, the application of CFD seems more adequate for providing guidelines on the NOₓ emission and reduction problem. In the present work, two different software packages are employed to simulate the combustion process in an OTSG, namely the commercial software ANSYS Fluent and the open-source software OpenFOAM. RANS (Reynolds-Averaged Navier-Stokes) equations, combined with the Eddy Dissipation Concept to model the combustion and closed by the k-epsilon model, are solved. A mesh sensitivity analysis is performed to assess the independence of the solution from the mesh. In the first part, the results given by the two software packages are compared and confronted with experimental data as a means to assess the numerical modelling. Flame temperatures and chemical composition are used as reference fields to perform this validation. Results show a fair agreement between experimental and numerical data. In the last part, OpenFOAM is employed to simulate several operating conditions, and an Emission Characteristic Map of the combustion system is generated. The sources of high NOₓ production inside the OTSG are identified and correlated to the physics of the flow. CFD is, therefore, a useful tool for providing insight into the NOₓ emission phenomena in OTSG. Sources of high NOₓ production can be identified, and operating conditions can be adjusted accordingly. With the help of RANS simulations, an Emission Characteristics Map can be produced and then used as a guide for a field tune-up.
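A mesh sensitivity analysis of the kind mentioned above is often quantified with Richardson extrapolation over three systematically refined meshes. The sketch below shows that generic procedure; the sample values (a peak flame temperature on three meshes) are illustrative, not the study's results.

```python
# Sketch of a mesh-sensitivity check via Richardson extrapolation on
# three meshes with a constant refinement ratio r. Sample values are
# illustrative, not taken from the OTSG study.
import math

def richardson(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order p and extrapolated value from three mesh solutions."""
    p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
    return p, f_exact

# e.g. peak flame temperature (K) on coarse, medium, fine meshes
p, f_exact = richardson(1840.0, 1860.0, 1865.0)
print(f"observed order: {p:.2f}, extrapolated value: {f_exact:.1f} K")
```

A small gap between the fine-mesh and extrapolated values indicates the solution is effectively mesh-independent.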

Keywords: combustion, computational fluid dynamics, nitrous oxides emission, once-through-steam-generators

Procedia PDF Downloads 108
5724 Design of a Tool for Generating Test Cases from BPMN

Authors: Prat Yotyawilai, Taratip Suwannasart

Abstract:

Business Process Model and Notation (BPMN) is increasingly important in business process modeling and in creating functional models, and is an OMG standard that has become popular in various organizations and in education. Research on model-based software testing is prominent. Although most studies use UML models in software testing, few use the BPMN model to create test cases. Therefore, this research proposes the design of a tool for generating test cases from BPMN. The model is analyzed and the details of its various components are extracted before creating a flow graph. Both the component details and the flow graph are used in generating test cases.
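The final step described above, deriving test cases from the flow graph, can be sketched as a path enumeration. This is a generic depth-first traversal, not the authors' tool; the tiny flow graph (a start event, one exclusive gateway, two tasks, an end event) is a made-up example.

```python
# Sketch of generating test cases from a BPMN-derived flow graph: each
# simple path from start to end becomes one test case. The graph below
# is an invented example, not the paper's.

def all_paths(graph, start, end, path=None):
    """Depth-first enumeration of simple paths from start to end."""
    path = (path or []) + [start]
    if start == end:
        return [path]
    paths = []
    for nxt in graph.get(start, []):
        if nxt not in path:          # keep paths simple (no revisits)
            paths.extend(all_paths(graph, nxt, end, path))
    return paths

# adjacency list for a tiny BPMN-like flow graph
flow_graph = {
    "start": ["gateway"],
    "gateway": ["approve", "reject"],   # exclusive gateway: two branches
    "approve": ["end"],
    "reject": ["end"],
}

test_cases = all_paths(flow_graph, "start", "end")
for i, tc in enumerate(test_cases, 1):
    print(f"test case {i}: {' -> '.join(tc)}")
```

Loops in real BPMN models need a bounding strategy (e.g. visiting each loop at most once), which the simple-path restriction above approximates.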

Keywords: software testing, test case, BPMN, flow graph

Procedia PDF Downloads 552
5723 The Effects of Mobile Communication on the Nigerian Populace

Authors: Chapman Eze Nnadozie

Abstract:

Communication, the activity of conveying information, remains a vital resource for the growth and development of any given society. Mobile communication, popularly known as the global system for mobile communication (GSM), is a globally accepted standard for digital cellular communication. GSM, a wireless technology, remains the fastest-growing communication means worldwide. Indeed, mobile phones have become a critical business tool and part of everyday life in both developed and developing countries. This study examines the effects of mobile communication on the Nigerian populace. The methodology used in this study is the survey research method, with questionnaires as the main data collection tool. The questionnaires were administered to a total of seventy respondents in five cities across the country, namely: Aba, Enugu, Bauchi, Makurdi, and Lagos. The result reveals that, though there are some quality-of-service issues, mobile communication has very significant positive effects on the economic and social development of the Nigerian populace.

Keywords: effect, mobile communication, populace, GSM, wireless technology, mobile phone

Procedia PDF Downloads 267
5722 Evaluation of an Integrated Supersonic System for Inertial Extraction of CO₂ in Post-Combustion Streams of Fossil Fuel Operating Power Plants

Authors: Zarina Chokparova, Ighor Uzhinsky

Abstract:

Carbon dioxide emissions resulting from the burning of fossil fuels on large scales, such as in the oil industry or power plants, lead to severe implications including global temperature rise, air pollution, and other adverse impacts on the environment. Besides some precarious and costly ways of mitigating the detriment of CO₂ emissions on industrial scales (such as liquefaction of CO₂ and its deep-water treatment, or the application of adsorbents and membranes, which require careful consideration of drawback effects and their mitigation), one physically and commercially viable technology for CO₂ capture and disposal is a supersonic system for the inertial extraction of CO₂ from post-combustion streams. Because the flue gas is emitted from the combustion system with a carbon dioxide concentration of 10-15 volume percent, the waste stream is rather dilute and at low pressure. The supersonic system makes the flue gas mixture expand through a converging-diverging nozzle; the flow velocity increases to the supersonic range, resulting in a rapid drop in temperature and pressure. This conversion of potential energy into kinetic energy causes desublimation of CO₂. The solidified carbon dioxide can be sent to a separate vessel for further disposal. The major advantages of the current solution are its economic efficiency, physical stability, and compactness, as well as the fact that no additional chemical media are needed. However, several challenges remain to be addressed to optimize the system: increasing the size of the separated CO₂ particles (their effective diameters are on the micrometer scale), reducing the amount of concomitant gas separated together with the carbon dioxide, and ensuring the purity of the CO₂ downstream flow. Moreover, determination of the thermodynamic conditions of the vapor-solid mixture, including specification of a valid and accurate equation of state, remains an essential goal. Due to the high speeds and temperatures reached during the process, the influence of the emitted heat should be considered, and an applicable solution model for the compressible flow needs to be determined. In this report, a brief overview of the current technology status is presented, and a program for further evaluation of this approach is proposed.
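The temperature and pressure drop in the converging-diverging nozzle follows the standard isentropic flow relations. The sketch below evaluates them for an illustrative flue-gas state; the inlet conditions, Mach number, and the value gamma = 1.3 are assumptions for demonstration, not the study's operating point.

```python
# Sketch of the isentropic relations governing the nozzle expansion:
# T = T0 / (1 + (g-1)/2 * M^2), p = p0 / (1 + (g-1)/2 * M^2)^(g/(g-1)).
# Inlet state, Mach number, and gamma are illustrative assumptions.

def isentropic_state(T0, p0, mach, gamma=1.3):
    """Static temperature and pressure at a given Mach number."""
    factor = 1.0 + 0.5 * (gamma - 1.0) * mach ** 2
    T = T0 / factor
    p = p0 / factor ** (gamma / (gamma - 1.0))
    return T, p

# flue gas stagnation state: 400 K, 1.2 bar; expansion to Mach 2
T, p = isentropic_state(400.0, 1.2e5, 2.0)
print(f"static temperature: {T:.0f} K, static pressure: {p / 1e5:.3f} bar")
```

Even this rough estimate shows how strongly a modest supersonic Mach number depresses the static temperature, which is the mechanism driving CO₂ desublimation.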

Keywords: CO₂ sequestration, converging diverging nozzle, fossil fuel power plant emissions, inertial CO₂ extraction, supersonic post-combustion carbon dioxide capture

Procedia PDF Downloads 139
5721 Transformative Concept of Logic to Islamic Science: Reflections on Al-Ghazālī's Influence

Authors: Umar Sheikh Tahir

Abstract:

Before al-Ghazālī, Islamic scholars perceived logic as an intrusive knowledge. The discipline therefore did not receive ample attention among scholars regarding how it should be adapted into the Islamic sciences. General scholarship in that period rejected logic as an instrumental knowledge. This attitude remained unquestioned among scholars of different perspectives, with a diversification of suggestions, in the pre-al-Ghazālī period. However, al-Ghazālī proclaimed a new perspective that transformed logic from an 'intrusive knowledge' into a useful tool for the Islamic sciences. This study explores the contributions of al-Ghazālī to epistemology regarding the use and relevance of logic. The study applies a qualitative research methodology, dealing strictly with secondary data from medieval-age and contemporary sources. The study concludes that al-Ghazālī's contributions, which supported the transformation of logic into a useful tool in the Muslim world, were drawn from his experience within the Islamic tradition. He succeeded in reconciling the Islamic tradition with the wisdom of the Greek sciences.

Keywords: Al-Ghazālī, classical logic, epistemology, Islamdom and Islamic sciences

Procedia PDF Downloads 239
5720 Possibility of Creating Polygon Layers from Raster Layers Obtained by using Classic Image Processing Software: Case of Geological Map of Rwanda

Authors: Louis Nahimana

Abstract:

Most maps are in raster or PDF format, and it is not easy to get vector layers of published maps. Faced with the production of a simplified geological map of the northern Lake Tanganyika countries without geological information in vector format, I tried a method of obtaining vector layers from raster layers created from geological maps of Rwanda and DR Congo in PDF and JPG format. The procedure was as follows: the original raster maps were georeferenced using ArcGIS 10.2. In Adobe Photoshop, map areas with the same color, corresponding to a lithostratigraphic unit, were selected across the whole map and saved in a specific raster layer. Using the same image processing software, each RGB raster layer was converted to grayscale and enhanced before importation into ArcGIS 10. After georeferencing, each lithostratigraphic raster layer was transformed into a multitude of polygons with the "Raster to Polygon (Conversion)" tool. Thereafter, the "Aggregate Polygons (Cartography)" tool allowed a single polygon layer to be obtained. Repeating the same steps for each color corresponding to a homogeneous rock unit, it was possible to reconstruct the simplified geological constitution of Rwanda and the Democratic Republic of Congo in vector format. By using the "Append (Management)" tool, the vector layers obtained were combined with those from Burundi to achieve vector layers of the geology of the northern Lake Tanganyika countries.
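The core of the raster-to-polygon step, grouping same-color pixels into connected regions that a GIS tool can then trace into polygons, can be sketched in pure Python. This is a generic 4-connectivity flood fill on a toy grid, not the ArcGIS implementation; the values 1 and 2 stand for two lithostratigraphic unit colors.

```python
# Pure-Python sketch of the region-grouping behind "Raster to Polygon":
# same-value, 4-connected pixels form one region. Toy 3x3 grid example.

def label_regions(grid):
    """Return a dict region_id -> list of (row, col) of same-value regions."""
    rows, cols = len(grid), len(grid[0])
    labels = [[None] * cols for _ in range(rows)]
    regions = {}
    next_id = 0
    for r in range(rows):
        for c in range(cols):
            if labels[r][c] is not None:
                continue
            # flood fill the region containing (r, c)
            value, stack = grid[r][c], [(r, c)]
            labels[r][c] = next_id
            cells = []
            while stack:
                y, x = stack.pop()
                cells.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and labels[ny][nx] is None
                            and grid[ny][nx] == value):
                        labels[ny][nx] = next_id
                        stack.append((ny, nx))
            regions[next_id] = cells
            next_id += 1
    return regions

# 1 and 2 stand for two lithostratigraphic unit colours
grid = [[1, 1, 2],
        [1, 2, 2],
        [2, 2, 2]]
regions = label_regions(grid)
print(f"{len(regions)} regions found")
```

Each region's boundary cells are what the conversion tool traces into a polygon outline, and adjacent polygons of the same unit are then merged, which is what "Aggregate Polygons" does at map scale.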

Keywords: creating raster layer under image processing software, raster to polygon, aggregate polygons, adobe photoshop

Procedia PDF Downloads 438
5719 The Studies of the Sorption Capabilities of the Porous Microspheres with Lignin

Authors: M. Goliszek, M. Sobiesiak, O. Sevastyanova, B. Podkoscielna

Abstract:

Lignin is one of the three main constituents of biomass, together with cellulose and hemicellulose. It is a complex biopolymer whose structure contains a large number of functional groups, including aliphatic and aromatic hydroxyl groups, carboxylic groups and methoxy groups, which is why it shows potential for sorption processes. Lignin is a highly cross-linked polymer with a three-dimensional structure that can provide a large surface area and pore volume. It can also exhibit favourable dispersion, diffusion and mass-transfer behavior in the removal of, e.g., heavy metal ions or aromatic pollutants. In this work, an emulsion-suspension copolymerization method was used to synthesize porous microspheres of divinylbenzene (DVB), styrene (St) and lignin; microspheres without lignin were also prepared for comparison. Before copolymerization, the lignin was modified with methacryloyl chloride to improve its reactivity with the other monomers. The physico-chemical properties of the obtained microspheres were studied, e.g., pore structure (adsorption-desorption measurements), thermal properties (DSC), swelling tendency and actual shape. Owing to their well-developed porous structure and the presence of functional groups, these materials may have great potential in sorption processes. To estimate the sorption capability of the microspheres towards phenol and its chlorinated derivatives, the off-line SPE (solid-phase extraction) method will be applied. This method has various advantages: it is low-cost, easy to use, and enables rapid measurements for a large number of chemicals. The efficiency of the materials in removing phenols from aqueous solution and in desorption processes will be evaluated.

Keywords: microspheres, lignin, sorption, solid-phase extraction

Procedia PDF Downloads 180
5718 Application of Recycled Tungsten Carbide Powder for Fabrication of Iron Based Powder Metallurgy Alloy

Authors: Yukinori Taniguchi, Kazuyoshi Kurita, Kohei Mizuta, Keigo Nishitani, Ryuichi Fukuda

Abstract:

Tungsten carbide is widely used as a tool material in metal manufacturing processes. Since tungsten is a typical rare metal, establishing a recycling process for tungsten carbide tools and restoring them into cemented carbide material would have a great impact on the metal manufacturing industry. Recently, recycling processes for tungsten carbide have gradually been developed and established. However, the quality demands on cemented carbide tools are quite severe, because hardness, toughness, wear resistance, heat resistance, fatigue strength and so on must be guaranteed for precision machining and tool life. Currently, it is difficult to restore recycled tungsten carbide powder entirely as a raw material for newly processed cemented carbide tools. In this study, to suggest a positive use of recycled tungsten carbide powder, we attempted to fabricate a carbon-based sintered steel whose mechanical properties are reinforced by recycled tungsten carbide powder. A set of newly designed sintered steels was prepared, and compression tests were conducted on sintered specimens with a density ratio of 0.85 (i.e., 15% internal porosity). As a result, the nominal strength was at least 1.7 times higher when 7.0 wt.% recycled WC powder was added, reaching over 600 MPa for the Fe-WC-Co-Cu sintered alloy. Wear tests were conducted with a ball-on-disk friction tester using a 5 mm diameter ball under a normal force of 2 N in dry conditions. The wear after a 1,000 m running distance showed that the designed sintered alloy had about 1.5 times longer life. Since the tensile tests showed the same tendency, it is concluded that the designed sintered alloy can be used for various mechanical parts requiring particular strength and wear resistance at relatively low cost, owing to the recycled tungsten carbide powder.

Keywords: tungsten carbide, recycle process, compression test, powder metallurgy, anti-wear ability

Procedia PDF Downloads 245
5717 Device for Reversible Hydrogen Isotope Storage with Aluminum Oxide Ceramic Case

Authors: Igor P. Maximkin, Arkady A. Yukhimchuk, Victor V. Baluev, Igor L. Malkov, Rafael K. Musyaev, Damir T. Sitdikov, Alexey V. Buchirin, Vasily V. Tikhonov

Abstract:

Minimizing tritium diffusion leakage when developing devices that handle tritium-containing media is a key problem whose solution would substantially enhance radiation safety and minimize diffusion losses of expensive tritium. One way to solve this problem is to use high-strength, non-porous Al₂O₃ ceramic as the structural material of the bed body. This alumina ceramic offers high strength, but its main advantages are low hydrogen permeability (compared with the structural materials currently used) and high dielectric properties. The latter enables direct induction heating of a hydride-forming metal without substantially heating the pressure and containment vessel. The use of alumina ceramic and induction heating allows: - a substantial reduction in tritium extraction time; - a reduction of tritium diffusion leakage by several orders of magnitude; - more complete extraction of tritium from metal hydrides, since the hydride can be heated up to melting in the event of final disposal of the device. The paper presents computational and experimental results for a tritium bed designed to absorb 6 liters of tritium, with titanium used as the hydrogen isotope sorbent. Results of hydrogen release kinetics from the hydride-forming metal, strength tests and cyclic service life tests are reported. Recommendations are also provided for the practical use of this bed type.

Keywords: aluminum oxide ceramic, hydrogen pressure, hydrogen isotope storage, titanium hydride

Procedia PDF Downloads 398
5716 Influence of the Cooking Technique on the Iodine Content of Frozen Hake

Authors: F. Deng, R. Sanchez, A. Beltran, S. Maestre

Abstract:

The high nutritional value associated with seafood is related to the presence of essential trace elements. Moreover, seafood is considered an important source of energy, proteins, and long-chain polyunsaturated fatty acids. Seafood is generally consumed cooked, so its nutritional value may be degraded. Seafood, such as fish, shellfish, and seaweed, is one of the main dietary sources of iodine, and deficient or excessive iodine intake can cause dysfunction and pathologies of the thyroid gland. The main objective of this work was to evaluate iodine stability in hake (Merluccius) subjected to different culinary techniques: boiling, steaming, microwave cooking, baking, cooking en papillote (a twisted cover shaped like a sweet wrapper), and coating with a flour batter and deep-frying. Iodine was determined by Inductively Coupled Plasma Mass Spectrometry (ICP-MS). Regarding sample handling strategies, liquid-liquid extraction has proven to be a powerful pre-concentration and clean-up approach for trace metal analysis by ICP techniques; extraction with tetramethylammonium hydroxide (the TMAH reagent) was used as the sample preparation method in this work. Based on the results, it can be concluded that iodine was degraded by the cooking processes. The greatest degradation was observed for boiling and microwave cooking, where the iodine content of the hake decreased by up to 60% and 52%, respectively. However, if the boiling liquid is retained, the loss generated during cooking is reduced. Only cooking en papillote preserved the iodine content.

Keywords: cooking process, ICP-MS, iodine, hake

Procedia PDF Downloads 136
5715 Digital Forensics Showdown: Encase and FTK Head-to-Head

Authors: Rida Nasir, Waseem Iqbal

Abstract:

Due to the constant revolution in technology and the increase in anti-forensic techniques used by attackers to remove their traces, professionals often struggle to choose the best tool for digital forensic investigations. This paper compares two of the most well-known and widely used licensed commercial tools, EnCase and FTK. The comparison was drawn on various parameters and features to provide an authentic evaluation of the licensed versions of these tools against various real-world scenarios. To gauge the popularity of these tools within the digital forensic community, a public survey was conducted to determine the preferred choice. The dataset used is the Computer Forensic Reference Data Set (CFReDS). A total of 70 features were selected from various categories. Upon comparison, both FTK and EnCase produce remarkable results; however, each tool has some limitations, and neither can be declared the best. The comparison drawn is completely unbiased and based on factual data.

Keywords: digital forensics, commercial tools, investigation, forensic evaluation

Procedia PDF Downloads 12
5714 Exploring Time-Series Phosphoproteomic Datasets in the Context of Network Models

Authors: Sandeep Kaur, Jenny Vuong, Marcel Julliard, Sean O'Donoghue

Abstract:

Time-series data are useful for modelling, as they enable model evaluation. However, when reconstructing models from phosphoproteomic data, non-exact methods are often used, because knowledge of the network structure, such as which kinases and phosphatases lead to the observed phosphorylation state, is incomplete. Such reactions are therefore often hypothesised, which gives rise to uncertainty. Here, we propose a framework, implemented via a web-based tool (as an extension to Minardo), which can generate κ models from time-series phosphoproteomic datasets. The incompleteness and uncertainty in the generated model and its reactions are presented clearly to the user through visualisation. Furthermore, we demonstrate, via a toy EGF signalling model, the use of algorithmic verification to verify κ models: manually formulated requirements were evaluated against the model, highlighting the nodes causing unsatisfiability (i.e., error-causing nodes). We aim to integrate such methods into our web-based tool and demonstrate how the identified erroneous nodes can be presented to the user visually. Thus, this research presents a framework enabling a user to explore time-series phosphoproteomic data in the context of models. The user can visualise which reactions in the model are highly uncertain and which nodes cause incorrect simulation outputs. Such a tool enables an end-user to determine which empirical analyses to perform in order to reduce uncertainty in the presented model, thus enabling a better understanding of the underlying system.

Keywords: κ-models, model verification, time-series phosphoproteomic datasets, uncertainty and error visualisation

Procedia PDF Downloads 248
5713 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore's law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques, combined with HTML tables, to extract and represent critical timing/noise data. When applying this data-mining tool in real applications, running speed matters: the software employs table look-up techniques in its implementation, and performance testing confirmed reasonable running speed. We also added several advanced features for the application to one industrial chip design.
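The table-mining step described above can be sketched as follows, assuming a simple HTML timing report whose rows hold a path name and a slack value (the column layout here is a hypothetical illustration, not the paper's actual report format), using only the Python standard library:

```python
from html.parser import HTMLParser

class TimingTableParser(HTMLParser):
    """Collect the cell text of each <tr>/<td> row of an HTML table."""
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cell = ""
        self.row = []
        self.rows = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True
            self.cell = ""
        elif tag == "tr":
            self.row = []

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False
            self.row.append(self.cell.strip())
        elif tag == "tr" and self.row:
            self.rows.append(self.row)

    def handle_data(self, data):
        if self.in_cell:
            self.cell += data  # data may arrive in chunks

def critical_paths(html, slack_col=1):
    """Table look-up: return the path names whose slack is negative."""
    parser = TimingTableParser()
    parser.feed(html)
    return [row[0] for row in parser.rows if float(row[slack_col]) < 0]
```

A report with thousands of rows can be filtered this way in one pass, which is in the spirit of the table look-up approach the abstract credits for the tool's running speed.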

Keywords: VLSI design, data mining, big data, HTML tables, web, EDA, timing, noise

Procedia PDF Downloads 251
5712 AS-Geo: Arbitrary-Sized Image Geolocalization with Learnable Geometric Enhancement Resizer

Authors: Huayuan Lu, Chunfang Yang, Ma Zhu, Baojun Qi, Yaqiong Qiao, Jiangqian Xu

Abstract:

Image geolocalization has great application prospects in fields such as autonomous driving and virtual/augmented reality. In practical scenarios, the size of the image to be located is not fixed, and it is impractical to train different networks for all possible sizes. When an image's size does not match the input size of the descriptor extraction model, existing geolocalization methods usually scale or crop the image in common ways. This loses information important to the geolocalization task and degrades performance: for example, excessive down-sampling can blur building contours, and inappropriate cropping can discard key semantic elements, leading to incorrect geolocation results. To address this problem, this paper designs a learnable image resizer and proposes an arbitrary-sized image geolocalization method. (1) The learnable image resizer employs a self-attention mechanism to enhance the geometric features of the resized image. First, bilinear interpolation is applied to the input image and its feature maps to obtain an initial resized image and resized feature maps. Then, SKNet (selective kernel network) is used to approximate the best receptive field, preserving the geometric shapes of the original image, and SENet (squeeze-and-excitation network) automatically selects the feature maps with strong contour information, enhancing the geometric features. Finally, the enhanced geometric features are fused with the initial resized image to obtain the final resized image. (2) The proposed geolocalization method embeds this resizer as a front layer of the descriptor extraction network, which not only makes the network compatible with arbitrary-sized input images but also enhances the geometric features crucial to the geolocalization task. Moreover, a triplet attention mechanism is added after the first convolutional layer of the backbone network to better exploit the geometric elements that layer extracts. Finally, the local features extracted by the backbone network are aggregated to form image descriptors for geolocalization. The proposed method was evaluated on several mainstream datasets, such as Pittsburgh30K, Tokyo 24/7, and Places365. The results show that it has excellent size compatibility and compares favorably with recent mainstream geolocalization methods.
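The bilinear interpolation step at the heart of the resizer can be sketched in pure Python for a single channel (the learnable SKNet/SENet enhancement stages are omitted; this shows only the plain resampling that the method builds on):

```python
def bilinear_resize(img, out_h, out_w):
    """Resize a 2-D list of pixel values to (out_h, out_w)
    by bilinear interpolation."""
    in_h, in_w = len(img), len(img[0])

    def src(i, n_out, n_in):
        # map output index back to a fractional input coordinate
        return i * (n_in - 1) / (n_out - 1) if n_out > 1 else 0.0

    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        y = src(i, out_h, in_h)
        y0 = int(y)
        y1 = min(y0 + 1, in_h - 1)
        fy = y - y0
        for j in range(out_w):
            x = src(j, out_w, in_w)
            x0 = int(x)
            x1 = min(x0 + 1, in_w - 1)
            fx = x - x0
            # blend the four surrounding pixels
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[i][j] = top * (1 - fy) + bot * fy
    return out
```

In the paper's pipeline this plain resampling is followed by the learnable geometric enhancement, which is what compensates for the contour blurring that naive down-sampling causes.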

Keywords: image geolocalization, self-attention mechanism, image resizer, geometric feature

Procedia PDF Downloads 210
5711 Impacts of Climate Change and Natural Gas Operations on the Hydrology of Northeastern BC, Canada: Quantifying the Water Budget for Coles Lake

Authors: Sina Abadzadesahraei, Stephen Déry, John Rex

Abstract:

Climate research has repeatedly identified strong associations between anthropogenic emissions of greenhouse gases and the observed increase in global mean surface air temperature over the past century. Studies have also demonstrated that the degree of warming varies regionally, and Canada is not exempt: evidence is mounting that climate change is beginning to cause diverse impacts in both environmental and socio-economic spheres. For example, northeastern British Columbia (BC), whose climate is controlled by a combination of maritime, continental and arctic influences, is warming at a greater rate than the rest of the province. There are indications that these changing conditions are already shifting patterns in the region's hydrological cycle, and thus its available water resources. Coincident with these changes, northeastern BC is undergoing rapid development for oil and gas extraction, which depends largely on subsurface hydraulic fracturing ('fracking') and uses enormous amounts of freshwater. While this industrial activity has made substantial contributions to the regional and provincial economies, it is important to ensure that sufficient and sustainable water supplies remain available for all those dependent on the resource, including ecological systems. This in turn demands a comprehensive understanding of how water in all its forms interacts with landscapes and the atmosphere, and of the potential impacts of changing climatic conditions on these processes. The aim of this study is therefore to characterize and quantify all components of the water budget of the small Coles Lake watershed (141.8 km², 100 km north of Fort Nelson, BC) through a combination of field observations and numerical modelling.
This baseline information will aid in assessing the sustainability of current and future plans for freshwater extraction by the oil and gas industry, and will help maintain the precarious balance between economic and environmental well-being. The project exemplifies interdisciplinary research in that it not only examines the hydrology of the region but also investigates how natural gas operations and growth can affect water resources; accordingly, a collaboration between academia, government and industry has been established to fulfill the research objectives in a meaningful manner. The project aims to provide numerous benefits to BC communities, and its outcomes and detailed information can be a valuable asset to researchers examining the effects of climate change on water resources worldwide.
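A watershed water budget of the kind this study quantifies reduces, in its simplest annual form, to a balance equation: storage change equals precipitation minus evapotranspiration minus runoff. A minimal sketch follows, with purely hypothetical numbers (not Coles Lake data):

```python
def water_budget(precip_mm, evapotrans_mm, runoff_mm, area_km2):
    """Annual water balance for a watershed:
    dS = P - ET - Q (all fluxes in mm of water depth).
    Returns the storage change in mm and as a volume in m^3."""
    ds_mm = precip_mm - evapotrans_mm - runoff_mm
    # 1 mm of depth over 1 km^2 corresponds to 1,000 m^3 of water
    ds_m3 = ds_mm * area_km2 * 1000.0
    return ds_mm, ds_m3

# hypothetical annual fluxes for a 141.8 km^2 watershed
ds_mm, ds_m3 = water_budget(450.0, 300.0, 100.0, 141.8)
```

In practice each term is itself estimated from field observations or modelled (snowmelt, groundwater exchange, lake storage), but this identity is the accounting framework the field campaign and the numerical model must both close.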

Keywords: northeastern British Columbia, water resources, climate change, oil and gas extraction

Procedia PDF Downloads 260
5710 Characterisation of Fractions Extracted from Sorghum Byproducts

Authors: Prima Luna, Afroditi Chatzifragkou, Dimitris Charalampopoulos

Abstract:

Sorghum byproducts, namely bran, stalk, and panicle, are examples of lignocellulosic biomass. These raw materials contain large amounts of polysaccharides, in particular hemicelluloses, celluloses, and lignins, which, if efficiently extracted, can be used to develop a range of added-value products with potential applications in the agriculture and food-packaging sectors. The aim of this study was to characterise fractions extracted from sorghum bran and stalk with regard to the physicochemical properties that determine their applicability as food-packaging materials. A sequential alkaline extraction was applied to isolate cellulosic, hemicellulosic and lignin fractions from sorghum stalk and bran. Lignin content, phenolic content and antioxidant capacity were also investigated for the lignin fraction. Thermal analysis using differential scanning calorimetry (DSC) and X-ray diffraction (XRD) revealed that the glass transition temperature (Tg) of the stalk cellulose fraction was ~78.33 °C in the amorphous state (~65%), with a water content of ~5%. For hemicellulose, the Tg of the stalk was slightly lower than that of the bran in the amorphous state (~54%), and the water content was lower (~2%). Hemicelluloses generally showed lower thermal stability than cellulose, probably due to their lack of crystallinity. Additionally, the bran had a higher arabinose-to-xylose ratio (0.82) than the stalk, a fact that indicated its low crystallinity. Furthermore, the lignin fraction had a Tg of ~93 °C in the amorphous state (~11%). The stalk-derived lignin fraction contained more phenolic compounds (mainly p-coumaric and ferulic acid) and had a higher lignin content and antioxidant capacity than the bran-derived lignin fraction.

Keywords: alkaline extraction, bran, cellulose, hemicellulose, lignin, stalk

Procedia PDF Downloads 293
5709 Design, Development by Functional Analysis in UML and Static Test of a Multimedia Voice and Video Communication Platform on IP for a Use Adapted to the Context of Local Businesses in Lubumbashi

Authors: Blaise Fyama, Elie Museng, Grace Mukoma

Abstract:

In this article we present a Java implementation of video telephony using SIP (Session Initiation Protocol). After a functional analysis of the SIP protocol, we built on the work of researchers at the University of Parma, Italy, to obtain adequate libraries for the development of our own communication tool. To optimize the code and improve the prototype, we applied, in an incremental approach, test techniques based on static analysis, evaluating the complexity of the software through metrics, including McCabe's cyclomatic number. The objective is to promote the emergence of local start-ups producing IP video in a well-understood local context. The result is a video telephony tool with optimized code.
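McCabe's cyclomatic number used in the static analysis counts the linearly independent paths through a routine; for a single method it equals the number of decision points plus one. A rough token-counting sketch of the metric follows (an approximation for illustration, not a full control-flow analysis and not the authors' tooling):

```python
import re

# constructs that add a branch to the control-flow graph (Java-like code)
DECISION_KEYWORDS = ("if", "for", "while", "case", "catch", "&&", "||", "?")

def cyclomatic_complexity(source):
    """Approximate McCabe cyclomatic number of one method body:
    M = (number of decision points) + 1."""
    # pull out identifiers plus the short-circuit / ternary operators
    tokens = re.findall(r"\w+|&&|\|\||\?", source)
    decisions = sum(1 for t in tokens if t in DECISION_KEYWORDS)
    return decisions + 1
```

Applied incrementally after each change, such a measure flags the methods whose branching has grown enough to warrant refactoring, which is the role the metric plays in the optimization loop described above.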

Keywords: static analysis, McCabe cyclomatic complexity metric, SIP, UML

Procedia PDF Downloads 112
5708 The Use of Budgeting as an Effective Management Tool for Small, Medium and Micro Enterprises during COVID-19 Pandemic: A Perspective from South Africa

Authors: Abongile Zweni, Grate Moyo, Ricardo Peters, Bingwen Yan

Abstract:

Budgets are among the most important management tools for organisations, big or small. When organisations, particularly Small, Medium and Micro Enterprises (SMMEs), do not use budgets, they are liable to fail in their infancy. The aim of this study was to assess whether SMMEs in South Africa used budgets as an effective management tool during the COVID-19 pandemic. Data were collected using an online questionnaire (survey), following a quantitative research approach, and descriptive statistics were used to address the research question. The study found that most SMMEs did not use budgets during the COVID-19 pandemic; one reason, among others, was that most of them had to close down during the lockdown, and some did not even qualify for government bailouts or grants.

Keywords: budget management, SMMEs, COVID-19, South Africa

Procedia PDF Downloads 186