Search results for: filtered closures
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 229

109 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas

Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards

Abstract:

Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, and spatial analysis. A laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data consist mainly of ground points (representing the bare earth) and non-ground points (representing buildings, trees, cars, etc.). Removing all non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is a difficult and challenging task, as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data, and the presented filter utilizes a weight function to allocate a weight to each point of the data. Furthermore, unlike most methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples. The average root mean square error (RMSE) value is 0.35 m.
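The accuracy metric reported above is a straightforward computation. A minimal sketch (the checkpoint elevations below are hypothetical, not the study's data) of RMSE between a DTM built from filtered points and reference terrain heights:

```python
import numpy as np

def dtm_rmse(dtm_heights, reference_heights):
    """Root mean square error between DTM heights and reference
    terrain heights sampled at the same checkpoints."""
    diff = np.asarray(dtm_heights) - np.asarray(reference_heights)
    return float(np.sqrt(np.mean(diff ** 2)))

# Hypothetical checkpoint elevations (metres)
dtm = [102.10, 98.75, 101.40, 99.90]
ref = [102.00, 98.50, 101.75, 99.60]
print(round(dtm_rmse(dtm, ref), 3))  # → 0.267
```

In practice the same formula would be applied over thousands of checkpoints per test area before averaging across the three sites.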

Keywords: airborne laser scanning, digital terrain models, filtering, forested areas

Procedia PDF Downloads 117
108 Restored CO₂ from Flue Gas and Utilization by Converting to Methanol by 3 Step Processes: Steam Reforming, Reverse Water Gas Shift and Hydrogenation

Authors: Rujira Jitrwung, Kuntima Krekkeitsakul, Weerawat Patthaveekongka, Chiraphat Kumpidet, Jarukit Tepkeaw, Krissana Jaikengdee, Anantachai Wannajampa

Abstract:

Flue gas discharged from a coal- or gas-fired power plant contains around 12% carbon dioxide (CO₂), 6% oxygen (O₂), and 82% nitrogen (N₂). CO₂ is a greenhouse gas implicated in global warming, and Carbon Capture, Utilization, and Storage (CCUS) is one tool for addressing it. Flue gas is drawn down from the chimney, filtered, and compressed to 8 bar. The compressed flue gas is sent to a three-stage Pressure Swing Adsorption (PSA) unit filled with activated carbon. Experiments showed an optimum adsorption pressure of 7 bar, at which CO₂ is enriched step by step in the 1st, 2nd, and 3rd stages to concentrations of 29.8, 66.4, and 96.7%, respectively. The mixed gas from the last stage is composed of 96.7% CO₂, 2.7% N₂, and 0.6% O₂. This CO₂-rich product from the three-stage PSA is ready for methanol synthesis. The mixed CO₂ was tested in a 5 Liter/day methanol synthesis reactor skid using a three-step process: steam reforming, reverse water gas shift, and then hydrogenation. The results showed that mixed CO₂/CH₄ ratios of 70/30, 50/50, 30/70, and 10/90% (v/v) yielded 2.4, 4.3, 5.6, and 6.0 Liter/day of methanol and saved 40, 30, 20, and 5% of CO₂, respectively. The optimum condition, balancing methanol yield and CO₂ consumption, used a CO₂/CH₄ ratio of 43/57% (v/v), which yielded 4.8 Liter/day of methanol and saved 27% of CO₂ compared with traditional methanol production from methane steam reforming (5 Liter/day), which consumes no CO₂.

Keywords: carbon capture utilization and storage, pressure swing adsorption, reforming, reverse water gas shift, methanol

Procedia PDF Downloads 151
107 Effective Dose and Size Specific Dose Estimation with and without Tube Current Modulation for Thoracic Computed Tomography Examinations: A Phantom Study

Authors: S. Gharbi, S. Labidi, M. Mars, M. Chelli, F. Ladeb

Abstract:

The purpose of this study is to reduce the radiation dose for chest CT examinations by adding Tube Current Modulation (TCM) to a standard CT protocol. A scan of an anthropomorphic male Alderson phantom was performed on a 128-slice scanner. The effective dose (ED) in scans with and without mAs modulation was estimated by multiplying the Dose Length Product (DLP) by a conversion factor, and the results were compared to those measured with CT-Expo software. The size-specific dose estimate (SSDE) values were obtained by multiplying the volume CT dose index (CTDIvol) by a size conversion factor related to the phantom's effective diameter. Objective assessment of image quality was performed with Signal to Noise Ratio (SNR) measurements in the phantom, and SPSS software was used for data analysis. Results showed that with CARE Dose 4D included, ED was lowered by 48.35% and 51.51% according to the DLP and CT-Expo estimates, respectively. In addition, ED ranged between 7.01 mSv and 6.6 mSv for the standard protocol, and between 3.62 mSv and 3.2 mSv with TCM. Similar results were found for SSDE: the dose was higher without TCM, at 16.25 mGy, and was lower by 48.8% with TCM. The calculated SNR values were significantly different (p = 0.03 < 0.05); the highest was measured on images acquired with TCM and reconstructed with filtered back projection (FBP). In conclusion, this study demonstrates the potential of the TCM technique to reduce SSDE and ED while conserving image quality at a high diagnostic reference level for thoracic CT examinations.
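Both dose quantities described above are simple products of scanner outputs and conversion coefficients. A hedged sketch: the k-factor of 0.014 mSv·mGy⁻¹·cm⁻¹ is the commonly tabulated adult-chest value, and the DLP, CTDIvol, and size-factor numbers below are illustrative assumptions, not the study's measurements:

```python
# Effective dose from dose-length product: ED = k * DLP,
# where k is a body-region-specific conversion coefficient.
def effective_dose(dlp_mgy_cm, k=0.014):  # k for adult chest (illustrative)
    return k * dlp_mgy_cm

# Size-specific dose estimate: SSDE = f_size * CTDIvol,
# where f_size depends on the patient's effective diameter.
def ssde(ctdivol_mgy, f_size):
    return f_size * ctdivol_mgy

print(effective_dose(500.0))  # 7.0 mSv (hypothetical DLP without TCM)
print(effective_dose(258.0))  # 3.612 mSv (hypothetical DLP with TCM)
print(ssde(12.5, 1.3))        # 16.25 mGy (hypothetical CTDIvol and f_size)
```

The study's CT-Expo comparison replaces the fixed k-factor with a phantom-based Monte Carlo estimate, which is why the two ED reductions differ slightly.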

Keywords: anthropomorphic phantom, computed tomography, CT-expo, radiation dose

Procedia PDF Downloads 191
106 Integrated Geotechnical and Geophysical Investigation of a Proposed Construction Site at Mowe, Southwestern Nigeria

Authors: Kayode Festus Oyedele, Sunday Oladele, Adaora Chibundu Nduka

Abstract:

The subsurface of a proposed site for building development in Mowe, Nigeria, was investigated using the Standard Penetration Test (SPT) and Cone Penetrometer Test (CPT), supplemented with Horizontal Electrical Profiling (HEP), with the aim of evaluating the suitability of the strata as foundation materials. Four SPT and CPT soundings were implemented using a 10-tonne hammer. HEP utilizing the Wenner array was performed with inter-electrode spacings of 10-60 m along four traverses, each coincident with an SPT and CPT location. The HEP data were processed using DIPRO software, and textural filtering of the resulting resistivity sections was implemented to enable delineation of hidden layers. Sandy lateritic clay, silty lateritic clay, clay, clayey sand and sand horizons were delineated. The SPT "N" values defined very soft to soft sandy lateritic clay (<4), stiff silty lateritic clay (7-12), very stiff silty clay (12-15), clayey sand (15-20) and sand (27-37). Sandy lateritic clay (5-40 kg/cm²) and silty lateritic clay (25-65 kg/cm²) were defined from the CPT response. Sandy lateritic clay (220-750 Ωm), clay (<50 Ωm) and sand (415-5359 Ωm) were delineated from the resistivity sections, with two thin layers of silty lateritic clay and clayey sand defined in the texturally filtered resistivity sections. This study concluded that the presence of thick (18 m), incompetent clayey materials beneath the study area makes it unsuitable for a shallow foundation. A deep foundation, piling through the clayey layers to the competent sand at 20 m depth, was recommended.

Keywords: cone penetrometer, foundation, lithologic texture, resistivity section, standard penetration test

Procedia PDF Downloads 221
105 Distributional and Developmental Analysis of PM2.5 in Beijing, China

Authors: Alexander K. Guo

Abstract:

PM2.5 poses a large threat to people's health and the environment and is an issue of great concern in Beijing, brought to the government's attention by the media. In addition, both the United States Embassy in Beijing and the government of China have increased monitoring of PM2.5 in recent years and have made real-time data available to the public. This report utilizes hourly historical data (2008-2016) from the U.S. Embassy in Beijing for the first time. The first objective was to fit probability distributions to the data to better predict the number of days exceeding the standard, and the second was to uncover any yearly, seasonal, monthly, daily, and hourly patterns and trends that may arise, for a better understanding of air-control policy. In these data, 66,650 hours and 2,687 days provided valid data. Lognormal, gamma, and Weibull distributions were fit to the data through estimation of their parameters, and the chi-squared test was employed to compare the actual data with the fitted distributions. The data were used to uncover trends, patterns, and improvements in PM2.5 concentration over the period with valid data, as well as over specific periods that received large amounts of media attention, which were analyzed to better understand the causes of air pollution. The data clearly indicate that Beijing's air quality is unhealthy, with an average of 94.07 µg/m³ across all 66,650 hours with valid data. No single distribution fit the entire dataset of 2,687 days well, but each of the three distribution types above was optimal in at least one of the yearly data sets, with the lognormal distribution fitting recent years better. An improvement in air quality beginning in 2014 was discovered: the first five months of 2016 reported an average PM2.5 concentration 23.8% lower than the average of the same period across all years, perhaps the result of various new pollution-control policies.
It was also found that the winter and fall months contained more days in both the good and extremely polluted categories, leading to a higher average but a comparable median in these months. Additionally, the evening hours, especially in winter, reported much higher PM2.5 concentrations than the afternoon hours, possibly due to the daytime prohibition of trucks in the city and the increased use of coal for heating in the colder months when residents are home in the evening. Lastly, through analysis of special intervals that attracted media attention for either unnaturally good or bad air quality, the government's temporary pollution-control measures, such as more intensive road-space rationing and factory closures, are shown to be effective. In summary, air quality in Beijing is improving steadily and does follow standard probability distributions to an extent, but still needs improvement. The analysis will be updated when new data become available.
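The distribution-fitting and chi-squared steps described above can be sketched generically. This is an illustrative reconstruction on synthetic data, not the report's code: a lognormal is fitted by maximum likelihood on the log of the concentrations, and a chi-squared statistic compares binned observed counts with expected counts from the fitted CDF.

```python
import math
import numpy as np

rng = np.random.default_rng(0)
# Synthetic hourly PM2.5 concentrations (µg/m³), roughly lognormal
pm25 = rng.lognormal(mean=4.3, sigma=0.8, size=5000)

# MLE fit of a lognormal: mean and std of the log-concentrations
mu, sigma = np.log(pm25).mean(), np.log(pm25).std()

def lognorm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2))))

# Chi-squared statistic over ten quantile-based bins
edges = np.quantile(pm25, np.linspace(0, 1, 11))
observed, _ = np.histogram(pm25, bins=edges)
expected = np.array([
    (lognorm_cdf(hi, mu, sigma) - lognorm_cdf(lo, mu, sigma)) * pm25.size
    for lo, hi in zip(edges[:-1], edges[1:])
])
chi2 = float(((observed - expected) ** 2 / expected).sum())
print(mu, sigma, chi2)  # small chi2 indicates a good fit
```

The same fit-and-test loop, run per year, is what lets the report conclude that the lognormal is preferred in recent years while other years favour the gamma or Weibull.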

Keywords: Beijing, distribution, patterns, pm2.5, trends

Procedia PDF Downloads 219
104 The Adoption and Use of Social Media as a Source of Information by Egyptian Government Journalists

Authors: Essam Mansour

Abstract:

This study aims to explore the adoption and use of social media as a source of information by Egyptian government journalists. It applied a survey to a total of 386 journalists representing the three official newspapers of Egypt. Findings showed that 27.2% of the journalists did not use social media; they were mainly males (69.7%), older than 40 years (77.7%) and mostly held a BA degree (80.4%). The other 72.8% did use these platforms; they were also mostly males (59.1%), younger than 40 years (65.9%) and mostly held a BA degree (93.2%). More than two-thirds (69.9%) were relatively long-standing users whose experience ranged from seven to ten years, and more than two-thirds (73.5%) had been using these platforms heavily (four to more than six hours a day). Such results confirm that a large proportion (95.7%) of users were at least advanced users. Home and work were the most significant places for users to access these platforms, which were found to be easy and useful to use. The types of social media most used were social news, media sharing, microblogging, blog comments and forums, social networking sites and bookmarking sites, employed to perform tasks such as finding information, communicating, keeping up to date, checking materials, sharing information and holding discussions. A large number of users tended to accept these media platforms as a source of information because they are accessible, linked to updated reference sources, accurate, promote current work, convenient, secure, credible, reliable, stable, easily identified, copyrighted, confidence-building and contain filtered information. However, lack of know-how in citing sources, followed by lack of credibility of the source of news, lack of quality of information sources and lack of time, were the most significant barriers journalists faced when using social media platforms.

Keywords: social media, social networking sites, newspapers, journalists, Egypt

Procedia PDF Downloads 231
103 Predicting the Human Impact of Natural Onset Disasters Using Pattern Recognition Techniques and Rule Based Clustering

Authors: Sara Hasani

Abstract:

This research focuses on natural sudden-onset disasters, characterised as 'occurring with little or no warning and often causing excessive injuries far surpassing the national response capacities'. Based on panel analysis of the historic record of 4,252 natural onset disasters between 1980 and 2015, a predictive method was developed to predict the human impact of a disaster (fatalities, injured, homeless) with less than 3% error. The geographical dispersion of the disasters includes every country for which data were available and could be cross-examined from various humanitarian sources. The records were filtered to the 4,252 disasters for which the five predictive variables (disaster type, HDI, DRI, population, and population density) were clearly stated. The procedure was designed as a combination of pattern recognition techniques and rule-based clustering for prediction, with discriminant analysis used to further validate the results. The results indicate that there is a relationship between a disaster's human impact and the five socio-economic characteristics of the affected country mentioned above. As a result, a framework was put forward which could predict a disaster's human impact based on its severity rank in the early hours after a disaster strikes. The predictions in this model are outlined in worst- and best-case scenarios, which respectively inform the lower and higher ranges of the prediction. The necessity of developing the predictive framework is highlighted by the fact that, despite the existing research in the literature, a framework for predicting the human impact and estimating needs at the time of a disaster has yet to be developed. This can further be used to allocate resources in the response phase of a disaster, when data are scarce.

Keywords: disaster management, natural disaster, pattern recognition, prediction

Procedia PDF Downloads 128
102 Non-Uniform Filter Banks-Based Minimum Distance to Riemannian Mean Classification in Motor Imagery Brain-Computer Interface

Authors: Ping Tan, Xiaomeng Su, Yi Shen

Abstract:

The motion intention in the motor imagery brain-computer interface is identified by classifying the event-related desynchronization (ERD) and event-related synchronization (ERS) characteristics of the sensorimotor rhythm (SMR) in EEG signals. When the subject imagines moving different limbs or different body parts, the rhythm components and bandwidth change, and this varies from person to person. Finding the effective sensorimotor frequency band of each subject is therefore directly related to the classification accuracy of the brain-computer interface. To solve this problem, this paper proposes a Minimum Distance to Riemannian Mean (MDRM) classification method based on non-uniform filter banks. During the training phase, the EEG signals are first decomposed into multiple signals of different bandwidths using a bank of band-pass filters; the spatial covariance matrix of each frequency-band signal is then computed and used as the feature. These features are classified by the MDRM method, and cross-validation is employed to select the effective sensorimotor frequency bands. During the test phase, the test signals are filtered by the band-pass filters of the effective sensorimotor frequency bands, and the extracted spatial covariance features are classified using MDRM. Experiments on the BCI Competition IV 2a dataset show that the proposed method is superior to other classification methods.
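The MDRM step can be sketched with numpy. This is a generic illustration on synthetic two-class covariance matrices, not the paper's implementation: a log-Euclidean mean stands in for the iterative Riemannian mean, and a test covariance is assigned to the class whose mean is closest under the affine-invariant distance δ(A,B), computed from the eigenvalues of A⁻¹B.

```python
import numpy as np

def riemann_dist(A, B):
    # delta(A,B) = sqrt(sum_i log^2 lambda_i), lambda_i eigenvalues of A^-1 B
    lam = np.linalg.eigvals(np.linalg.solve(A, B)).real
    return float(np.sqrt((np.log(lam) ** 2).sum()))

def log_euclid_mean(covs):
    # Log-Euclidean mean: average the matrix logarithms, then exponentiate.
    # A one-step approximation of the true Riemannian (geometric) mean.
    logs = []
    for C in covs:
        w, V = np.linalg.eigh(C)
        logs.append(V @ np.diag(np.log(w)) @ V.T)
    M = np.mean(logs, axis=0)
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.exp(w)) @ V.T

def mdrm_predict(C, class_means):
    # Assign to the class whose mean covariance is nearest
    return int(np.argmin([riemann_dist(C, M) for M in class_means]))

# Two synthetic classes of 4x4 SPD matrices (channel variances differ)
rng = np.random.default_rng(1)
def spd(scale):
    X = rng.normal(size=(50, 4)) * scale
    return X.T @ X / 50

class0 = [spd([1, 1, 1, 1]) for _ in range(20)]
class1 = [spd([3, 1, 1, 1]) for _ in range(20)]
means = [log_euclid_mean(class0), log_euclid_mean(class1)]
pred = mdrm_predict(spd([3, 1, 1, 1]), means)
print(pred)  # → 1
```

In the paper's pipeline one such classifier is run per band-pass filter, and cross-validated accuracy decides which bands are retained.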

Keywords: non-uniform filter banks, motor imagery, brain-computer interface, minimum distance to Riemannian mean

Procedia PDF Downloads 82
101 Egg Hatching Inhibition Activity of Volatile Oils Extracted from Some Medicinal and Aromatic Plants against Root-Knot Nematode Meloidogyne hapla

Authors: Anil F. Felek, Mehmet M. Ozcan, Faruk Akyazi

Abstract:

Volatile oils of medicinal and aromatic plants are important for managing nematological problems in agriculture. In the present study, volatile oils extracted from five medicinal and aromatic plants, Origanum onites (flower+stem+leaf), Salvia officinalis (leaf), Lippia citriodora (leaf+seed), Mentha spicata (leaf) and Mentha longifolia (leaf), were tested for egg-hatching inhibition activity against the root-knot nematode Meloidogyne hapla under laboratory conditions. The essential oils were extracted using the water distillation method with a Clevenger system. For homogenisation of the oils, a 2% gum arabic solution was used, and 4 µl of oil was added to 1 ml of filtered gum arabic solution to prepare the final stock solution. 5 ml of stock solution and 1 ml of M. hapla egg suspension (about 100 eggs) were added to Petri dishes, with gum arabic solution used as the control. Seven days after exposure to the oils at room temperature (26±2 °C), the cumulative hatched and unhatched eggs were counted under a 40X inverted light microscope, and Abbott's formula was used to calculate egg-hatching inhibition rates. The highest inhibition rate was found to be 54% for O. onites; the inhibition rates for the other plants, S. officinalis, M. longifolia, M. spicata and L. citriodora, were 31.4%, 21.6%, 23.8% and 25.67%, respectively. Carvacrol was found to be the main component (68.8%) of O. onites, followed by thujone (27.77%) for S. officinalis, l-menthone (76.92%) for M. longifolia, carvone (27.05%) for M. spicata and citral (19.32%) for L. citriodora.
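Abbott's correction used for the inhibition rates above is a one-line calculation against the control hatch rate. A sketch with hypothetical egg counts (the abstract does not give the raw counts):

```python
def abbott_inhibition(hatched_treated, total_treated,
                      hatched_control, total_control):
    """Abbott-corrected egg-hatching inhibition (%), relative to control."""
    p_treated = hatched_treated / total_treated
    p_control = hatched_control / total_control
    return (p_control - p_treated) / p_control * 100.0

# Hypothetical counts: 90/100 eggs hatch in the control, 41/99 under an oil
print(round(abbott_inhibition(41, 99, 90, 100), 1))  # → 54.0
```

The correction matters because some eggs fail to hatch even in the untreated control; dividing by the control hatch rate attributes only the excess non-hatching to the oil.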

Keywords: egg hatching, Meloidogyne hapla, medicinal and aromatic plants, root-knot nematodes, volatile oils

Procedia PDF Downloads 228
100 A Structured Mechanism for Identifying Political Influencers on Social Media Platforms: Top 10 Saudi Political Twitter Users

Authors: Ahmad Alsolami, Darren Mundy, Manuel Hernandez-Perez

Abstract:

Social media networks, such as Twitter, offer the perfect opportunity to positively or negatively affect political attitudes among large audiences. The existence of influential users, who have developed a reputation for their knowledge and experience of specific topics, is a major factor contributing to this impact. Therefore, knowledge of the mechanisms for identifying influential users on social media is vital for understanding their effect on their audience. The concept of the influential user is related to the concept of 'opinion leaders', the idea that ideas first flow from mass media to opinion leaders and then to the rest of the population. Hence, the objective of this research was to provide reliable and accurate structural mechanisms for identifying influential users that could be applied to different platforms, places, and subjects. Twitter was selected as the platform of interest, and Saudi Arabia as the context for the investigation, because Saudi Arabia has a large number of Twitter users, some of whom are considerably active in setting agendas and disseminating ideas. The study considered the scientific methods that have previously been used to identify public opinion leaders, utilizing metrics software on Twitter. The key findings propose multiple novel metrics for comparing Twitter influencers, including the number of followers, social authority and the use of political hashtags, along with four secondary filtering measures. Using ratio and percentage calculations to classify the most influential users, Twitter accounts were filtered, analyzed and included. The structured approach is used as a mechanism to explore the top ten influencers on Twitter in the political domain in Saudi Arabia.
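The ratio-and-percentage classification described above can be sketched generically. The metric names, accounts, and equal weights below are illustrative assumptions, not the paper's exact formula: each account's raw metrics are normalized against the sample maximum and combined into a composite score used for ranking.

```python
def influence_scores(accounts, weights=None):
    """Rank accounts by a weighted sum of max-normalized metrics.
    `accounts` maps name -> dict of raw metrics (same keys for all)."""
    metrics = next(iter(accounts.values())).keys()
    weights = weights or {m: 1.0 for m in metrics}
    maxima = {m: max(a[m] for a in accounts.values()) for m in metrics}
    scores = {
        name: sum(weights[m] * a[m] / maxima[m] for m in metrics)
        for name, a in accounts.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

sample = {  # hypothetical accounts and metric values
    "user_a": {"followers": 2_000_000, "social_authority": 80, "political_hashtags": 120},
    "user_b": {"followers": 500_000, "social_authority": 95, "political_hashtags": 300},
    "user_c": {"followers": 3_000_000, "social_authority": 60, "political_hashtags": 40},
}
ranking = influence_scores(sample)
print([name for name, _ in ranking])  # → ['user_b', 'user_a', 'user_c']
```

Normalizing each metric to its maximum keeps follower counts (millions) from drowning out bounded scores such as social authority, which is the point of using ratios rather than raw values.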

Keywords: Twitter, influencers, structured mechanism, Saudi Arabia

Procedia PDF Downloads 94
99 Reuse of Wastewater After Pretreatment Under Teril and Sand in Bechar City

Authors: Sara Seddiki, Maazouzi Abdelhak

Abstract:

The main objective of this work is to follow the physicochemical and bacteriological evolution of wastewater from the town of Bechar subjected to purification by filtration through various local media, namely sand and Terrill, reducing the nuisances borne by the receiving environment (Oued Bechar) and thereby making this water source reusable in different areas. The study first characterized the urban wastewater of the Bechar wadi, which presents an environmental threat, allowing an estimation of the pollutant load: the chemical oxygen demand (COD, 145 mg/l) and the biological oxygen demand (BOD5, 72 mg/l) revealed that these waters are less biodegradable (COD/BOD5 ratio = 0.62), have a fairly high conductivity (2.76 mS/cm), and carry high levels of mineral matter, represented by chlorides and sulphates at 390 and 596.1 mg/l respectively, with a pH of 8.1. Characterization of the dune sand (Beni Abbes) shows that quartz (97%) is the dominant mineral. Granulometric analysis allowed determination of parameters such as the uniformity coefficient (CU) and the equivalent diameter, and scanning electron microscope (SEM) observations and X-ray analyses were performed. The study of the filtered wastewater shows satisfactory and very encouraging treatment results, with complete elimination of total coliforms and streptococci and a good reduction of total aerobic germs in the sand and clay-sand filters. A good yield was reported in the sand-Terrill filter for the reduction of turbidity. The reduction rates of organic matter, in terms of the recorded biological and chemical oxygen demand, are of the order of 60%. The elimination of sulphates is 40% for the sand filter.

Keywords: urban wastewater, filtration, bacteriological and physicochemical parameters, sand, Terrill, Oued Bechar

Procedia PDF Downloads 54
98 Management Effects on Different Sustainable Agricultural Practices with Diverse Topography

Authors: Kusay Wheib, Alexandra Krvchenko

Abstract:

Crop yields are influenced by many factors, including natural ones, such as the soil and environmental characteristics of the agricultural land, and man-made ones, such as management practices. One factor that frequently affects crop yields in undulating Midwest landscapes is topography, which controls the movement of water and the nutrients necessary for plant life. The main objective of this study is to examine how field topography influences the performance of different management practices in the undulating terrain of southwest Michigan. A total of 26 agricultural fields, ranging in size from 1.1 to 7.4 ha, from the Scale-Up at Kellogg Biological Station were included in the study. The two studied factors were crop species, with three levels, i.e., corn (Zea mays L.), soybean (Glycine max L.), and wheat (Triticum aestivum L.), and management practice, with three levels, i.e., conventional, low-input, and organic management. They were compared under three contrasting topographical settings, namely summit (including summits and shoulders), slope (including backslopes), and depression (including footslopes and toeslopes). Yield data for the years 2007 through 2012 were processed, cleaned, and filtered, and the average yield was then calculated for each field, topographic setting, and year. Topographic parameters, including terrain slope, curvature, flow direction and wetness index, were computed in an ArcGIS environment for each topographic class of each field to examine their effects on yield. Results showed that topographical depressions produced the greatest yields in most of the studied fields, while managements with chemical inputs, both low-input and conventional, resulted in higher yields than organic management.

Keywords: sustainable agriculture, precision agriculture, topography, yield

Procedia PDF Downloads 90
97 The Effects of Various Storage Scenarios on the Viability of Rooibos Tea Characteristically Used for Research

Authors: Daniella L. Pereira, Emeliana G. Imperial, Ingrid Webster, Ian Wiid, Hans Strijdom, Nireshni Chellan, Sanet H. Kotzé

Abstract:

Rooibos (Aspalathus linearis) is a shrub-like bush native to the Western Cape of South Africa and commonly consumed as a herbal tea. Interest in the antioxidant capabilities of the tea has risen based on anecdotal evidence. Rooibos contains polyphenols that contribute to the overall antioxidant capacity of the tea; these polyphenols have been reported to attenuate the effects of oxidative stress in biological systems. The bioavailability of these compounds is compromised when exposed to light, pH fluctuations, and oxidation. It is therefore crucial to evaluate whether the polyphenols in a typical rooibos solution remain constant over time when administered to rats in a research environment. This study aimed to determine the effects of various storage scenarios on the phenolic composition of rooibos tea commonly administered to rodents in experimental studies. A standardised aqueous solution of rooibos tea was filtered and divided into three samples, namely fresh, refrigerated, and frozen, stored in airtight, light-excluding bottles. Refrigerated samples were stored at 4 °C for seven days; frozen samples were stored for fourteen days at -20 °C. Each sample consisted of two subgroups, labeled day 1 and day 7; the teas marked day 7 in each group were kept in airtight, light-protected bottles at room temperature for an additional week. All samples (n=6) were freeze-dried and underwent polyphenol characterization using liquid chromatography-mass spectrometry. The phenolic composition remained constant across all groups, indicating that rooibos tea can be safely stored under the above conditions without compromising the phenolic viability of tea typically used for research purposes.

Keywords: Aspalathus linearis, experimental studies, polyphenols, storage

Procedia PDF Downloads 199
96 Preparation of Nano-Scaled LiNbO3 by the Polyol Method

Authors: Gabriella Dravecz, László Péter, Zsolt Kis

Abstract:

The growth of optical LiNbO3 single crystals and their physical and chemical properties are well known on the macroscopic scale. Nowadays, rare-earth-doped single crystals have become important for coherent quantum-optical experiments: electromagnetically induced transparency, slowing of light pulses, and coherent quantum memory. The expansion of applications increasingly requires the production of nanoscaled LiNbO3 particles. For example, rare-earth-doped nanoscaled lithium niobate particles can act as single-photon sources, which could form the basis of a quantum-computer coding system providing complete inaccessibility to strangers. The polyol method is a chemical synthesis in which oxide formation occurs instead of hydroxide formation because of the high temperature; moreover, the polyol medium limits the growth and agglomeration of the grains, producing particles with diameters of 30-200 nm. In this work, nanoscaled LiNbO3 was prepared by the polyol method. The starting materials (niobium oxalate and LiOH) were dissolved in H2O2, then suspended in ethylene glycol and heated to about the boiling point of the mixture with intensive stirring. After thermal equilibrium was reached, the mixture was kept at this temperature for 4 hours. The suspension was cooled overnight, the mixture was centrifuged, and the particles were filtered. Dynamic Light Scattering (DLS) measurements found the particle size to be 80-100 nm, which was confirmed by Scanning Electron Microscope (SEM) investigations; SEM elemental analysis showed a large amount of Nb in the sample. The production of LiNbO3 nanoparticles by the polyol method was successful: agglomeration of the particles was avoided, and sizes of 80-100 nm were achieved.

Keywords: lithium-niobate, nanoparticles, polyol, SEM

Procedia PDF Downloads 105
95 A Structured Mechanism for Identifying Political Influencers on Social Media Platforms Top 10 Saudi Political Twitter Users

Authors: Ahmad Alsolami, Darren Mundy, Manuel Hernandez-Perez

Abstract:

Social media networks, such as Twitter, offer the perfect opportunity to positively or negatively affect political attitudes among large audiences. A most important factor contributing to this effect is the existence of influential users, who have developed a reputation for their awareness and experience of specific subjects. Therefore, knowledge of the mechanisms for identifying influential users on social media is vital for understanding their effect on their audience. The concept of the influential user is based on the pioneering work of Katz and Lazarsfeld (1959), who created the concept of 'opinion leaders' to indicate that ideas first flow from mass media to opinion leaders and then to the rest of the population. Hence, the objective of this research was to provide reliable and accurate structural mechanisms for identifying influential users that could be applied to different platforms, places, and subjects. Twitter was selected as the platform of interest, and Saudi Arabia as the context for the investigation, because Saudi Arabia has a large number of Twitter users, some of whom are considerably active in setting agendas and disseminating ideas. The study considered the scientific methods that have previously been used to identify public opinion leaders, utilizing metrics software on Twitter. The key findings propose multiple novel metrics for comparing Twitter influencers, including the number of followers, social authority and the use of political hashtags, along with four secondary filtering measures. Using ratio and percentage calculations to classify the most influential users, Twitter accounts were filtered, analyzed and included. The structured approach is used as a mechanism to explore the top ten influencers on Twitter in the political domain in Saudi Arabia.

Keywords: twitter, influencers, structured mechanism, Saudi Arabia

Procedia PDF Downloads 104
94 Comparative Analysis of the Treatment of Okra Seed and Soy Beans Oil with Crude Enzyme Extract from Malted Rice

Authors: Eduzor Esther, Uhiara Ngozi, Ya’u Abubakar Umar, Anayo Jacob Gabriel, Umar Ahmed

Abstract:

The study investigated the characteristic effects of treating okra seed and soybean oils with crude enzyme extract from malted rice. The oils from okra seeds and soybeans were obtained by solvent extraction using n-hexane; soybean seeds gave a higher percentage oil yield than okra seeds. 250 ml of each oil was thoroughly mixed with 5 ml of the malted rice extract at 40 °C for 5 min, then filtered and regarded as treated oil, while another 250 ml batch of each oil was not mixed with the malted rice extract and regarded as untreated oil. All the oils were analyzed for specific gravity, refractive index, emulsification capacity, absorptivity, TSS and viscosity. For both okra seed and soybean oil, the treated oils gave higher values than the untreated oils for specific gravity, emulsification capacity, absorptivity, TSS and viscosity. However, the refractive index was higher for the untreated oils than for the treated oils. The treated oils thus showed better quality with respect to the parameters analyzed, except the refractive index, which is slightly lower but still within the standard range; the oils are high in unsaturation, especially okra oil when compared with soybean oil. It is recommended that treated okra seed and soybean oils can serve better than many oils presently in use, such as groundnut oil, palm oil and cottonseed oil.

Keywords: extract, malted, oil, okra, rice, seed, soybeans

Procedia PDF Downloads 405
93 Modeling Average Paths Traveled by Ferry Vessels Using AIS Data

Authors: Devin Simmons

Abstract:

At the USDOT’s Bureau of Transportation Statistics, a biannual census of ferry operators in the U.S. is conducted, with results such as route mileage used to determine federal funding levels for operators. AIS data allow for the possibility of using GIS software and geographical methods to confirm operator-reported mileage for individual ferry routes. As part of the USDOT’s work on the ferry census, an algorithm was developed that uses AIS data for ferry vessels in conjunction with known ferry terminal locations to model the average route traveled, for use both as a cartographic product and as confirmation of operator-reported mileage. AIS data from each vessel are first analyzed to determine individual journeys based on the vessel’s velocity and changes in velocity over time. These trips are then converted to geographic linestring objects. Using the terminal locations, the algorithm then determines whether each trip represents a known ferry route. Given a large enough dataset, routes will be represented by multiple trip linestrings, which are then filtered by DBSCAN spatial clustering to remove outliers. Finally, the remaining trips are averaged into one route. The algorithm first interpolates the point on each trip linestring that represents the start of the journey; the centroid of these points becomes the first point of the average route. Each trip is then interpolated again to find the point representing one percent of the journey’s completion, and the centroid of those points is used as the next point in the average route, and so on until 100 points have been calculated. Routes created using this algorithm have shown demonstrable improvement over previous methods, which included the implementation of a LOESS model. Additionally, the algorithm greatly reduces the amount of manual digitizing needed to visualize ferry activity.
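The percentage-interpolation averaging described above can be sketched in plain Python (an illustrative simplification, not the BTS implementation; trips are plain coordinate lists rather than GIS linestring objects, and `interpolate`/`average_route` are hypothetical helper names):

```python
import math

def interpolate(line, frac):
    """Return the point at fraction `frac` (0..1) of the polyline's length."""
    seglens = [math.dist(a, b) for a, b in zip(line, line[1:])]
    target = frac * sum(seglens)
    for (a, b), length in zip(zip(line, line[1:]), seglens):
        last = (a, b) == (line[-2], line[-1])
        if target <= length or last:
            t = min(target / length, 1.0) if length else 0.0
            return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
        target -= length

def average_route(trips, n_points=100):
    """Average several trip polylines into one representative route:
    the i-th route point is the centroid of each trip's point at the
    same fraction of journey completion."""
    route = []
    for i in range(n_points + 1):
        pts = [interpolate(t, i / n_points) for t in trips]
        route.append((sum(p[0] for p in pts) / len(pts),
                      sum(p[1] for p in pts) / len(pts)))
    return route

# two parallel trips between the same pair of terminals
trips = [[(0.0, 0.0), (2.0, 0.0)], [(0.0, 2.0), (2.0, 2.0)]]
route = average_route(trips, n_points=2)
```

With 100 interpolation steps, the result is the 101-point average route the abstract describes.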

Keywords: ferry vessels, transportation, modeling, AIS data

Procedia PDF Downloads 132
92 Computational Screening of Secretory Proteins with Brain-Specific Expression in Glioblastoma Multiforme

Authors: Sumera, Sanila Amber, Fatima Javed Mirza, Amjad Ali, Saadia Zahid

Abstract:

Glioblastoma multiforme (GBM) is a widespread and fatal primary brain tumor with an increased risk of relapse in spite of aggressive treatment. Current procedures for GBM diagnosis are invasive, i.e. resection or biopsy to acquire tumor mass. Implementing minimally invasive tests as a potential diagnostic technique, and biofluid-based monitoring of GBM, depends on discovering biomarkers in CSF and blood. Therefore, we performed a comprehensive in silico analysis to identify potential circulating biomarkers for GBM. Initially, six gene and protein databases were utilized to mine brain-specific proteins. The resulting proteins were filtered using a pipeline of five tools to predict secretory proteins. Subsequently, the expression profile of the secreted proteins was verified in brain and blood using two databases. Additional verification of the resulting proteins was done using the Plasma Proteome Database (PPD) to confirm their presence in blood. The final set of proteins was searched in the literature for their relationship with GBM, with special emphasis on the secretome proteome. Initially, 2,145 proteins were mined as brain-specific, out of which 69 were identified as secretory in nature. Verification of the expression profile in brain and blood eliminated 58 of these 69 proteins, providing a list of 11 proteins. Verification of these 11 proteins against PPD eliminated 2 more, giving a final set of nine secretory proteins, i.e. OPCML, NPTX1, LGI1, CNTN2, LY6H, SLIT1, CREG2, GDF1, and SERPINI1. Of these 9 proteins, 7 were found to be linked to GBM, whereas 2 have not been investigated in GBM so far. We propose that these secretory proteins can serve as potential circulating biomarker signatures of GBM and will facilitate the development of minimally invasive diagnostic methods and novel therapeutic interventions for GBM.
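The successive database filtering described above can be illustrated as simple set operations (a schematic sketch only; the predicate functions and protein lists below are stand-ins, not the actual prediction tools or databases used in the study):

```python
def screen_secretory(brain_specific, secretory_predictors, blood_detected,
                     min_votes=5):
    """Successive filtering: brain-specific proteins -> predicted secretory
    by enough tools -> detectable in blood (e.g. a plasma proteome list)."""
    secretory = {p for p in brain_specific
                 if sum(pred(p) for pred in secretory_predictors) >= min_votes}
    return secretory & blood_detected

# toy inputs: five mock secretory predictors and a mock blood proteome
brain = {"OPCML", "NPTX1", "GFAP"}
predictors = [lambda p: p != "GFAP"] * 5
blood = {"OPCML", "NPTX1", "ALB"}
candidates = screen_secretory(brain, predictors, blood)
```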

Keywords: glioblastoma multiforme, secretory proteins, brain secretome, biomarkers

Procedia PDF Downloads 120
91 Building Transparent Supply Chains through Digital Tracing

Authors: Penina Orenstein

Abstract:

In today’s world, particularly with COVID-19 posing a constant worldwide threat, organizations need greater visibility over their supply chains more than ever before, in order to find areas for improvement and greater efficiency, reduce the chances of disruption, and stay competitive. Supply chain mapping is the practice of mapping every process and route in detail between each vendor and supplier. The simplest method of mapping involves sourcing publicly available data, including news and financial information concerning relationships between suppliers. An additional layer of information would be disclosed by large, direct suppliers about their production and logistics sites. While this method has the advantage of not requiring any input from suppliers, it also doesn't allow for much transparency beyond the first supplier tier and may generate irrelevant data (noise) that must be filtered out to find the actionable data. The primary goal of this research is to build data maps of supply chains using a layered approach. Using these maps, the secondary goal is to address whether the supply chain can be re-engineered to make improvements, for example, to lower the carbon footprint. Using a drill-down approach, the end result is a comprehensive map detailing the linkages between tier-one, tier-two, and tier-three suppliers, superimposed on a geographical map. The driving force behind this idea is the ability to trace individual parts to the exact site where they are manufactured. In this way, companies can ensure sustainability practices from the production of raw materials through to the finished goods. The approach allows companies to identify and anticipate vulnerabilities in their supply chain, unlocks predictive analytics capabilities, and enables them to act proactively. The research is particularly compelling because it unites network science theory with empirical data and presents the results in a visual, intuitive manner.
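The tiered drill-down can be sketched as a breadth-first traversal of the supplier graph (an illustrative sketch only; `supplies` maps each supplier to its sub-suppliers, and the data are invented):

```python
from collections import deque

def map_tiers(direct_suppliers, supplies, max_tier=3):
    """Assign a tier number to every supplier reachable from the focal
    company's direct (tier-one) suppliers, stopping at max_tier."""
    tier = {s: 1 for s in direct_suppliers}
    queue = deque(direct_suppliers)
    while queue:
        s = queue.popleft()
        if tier[s] >= max_tier:
            continue  # do not drill below the requested depth
        for sub in supplies.get(s, ()):
            if sub not in tier:  # first discovery fixes the tier
                tier[sub] = tier[s] + 1
                queue.append(sub)
    return tier

# invented supplier relationships: A and B are direct suppliers
supplies = {"A": ["C"], "B": ["C"], "C": ["D"], "D": ["E"]}
tiers = map_tiers(["A", "B"], supplies)
```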

Keywords: data mining, supply chain, empirical research, data mapping

Procedia PDF Downloads 146
90 Modified Weibull Approach for Bridge Deterioration Modelling

Authors: Niroshan K. Walgama Wellalage, Tieling Zhang, Richard Dwight

Abstract:

State-based Markov deterioration models (SMDM) sometimes fail to find accurate transition probability matrix (TPM) values and hence lead to invalid future condition predictions or incorrect average deterioration rates, mainly due to drawbacks of existing nonlinear optimization-based algorithms and/or the subjective function types used for regression analysis. Furthermore, a set of separate functions relating each condition state to age cannot be directly derived from a Markov model for a given bridge element group, although this is of interest to industrial partners. This paper presents a new approach for generating homogeneous SMDM output, namely the Modified Weibull approach, which consists of a set of appropriate functions to describe the predicted percentage of bridge elements in each condition state. These functions are combined with a Bayesian approach and Metropolis-Hastings algorithm (MHA) based Markov Chain Monte Carlo (MCMC) simulation for quantifying the uncertainty in model parameter estimates. In this study, factors contributing to rail bridge deterioration were identified. Inspection data for 1,000 Australian railway bridges over 15 years were reviewed and filtered accordingly based on real operational experience. A network-level deterioration model for a typical bridge element group was developed using the proposed Modified Weibull approach. The condition state predictions obtained from this method were validated using statistical hypothesis tests with a test data set. Results show that the proposed model is able not only to predict conditions at the network level accurately but also to capture the model uncertainties within a given confidence interval.
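The MCMC step can be illustrated with a minimal random-walk Metropolis-Hastings sampler (a generic sketch, not the paper's model; the target below is a standard normal log-posterior purely for demonstration):

```python
import math
import random

def metropolis_hastings(log_post, x0, n_samples=5000, step=0.5, seed=1):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        cand = x + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        # accept with probability min(1, exp(lp_cand - lp))
        if math.log(rng.random()) < lp_cand - lp:
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

# demonstration target: standard normal log-posterior
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

In the paper's setting, `log_post` would be the log-posterior of the Weibull model parameters given the inspection data, and the sample spread quantifies the parameter uncertainty.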

Keywords: bridge deterioration modelling, Modified Weibull approach, MCMC, Metropolis-Hastings algorithm, Bayesian approach, Markov deterioration models

Procedia PDF Downloads 693
89 Combustion Characteristics of Wet Woody Biomass in a Grate Furnace: Including Measurements within the Bed

Authors: Narges Razmjoo, Hamid Sefidari, Michael Strand

Abstract:

Biomass combustion is a growing technique for heat and power production owing to increasingly stringent regulations on CO₂ emissions. Grate-fired systems have been regarded as a common and popular combustion technology for burning woody biomass. However, some grate furnaces are not well optimized and may emit significant amounts of unwanted compounds such as dust, NOx, CO, and unburned gaseous components. The combustion characteristics inside the fuel bed are of practical interest, as they are directly related to the release of volatiles and affect the stability and efficiency of fuel bed combustion. Although numerous studies have been presented on the grate firing of biomass, to the authors' knowledge, none of them has conducted a detailed experimental study within the fuel bed. It is difficult to conduct measurements of temperature and gas species inside the burning fuel bed in full-scale boilers, and results from such in-bed measurements can also be applied by numerical experts for modeling fuel bed combustion. The current work presents an experimental investigation into the combustion behavior of wet woody biomass (53% moisture) in a 4 MW reciprocating grate boiler, focusing on the gas species distribution along the height of the fuel bed. The local concentrations of gases (CO, CO₂, CH₄, NO, and O₂) inside the fuel bed were measured through a glass port situated on the side wall of the furnace. The measurements were carried out at five different heights of the fuel bed by means of a bent stainless steel probe containing a type-K thermocouple. The sample gas extracted from the fuel bed through the probe was filtered and dried and then analyzed using two infrared spectrometers. Temperatures of about 200-1100 °C were measured close to the grate, indicating that char combustion occurs at the bottom of the fuel bed and propagates upward. The CO and CO₂ concentrations varied in the ranges of 15-35 vol% and 3-16 vol%, respectively, and the NO concentration varied between 10 and 140 ppm. The profile of the gas concentration distribution along the bed height provided a good overview of the combustion sub-processes in the fuel bed.

Keywords: experimental, fuel bed, grate firing, wood combustion

Procedia PDF Downloads 301
88 Vibro-Tactile Equalizer for Musical Energy-Valence Categorization

Authors: Dhanya Nair, Nicholas Mirchandani

Abstract:

Musical haptic systems can enhance a listener’s musical experience while providing an alternative platform for the hearing impaired to experience music. Current music tactile technologies focus on representing tactile metronomes to synchronize performers or on encoding musical notes into distinguishable (albeit distracting) tactile patterns. There is growing interest in the development of musical haptic systems to augment the auditory experience, although the haptic-music relationship is still not well understood. This paper presents a tactile music interface that provides vibrations to multiple fingertips in synchrony with auditory music. Like an audio equalizer, different frequency bands are filtered out, and the power in each frequency band is computed and converted to a corresponding vibrational strength. These vibrations are felt on different fingertips, each corresponding to a different frequency band. Songs from different parts of the musical spectrum, as classified by their energy and valence, were used to test the effectiveness of the system and to understand the relationship between music and tactile sensations. Three participants were trained on one song categorized as sad (low energy and valence scores) and one song categorized as happy (high energy and valence scores). They were trained both with and without auditory feedback (listening to the song while experiencing the tactile music on their fingertips, and then experiencing the vibrations alone without the music). The participants were then tested on three songs from both categories, without any auditory feedback, and were asked to classify the tactile vibrations they felt into either category. The participants were blinded to the songs being tested and were not given any feedback on the accuracy of their classification. These participants were able to classify the music with 100% accuracy. Although the songs tested were at two opposite ends of the spectrum (sad/happy), the preliminary results show the potential of a vibrotactile equalizer like the one presented for augmenting the musical experience while furthering the current understanding of the music-tactile relationship.
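The equalizer-like band-splitting step can be sketched with a naive discrete Fourier transform (an illustrative toy, not the system's signal chain; the band edges and test tone are arbitrary choices):

```python
import math

def band_powers(signal, sample_rate, bands):
    """Power in each (low_hz, high_hz) band via a naive DFT."""
    n = len(signal)
    powers = []
    for lo, hi in bands:
        p = 0.0
        for k in range(n // 2):
            f = k * sample_rate / n
            if lo <= f < hi:
                re = sum(signal[t] * math.cos(2 * math.pi * k * t / n)
                         for t in range(n))
                im = sum(signal[t] * math.sin(2 * math.pi * k * t / n)
                         for t in range(n))
                p += (re * re + im * im) / (n * n)
        powers.append(p)
    return powers

def to_vibration_strengths(powers):
    """Normalize band powers to per-fingertip strengths in [0, 1]."""
    peak = max(powers) or 1.0
    return [p / peak for p in powers]

# a 440 Hz tone: its energy should drive only the middle fingertip
sr, n = 8000, 400
tone = [math.sin(2 * math.pi * 440 * t / sr) for t in range(n)]
strengths = to_vibration_strengths(
    band_powers(tone, sr, [(0, 200), (200, 800), (800, 4000)]))
```

In practice an FFT would replace the naive DFT, and each strength would drive the actuator on the corresponding fingertip.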

Keywords: haptic music relationship, tactile equalizer, tactile music, vibrations and mood

Procedia PDF Downloads 146
87 Synthesis, Characterization, and Catalytic Application of Modified Hierarchical Zeolites

Authors: A. Feliczak Guzik, I. Nowak

Abstract:

Zeolites, classified as microporous materials, are a large group of crystalline aluminosilicate materials commonly used in the chemical industry. These materials are characterized by a large specific surface area, high adsorption capacity, and hydrothermal and thermal stability. However, the micropores present in them impose strong mass-transfer limitations, resulting in low catalytic performance. Consequently, mesoporous (hierarchical) zeolites have attracted considerable attention from researchers. These materials possess additional porosity in the mesopore size region (2-50 nm according to IUPAC). Mesoporous zeolites, based on commercial MFI-type zeolites modified with silver, were synthesized as follows: 0.5 g of zeolite was dispersed in a mixture containing CTABr (template), water, ethanol, and ammonia under ultrasound for 30 min at 65 °C. The silicon source, tetraethyl orthosilicate, was then added and stirred for 4 h, after which silver(I) nitrate was added. In a further step, the whole mixture was filtered and washed with a water:ethanol mixture. The template was removed by calcination at 550 °C for 5 h. All the materials obtained were characterized by the following techniques: X-ray diffraction (XRD), transmission electron microscopy (TEM), scanning electron microscopy (SEM), nitrogen adsorption/desorption isotherms, and FTIR spectroscopy. X-ray diffraction and low-temperature nitrogen adsorption/desorption isotherms revealed additional secondary porosity. Moreover, the structure of the commercial zeolite was preserved during most of the material syntheses. The aforementioned materials were used in the epoxidation of cyclohexene using conventional heating and microwave heating. The composition of the reaction mixture was analyzed every 1 h by gas chromatography. As a result, about 60% conversion of cyclohexene and high selectivity to the desired reaction products, i.e. 1,2-epoxycyclohexane and 1,2-cyclohexanediol, were obtained.

Keywords: catalytic application, characterization, epoxidation, hierarchical zeolites, synthesis

Procedia PDF Downloads 63
86 Improving Search Engine Performance by Removing Indexes to Malicious URLs

Authors: Durga Toshniwal, Lokesh Agrawal

Abstract:

As the web continues to play an increasing role in information exchange and daily activities, computer users have become the targets of miscreants who infect hosts with malware or adware for financial gain. Unfortunately, even a single visit to a compromised web site enables the attacker to detect vulnerabilities in the user’s applications and force the download of a multitude of malware binaries. We provide an approach to effectively scan for so-called drive-by downloads on the Internet. Drive-by downloads result from URLs that attempt to exploit their visitors and cause malware to be installed and run automatically. To scan the web for malicious pages, the first step is to use a crawler to collect URLs that live on the Internet, and then to apply fast prefiltering techniques to reduce the number of pages that need to be examined by precise, but slower, analysis tools (such as honeyclients or antivirus programs). Although this technique is effective, it requires a substantial amount of resources. A main reason is that the crawler encounters many pages on the web that are legitimate and need to be filtered out. In this paper, to characterize the nature of this rising threat, we present an implementation of a web crawler in Python and an approach to search the web more efficiently for pages that are likely to be malicious, filtering out benign pages and passing the remaining pages to an antivirus program for malware detection. Our approach starts from an initial seed of known malicious web pages. Using these seeds, our system generates search engine queries to identify other malicious pages that are similar to the ones in the initial seed. By doing so, it leverages the crawling infrastructure of search engines to retrieve URLs that are much more likely to be malicious than a random page on the web. The results show that this guided approach identifies malicious web pages more efficiently than random crawling-based approaches.
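The guided-search loop described above can be sketched as follows (a hypothetical skeleton; the query-extraction heuristic and the stub search engine, prefilter, and scanner are invented stand-ins for the real components):

```python
from collections import Counter

def extract_query_terms(page_text, n_terms=3):
    """Turn a malicious seed page into a search-engine query built from
    its most frequent longer tokens."""
    tokens = [w.lower() for w in page_text.split() if len(w) > 3]
    return [w for w, _ in Counter(tokens).most_common(n_terms)]

def guided_crawl(seed_pages, search_engine, prefilter, scanner):
    """Seed page -> query -> candidate URLs -> cheap prefilter -> slow scan."""
    malicious = set()
    for page_text in seed_pages:
        query = " ".join(extract_query_terms(page_text))
        for url in search_engine(query):
            if prefilter(url) and scanner(url):
                malicious.add(url)
    return malicious

# illustrative stubs standing in for a real search engine and AV scanner
hits = guided_crawl(
    ["free codec pack install codec now"],
    search_engine=lambda q: ["http://bad.example/exploit",
                             "http://benign.example/news"],
    prefilter=lambda url: url.startswith("http"),
    scanner=lambda url: "bad" in url,  # stand-in for antivirus analysis
)
```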

Keywords: web crawler, malwares, seeds, drive-by-downloads, security

Procedia PDF Downloads 208
85 Use of WhatsApp Messenger for Optimal Healthcare Operational Communication during the COVID-19 Pandemic

Authors: Josiah O. Carter, Charlotte Hayden, Elizabeth Arthurs

Abstract:

Background: During the COVID-19 pandemic, hospital management policies have changed frequently and rapidly. This has created novel challenges in keeping the workforce abreast of these changes to enable them to deliver safe and effective care. Traditional communication methods, e.g. email, do not keep pace with the rapidly changing environment in the hospital, resulting in inaccurate, irrelevant, or outdated information being communicated and causing inefficiencies in patient care. Methods: The creation of a WhatsApp messaging group within the medical division at the Bristol Royal Infirmary has enabled senior clinicians and the hospital management team to update the medical workforce in real time. It has two primary functions: (1) to enable dissemination of a concise, important operational summary, comprising information on bed status and infection control procedural changes, fed directly from a daily critical incident briefing; (2) to facilitate a monthly scheduled question and answer (Q&A) session for junior doctors to clarify issues with clinical directors, rota, and management staff. Additional ad-hoc updates are sent out for time-critical information; otherwise, it mainly functions as a broadcast-only group to prevent important information from being lost amongst other communication. All junior doctors within the medical division were invited to join the group. At present, the group comprises 131 participants, of whom 10 are administrative staff (rota coordinators, management staff, and clinical directors); the remaining 121 are junior clinicians working within the medical division. An electronic survey via Microsoft Forms was sent to junior doctors via the WhatsApp group and via email to assess its utilisation and effectiveness, with the aim of quality improvement. Results: Of the 121 group participants, 19 completed the questionnaire (response rate 15.7%). Of these, 16/19 (84.2%) used it regularly, and 12/19 (63.2%) rated it as the most useful source of reliable updates relating to the hospital response to the COVID-19 pandemic, whereas only 2/19 (10.5%) found the trust intranet and the trust COVID-19 operational email update most useful. Respondents rated the WhatsApp group more useful as an information source (mean score 7.7/10) than as a means of providing feedback to management staff (mean score 6.3/10). Qualitative feedback suggested information around ward closures and changes to COVID cohorting, along with updates on staffing issues, were most useful. Respondents also noted the Q&A sessions were an efficient way of relaying feedback about management decisions but that it would be preferable if these sessions could be delivered more frequently. Discussion: During the current global COVID-19 pandemic, there is an increased need for rapid dissemination of critical information within NHS trusts; this includes communication between junior doctors, managers, and senior clinicians. The versatility of WhatsApp permits a variety of functions, allowing for regular updates, the dissemination of time-critical information, and conversation and feedback. The project has demonstrated that reserved and well-managed use of a WhatsApp group is a welcome, efficient, and practical means of communication between the senior management team and the junior medical workforce.

Keywords: communication, COVID-19, hospital management, WhatsApp

Procedia PDF Downloads 85
84 In-House Enzyme Blends from Polyporus ciliatus CBS 366.74 for Enzymatic Saccharification of Pretreated Corn Stover

Authors: Joseph A. Bentil, Anders Thygesen, Lene Lange, Moses Mensah, Anne Meyer

Abstract:

The study investigated the saccharification potential of in-house enzymes produced from a white-rot basidiomycete strain, Polyporus ciliatus CBS 366.74. The in-house enzymes were produced by growing the fungus on mono and composite substrates of cocoa pod husk (CPH) and green seaweed (GS) (Ulva lactuca sp.), with and without the addition of 25 mM ammonium nitrate, at 4% w/v substrate concentration under submerged conditions for 144 hours. The crude enzyme extract preparations (CEE 1-5 and CEE 1-5+AN) obtained from the fungal cultivation were sterile-filtered and used as enzyme sources for the enzymatic hydrolysis of hydrothermally pretreated corn stover, with a commercial enzyme cocktail, Cellic CTec3, as benchmark. The hydrolysis was conducted at 50 °C in 50 mM sodium acetate buffer, pH 5, at enzyme dosages of 5 and 10 CMCase units/g biomass and 1% w/v dry weight substrate concentration, sampled at 6, 24, and 72 hours. The enzyme activity profile of the in-house enzymes varied among the growth substrates, with the composite substrates (50-75% GS with AN inclusion) yielding better enzyme activities, especially endoglucanases (0.4-0.5 U/mL), β-glucosidases (0.1-0.2 U/mL), and xylanases (3-10 U/mL). However, nitrogen supplementation had no significant effect on the enzyme activities of crude extracts from 100% GS substrates. The enzymatic hydrolysis showed that the in-house enzymes were capable of hydrolysing the pretreated corn stover to varying degrees; however, the saccharification yield was less than 10%, and consequently the theoretical glucose yield was ten times lower than with Cellic CTec3 at both dosage levels. There was no linear correlation between glucose yield and enzyme dosage for the in-house enzymes, unlike the benchmark enzyme. It is therefore recommended that the in-house enzymes be used to complement the dosage of commercial enzymes to reduce the cost of biomass saccharification.

Keywords: enzyme production, hydrolysis yield, feedstock, enzyme blend, Polyporus ciliatus

Procedia PDF Downloads 227
83 Whole Exome Sequencing Data Analysis of Rare Diseases: Non-Coding Variants and Copy Number Variations

Authors: S. Fahiminiya, J. Nadaf, F. Rauch, L. Jerome-Majewska, J. Majewski

Abstract:

Background: Sequencing of the protein-coding regions of the human genome (Whole Exome Sequencing; WES) has demonstrated great success in the identification of causal mutations for several rare genetic disorders in humans. Generally, most WES studies have focused on rare variants in coding exons and splice sites, where substitutions alter the protein product. Although focusing on this category of variants has revealed the mystery behind many inherited genetic diseases in recent years, a subset of cases remained inconclusive. Here, we present the results of our WES studies where analyzing only rare variants in coding regions was not conclusive, but further investigation revealed the involvement of non-coding variants and copy number variations (CNVs) in the etiology of the diseases. Methods: Whole exome sequencing was performed using our standard protocols at the Genome Quebec Innovation Center, Montreal, Canada. All bioinformatics analyses were done using an in-house WES pipeline. Results: To date, we have successfully identified several disease-causing mutations within gene coding regions (e.g. SCARF2: Van den Ende-Gupta syndrome; SNAP29: 22q11.2 deletion syndrome) by using WES. In addition, we showed that variants in non-coding regions and CNVs also have important value and should not be ignored and/or filtered out during bioinformatics analysis of WES data. For instance, in patients with osteogenesis imperfecta type V and in patients with glucocorticoid deficiency, we identified variants in the 5'UTR, resulting in the production of longer or truncated non-functional proteins. Furthermore, CNVs were identified as the main cause of disease in patients with metaphyseal dysplasia with maxillary hypoplasia and brachydactyly and in patients with osteogenesis imperfecta type VII. Conclusions: Our study highlights the importance of considering non-coding variants and CNVs during interpretation of WES data, as they can be the only cause of the disease under investigation.
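The filtering point made above can be illustrated with a toy variant-prioritization step (the field names, gene names, and region labels are hypothetical, not the schema of a real pipeline):

```python
def prioritize(variants, max_af=0.01, keep_noncoding=True):
    """Rare-variant filter that can optionally retain non-coding (UTR)
    variants, which default exome pipelines often discard."""
    regions = {"coding", "splice"}
    if keep_noncoding:
        regions |= {"5UTR", "3UTR"}
    return [v for v in variants
            if v["af"] <= max_af and v["region"] in regions]

# invented example variants with allele frequency (af) and region labels
variants = [
    {"gene": "GENE_A", "af": 0.0001, "region": "coding"},
    {"gene": "GENE_B", "af": 0.0002, "region": "5UTR"},
    {"gene": "GENE_C", "af": 0.2,    "region": "coding"},  # too common
]
kept = prioritize(variants)
```

With `keep_noncoding=False`, the 5'UTR variant would be silently dropped, which is exactly the failure mode the abstract warns against.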

Keywords: whole exome sequencing data, non-coding variants, copy number variations, rare diseases

Procedia PDF Downloads 389
82 Detection of Cryptosporidium Oocysts by Acid-Fast Staining Method and PCR in Surface Water from Tehran, Iran

Authors: Mohamad Mohsen Homayouni, Niloofar Taghipour, Ahmad Reza Memar, Niloofar Khalaji, Hamed Kiani, Seyyed Javad Seyyed Tabaei

Abstract:

Background and Objective: Cryptosporidium is a coccidian protozoan parasite whose oocysts in surface water are a global health problem. Owing to the low number of parasites in water resources and the lack of a laboratory culture method, a rapid and sensitive method for detecting the organism in water resources is required. We applied modified acid-fast staining and PCR for the detection of Cryptosporidium spp. and analysed the genotypes in 55 samples collected from surface water. Methods: Over a period of nine months, 55 surface water samples were collected from five rivers in Tehran, Iran. The samples were filtered using cellulose acetate membrane filters. Initial identification of Cryptosporidium oocysts in the surface water samples was carried out by the acid-fast method. A nested PCR assay was then designed for specific amplification and genotype analysis. Results: The modified Ziehl-Neelsen method revealed 5-20 Cryptosporidium oocysts per 10 L. Five of the 55 (9.09%) surface water samples were found positive for Cryptosporidium spp. by the Ziehl-Neelsen test, and seven (12.7%) were found positive by nested PCR. The staining results were consistent with the PCR results. Seven Cryptosporidium PCR products were successfully sequenced, and five gp60 subtypes were detected. Our analysis of the gp60 gene revealed that all of the positive isolates were Cryptosporidium parvum and belonged to subtype families IIa and IId. Conclusion: Our investigation showed that the water samples were contaminated with Cryptosporidium, a potential hazard representing a significant health problem. This study provides the first report on the detection and genotyping of Cryptosporidium species from surface water samples in Iran, and its results confirmed the low clinical incidence of this parasite in the community.

Keywords: Cryptosporidium spp., membrane filtration, subtype, surface water, Iran

Procedia PDF Downloads 379
81 Evaluation of Health Risk Degree Arising from Heavy Metals Present in Drinking Water

Authors: Alma Shehu, Majlinda Vasjari, Sonila Duka, Loreta Vallja, Nevila Broli

Abstract:

Humans consume drinking water from several sources, including tap water, bottled water, natural springs, filtered tap water, etc. The quality of drinking water is crucial for human survival, given that the consumption of contaminated drinking water is linked to many diseases and deaths all over the world. This study investigated the quality and health risks, arising from heavy metals content, of different types of drinking water consumed by the population in Albania. The investigated waters included industrialized water, tap water, and spring water. In total, 20 samples were analyzed for the content of Pb, Cd, Cr, Ni, Cu, Fe, Zn, Al, and Mn. The concentration of each metal in the selected samples was determined by atomic absorption spectroscopy with electrothermal atomization (GFAAS). Water quality was evaluated by comparing the obtained metal concentrations with the recommended maximum limits according to the European Directive (98/83/EC) and the Guidelines for Drinking-Water Quality (WHO, 2017). The Metal Index (MI) was used to assess the overall water quality due to heavy metals content. Health risk assessment was conducted based on USEPA (1996) recommendations for human health risk assessment via ingestion. The results showed that Al, Ni, Fe, and Cu were the metals found in the highest concentrations, while Cd exhibited the lowest. Among the analyzed metals, Al (in one sample) and Ni (in five samples) exceeded the maximum allowed limit. Based on the metal pollution index, the overall quality of Glina bottled water can be considered toxic to humans, while the quality of Trebeshina bottled water was classified as moderately toxic. Values of the health risk quotient (HQ) varied between 1×10⁻⁶ and 1.3×10⁻¹, following the order Ni > Cd > Pb > Cu > Al > Fe > Zn > Mn. All values were lower than 1, which suggests that the analyzed samples pose no health risk to humans.
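The indices used above can be sketched as follows (a minimal sketch assuming the standard ingestion-route formulas, default intake of 2 L/day, and body weight of 70 kg; the concentrations, reference dose, and maximum limits below are illustrative placeholders, not the study's measurements):

```python
def chronic_daily_intake(c_mg_per_l, ir_l_per_day=2.0, bw_kg=70.0):
    """CDI (mg/kg/day) for drinking-water ingestion: CDI = C * IR / BW."""
    return c_mg_per_l * ir_l_per_day / bw_kg

def hazard_quotient(c_mg_per_l, rfd_mg_per_kg_day):
    """HQ = CDI / RfD; HQ < 1 suggests no appreciable non-carcinogenic risk."""
    return chronic_daily_intake(c_mg_per_l) / rfd_mg_per_kg_day

def metal_index(conc_mg_per_l, mac_mg_per_l):
    """MI = sum of C_i / MAC_i over the analyzed metals."""
    return sum(conc_mg_per_l[m] / mac_mg_per_l[m] for m in conc_mg_per_l)

# illustrative numbers only (assumed RfD and MAC values, not the study's)
hq_ni = hazard_quotient(0.035, rfd_mg_per_kg_day=0.02)
mi = metal_index({"Ni": 0.03, "Al": 0.1}, {"Ni": 0.02, "Al": 0.2})
```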

Keywords: drinking water, health risk assessment, heavy metals, pollution index

Procedia PDF Downloads 105
80 The Use of TRIZ to Map the Evolutive Pattern of Products

Authors: Fernando C. Labouriau, Ricardo M. Naveiro

Abstract:

This paper presents a model for mapping the evolutive pattern of products in order to generate new ideas, perceive emerging technologies, and manage product portfolios in new product development (NPD). According to the proposed model, information extracted from the patent system is filtered and analyzed with TRIZ tools to produce the input information for the NPD process. The authors acknowledge that the NPD process is well integrated within enterprises' strategic business planning and that new products are vital in today's competitive market. On the other hand, the proactive use of patent information has been observed in some methodologies for selecting projects, mapping technological change, and generating product concepts. One of these methodologies is TRIZ, a theory created to favor innovation and improve product design, which provided the analytical framework for the model. Initially, an introduction to TRIZ is presented, focused mainly on the patterns of evolution of technical systems and their strategic uses; the description is brief and deliberately non-comprehensive, as the theory has several other tools widely employed in technical and business applications. Then, the model for mapping the evolutive pattern of products is introduced with its three basic pillars, namely patent information, TRIZ, and NPD, along with the methodology for implementation. Following this, a case study of a Brazilian bicycle manufacturer is presented, in which the evolutive pattern of a product is mapped by decomposing and analyzing one of its assemblies along ten evolution lines in order to envision opportunities for further product development. Some of these lines are illustrated in more detail to evaluate the features of the product in relation to the TRIZ concepts, comparing them with patents in the state of the art to validate the product's evolutionary potential. As a result, the case study provided several opportunities for a product improvement development program in different project categories, identifying technical and business impacts as well as indicating the lines of evolution that can most benefit from each opportunity.

Keywords: product development, patents, product strategy, systems evolution

Procedia PDF Downloads 470