Search results for: transition probability density

3950 Mathematical Toolbox for Editing Equations and Geometrical Diagrams and Graphs

Authors: Ayola D. N. Jayamaha, Gihan V. Dias, Surangika Ranathunga

Abstract:

Currently, there are many educational tools designed for mathematics. Open-source software such as GeoGebra and Octave is bulky in its architectural structure, and MATLAB provides far more functionality than is required. Many computer-aided online grading and assessment tools need editors to be integrated into their software; however, no existing editor caters to all of their needs for editing equations, geometrical diagrams and graphs. Existing equation editors include Alfred’s Equation Editor, Codecogs, DragMath, Maple, MathDox, MathJax, MathMagic, MathFlow, Math-o-mir, Microsoft Equation Editor, MiraiMath, OpenOffice, WIRIS Editor and MyScript. Some are commercial and some open source, and they variously support handwriting recognition, mobile apps, MathML/LaTeX rendering, and Flash-, web- or JavaScript-based display engines. Diagram editors include GeoKone.NET, Tabulae, Cinderella 1.4, MyScript, Dia, Draw2D touch, Gliffy, GeoGebra, Flowchart, Jgraph, JointJS, J painter Online diagram editor and 2D sketcher. All of these except MyScript are open source and can be used for editing mathematical diagrams. However, none of them fully caters to the needs of a typical computer-aided assessment tool or educational platform for mathematics. The proposed solution is a web-based, lightweight editor that is easy to implement and integrate, built on an HTML5 canvas that renders on all modern web browsers. The scope of the project is an editor that covers the equations, mathematical diagrams and drawings appearing in the O/L Mathematics examination papers in Sri Lanka. Using the tool, students can enter any equation into the system, which can be part of an online remote learning platform. Users can also create and edit geometrical drawings and graphs, and carry out geometrical constructions that require only a compass and ruler, from the editing interface provided by the software. The special feature of this software is geometrical construction: it allows users to create constructions such as angle bisectors, perpendicular lines, angles of 60° and perpendicular bisectors. The tool imitates the functioning of a ruler and compass to create the required construction, so users can complete geometrical drawings on the computer and obtain the drawing in a digital format for further processing. Secondly, users can create, edit, colour and label Venn diagrams. In addition, students can draw probability tree diagrams and compound probability outcome grids, and label and mark regions within the grids. Thirdly, students can draw first- and second-order graphs: they mark points on a graph paper and the system connects the points to draw the graph. Students can also draw standard shapes such as circles and rectangles by selecting points on a grid or by entering parametric values.
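
To illustrate the kind of ruler-and-compass construction such an editor automates, the following minimal Python sketch (illustrative only, not the tool's implementation) computes a perpendicular bisector of a segment from its two endpoints:

```python
import numpy as np

def perpendicular_bisector(p1, p2):
    """Return a point on the perpendicular bisector of segment p1-p2
    and its unit direction vector (the classic compass-and-ruler construction)."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    midpoint = (p1 + p2) / 2.0            # where the chord of the two compass arcs crosses the segment
    d = p2 - p1
    direction = np.array([-d[1], d[0]])   # rotate the segment direction by 90 degrees
    direction /= np.linalg.norm(direction)
    return midpoint, direction

mid, direction = perpendicular_bisector((0, 0), (4, 2))
print(mid, direction)   # [2. 1.] and a unit vector perpendicular to (4, 2)
```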

Keywords: geometrical drawings, html5 canvas, mathematical equations, toolbox

Procedia PDF Downloads 376
3949 Identifying and Quantifying Factors Affecting Traffic Crash Severity under Heterogeneous Traffic Flow

Authors: Praveen Vayalamkuzhi, Veeraragavan Amirthalingam

Abstract:

Studies on highway safety are the need of the hour, as over 400 lives are lost every day in India due to road crashes. In order to evaluate the factors that lead to different levels of crash severity, it is necessary to investigate the level of safety of highways and their relation to crashes. In the present study, an attempt is made to identify the factors that contribute to road crashes and to quantify their effect on the severity of road crashes. The study was carried out on a four-lane divided rural highway in India. The variables considered in the analysis include components of the horizontal alignment of the highway (straight or curved section), time of day, driveway density, presence of a median, median openings, gradient, operating speed, and annual average daily traffic. These variables were selected after a preliminary analysis. The major complexities in the study are the heterogeneous traffic and the speed variation between different classes of vehicles along the highway. To quantify the impact of each of these factors, statistical analyses were carried out using a logit model and negative binomial regression. The output from the statistical models showed that the horizontal alignment components, driveway density, time of day, operating speed, and annual average daily traffic have a significant relationship with the severity of crashes, i.e., both fatal and injury crashes. Further, annual average daily traffic has a more significant effect on severity than the other variables, and the contribution of the highway's horizontal alignment components to crash severity is also significant. The logit models predicted crashes better than the negative binomial regression models. The results of the study will help transport planners to consider these aspects at the planning stage for highways operated under heterogeneous traffic flow conditions.
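
The two model families named above can be fitted with standard tools; the following minimal Python sketch shows the idea using statsmodels, with hypothetical column names rather than the study's dataset:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical crash records: column names are illustrative assumptions.
crashes = pd.read_csv("crash_records.csv")  # severity (1 = fatal/injury), aadt, op_speed, ...

# Binary logit for crash severity.
logit = smf.logit(
    "severity ~ curve_section + driveway_density + night_time + op_speed + aadt",
    data=crashes,
).fit()
print(logit.summary())

# Negative binomial regression for crash counts per highway segment.
negbin = smf.glm(
    "crash_count ~ curve_section + driveway_density + night_time + op_speed + aadt",
    data=crashes,
    family=sm.families.NegativeBinomial(),
).fit()
print(negbin.summary())
```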

Keywords: geometric design, heterogeneous traffic, road crash, statistical analysis, level of safety

Procedia PDF Downloads 302
3948 3D-Mesh Robust Watermarking Technique for Ownership Protection and Authentication

Authors: Farhan A. Alenizi

Abstract:

Digital watermarking has evolved in recent years as an important means for data authentication and ownership protection. Image and video watermarking is well established in the field of multimedia processing; however, watermarking techniques for 3D objects have emerged as an important means for the same purposes, as 3D mesh models are in increasing use in scientific, industrial, and medical applications. Like image watermarking techniques, 3D watermarking can take place in either the spatial or the transform domain. Unlike images and video, where the frames have regular structures in both the spatial and temporal domains, 3D objects are represented as meshes that are basically irregular samplings of surfaces; moreover, meshes can undergo a large variety of alterations that may be hard to handle. This makes the watermarking process more challenging. While transform-domain watermarking is preferable for images and videos, it is still difficult to implement for 3D meshes because of the huge number of vertices involved and the complicated topology and geometry, which make spectral decomposition difficult, even though significant work has been done in this area. Spatial-domain watermarking has attracted significant attention in recent years; it can act either on the topology or on the geometry of the model. Exploiting the statistical characteristics of 3D mesh models, from both geometrical and topological aspects, has proved useful for hiding data, and doing so with minimal surface distortion to the mesh has attracted significant research. A blind 3D mesh watermarking technique is proposed in this research. The watermarking method modifies the vertices' positions with respect to the center of the object. An optimal method is developed to reduce the errors, minimizing the distortions that the 3D object may experience due to the watermarking process and reducing the computational complexity due to iterations and other factors. The technique relies on displacing vertex locations by modifying the variances of the vertex norms. Statistical analyses were performed to establish the distributions that best fit each mesh and hence to set the bin sizes. Several optimization approaches were introduced concerning mesh local roughness, the statistical distributions of the norms, and the displacements of the mesh centers. To evaluate the algorithm's robustness against common geometry and connectivity attacks, the watermarked objects were subjected to uniform noise, Laplacian smoothing, vertex quantization, simplification, and cropping. Experimental results showed that the approach is robust in terms of both perceptual and quantitative quality, and against both geometry and connectivity attacks. Moreover, the probability of true-positive detection versus the probability of false-positive detection was evaluated; to validate the accuracy of the test cases, receiver operating characteristic (ROC) curves were drawn and confirmed the robustness from this aspect. 3D watermarking is still a new field, but a promising one.
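
A minimal sketch of the kind of statistics such a scheme operates on (vertex norms about the object centre, grouped into bins with their variances), assuming nothing about the authors' actual embedding rule:

```python
import numpy as np

def vertex_norm_bins(vertices, n_bins=64):
    """Distances of mesh vertices from the object's centre, grouped into bins.
    Schemes of the type described embed bits by nudging the variance of the
    norms inside each bin; this sketch only prepares those statistics."""
    v = np.asarray(vertices, dtype=float)
    center = v.mean(axis=0)
    norms = np.linalg.norm(v - center, axis=1)
    edges = np.linspace(norms.min(), norms.max(), n_bins + 1)
    bin_ids = np.clip(np.digitize(norms, edges) - 1, 0, n_bins - 1)
    variances = np.array([norms[bin_ids == b].var() if np.any(bin_ids == b) else 0.0
                          for b in range(n_bins)])
    return norms, bin_ids, variances

# Toy point cloud standing in for mesh vertices.
rng = np.random.default_rng(0)
norms, bin_ids, variances = vertex_norm_bins(rng.normal(size=(1000, 3)), n_bins=16)
print(variances.round(4))
```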

Keywords: watermarking, mesh objects, local roughness, Laplacian smoothing

Procedia PDF Downloads 160
3947 Uneven Development: Structural Changes and Income Outcomes across States in Malaysia

Authors: Siti Aiysyah Tumin

Abstract:

This paper looks at the nature of structural change (the transition of employment from agriculture to manufacturing and then to different types of services) in different states in Malaysia and links it to income outcomes for households and workers. Specifically, this paper investigates the conditional association between the concentration of different economic activities and income outcomes (household incomes and employee wages) over almost four decades. Using publicly available state-level employment and income data, we find that a significant wage premium was associated with "modern" services (finance, real estate, professional, information and communication), which are urban-based services sectors that employ a larger proportion of skilled and educated workers. However, employment in manufacturing and other services subsectors was significantly associated with lower income dispersion and inequality, pointing to their importance for welfare improvements.

Keywords: employment, labor market, structural change, wage

Procedia PDF Downloads 169
3946 Series Connected GaN Resonant Tunneling Diodes for Multiple-Valued Logic

Authors: Fang Liu, JunShuai Xue, JiaJia Yao, XueYan Yang, ZuMao Li, GuanLin Wu, HePeng Zhang, ZhiPeng Sun

Abstract:

III-nitride resonant tunneling diodes (RTDs) are among the most promising candidates for multiple-valued logic (MVL) elements. Here, we report the monolithic integration of GaN resonant tunneling diodes to realize multiple negative differential resistance (NDR) regions for MVL applications. The GaN RTDs, composed of a 2 nm quantum well embedded between two 1 nm quantum barriers, are grown by plasma-assisted molecular beam epitaxy on free-standing c-plane GaN substrates. A negative differential resistance characteristic with a peak current density of 178 kA/cm², in conjunction with a peak-to-valley current ratio (PVCR) of 2.07, is observed. The statistical properties exhibit high consistency, with a peak current density standard deviation of almost 1%, laying the foundation for monolithic integration. After complete electrical isolation, two diodes designed with the same area are connected in series. By solving the Poisson and Schrodinger equations in one dimension, the energy band structure is calculated to explain the transport mechanism behind the negative differential resistance phenomenon. Resonant tunneling events in a sequence of the series-connected RTD pair (SCRTD) form multiple NDR regions with nearly equal peak currents, yielding three stable operating states corresponding to ternary logic. A frequency multiplier circuit built on this integration is demonstrated, attesting to the robustness of the multiple-peak feature. This article presents a monolithic integration of SCRTDs with multiple NDR regions driven by the resonant tunneling mechanism, which can be applied in the multiple-valued logic field, promising fast operation and a large reduction in circuit complexity, and demonstrating a new route for nitride devices to break through the limitations of binary logic.
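
For reference, the PVCR quoted above is simply the ratio of the peak current to the following valley current in the I-V sweep; a minimal Python sketch of extracting it from measured data (using a synthetic curve, not the reported device) is:

```python
import numpy as np

def pvcr(voltage, current):
    """Peak-to-valley current ratio of the first negative-differential-resistance
    region in an I-V sweep: first local maximum / following local minimum."""
    i = np.asarray(current, dtype=float)
    rising = np.diff(i) > 0
    # A peak is a point where the current was rising and then starts to fall.
    peak_candidates = np.where(rising[:-1] & ~rising[1:])[0] + 1
    if peak_candidates.size == 0:
        return None
    p = peak_candidates[0]
    # The valley is the lowest current before the curve recovers above the peak (or the sweep ends).
    recover = np.where(i[p + 1:] > i[p])[0]
    end = p + 1 + recover[0] if recover.size else i.size
    v_idx = p + 1 + np.argmin(i[p + 1:end])
    return i[p] / i[v_idx]

# Synthetic I-V curve with one NDR region (peak ~2, valley ~1).
V = np.linspace(0, 3, 301)
I = np.where(V < 1, 2 * V, np.where(V < 2, 2 - (V - 1), 1 + 3 * (V - 2)))
print(round(pvcr(V, I), 2))  # -> 2.0
```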

Keywords: GaN resonant tunneling diode, multiple-valued logic system, frequency multiplier, negative differential resistance, peak-to-valley current ratio

Procedia PDF Downloads 81
3945 An Algorithm for Determining the Arrival Behavior of a Secondary User to a Base Station in Cognitive Radio Networks

Authors: Danilo López, Edwin Rivas, Leyla López

Abstract:

This paper presents the development of an algorithm that predicts the arrival of a secondary user (SU) to a base station (BS) in a cognitive network based on infrastructure, requesting a Best Effort (BE) or Real Time (RT) type of service with a determined bandwidth (BW) implementing neural networks. The algorithm dynamically uses a neural network construction technique using the geometric pyramid topology and trains a Multilayer Perceptron Neural Networks (MLPNN) based on the historical arrival of an SU to estimate future applications. This will allow efficiently managing the information in the BS, since it precedes the arrival of the SUs in the stage of selection of the best channel in CRN. As a result, the software application determines the probability of arrival at a future time point and calculates the performance metrics to measure the effectiveness of the predictions made.
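
A minimal sketch of this kind of arrival predictor, an MLP with pyramid-shaped hidden layers trained on a sliding window of past arrivals (the data and layer sizes are illustrative assumptions, not the authors' design):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical history of SU inter-arrival times (minutes); not the authors' data.
rng = np.random.default_rng(1)
arrivals = rng.exponential(scale=5.0, size=500)

# Sliding window: use the last 8 inter-arrival times to predict the next one.
window = 8
X = np.array([arrivals[k:k + window] for k in range(len(arrivals) - window)])
y = arrivals[window:]

# "Geometric pyramid" rule of thumb: hidden layer sizes shrink geometrically from input to output.
mlp = MLPRegressor(hidden_layer_sizes=(8, 4, 2), max_iter=2000, random_state=0)
mlp.fit(X[:-50], y[:-50])
print("predicted next inter-arrival:", mlp.predict(X[-1:]).round(2))
```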

Keywords: cognitive radio, base station, best effort, MLPNN, prediction, real time

Procedia PDF Downloads 330
3944 Application of Seismic Refraction Method in Geotechnical Study

Authors: Abdalla Mohamed M. Musbahi

Abstract:

The study area lies in the Al-Falah area, Airport-Tripoli, in Zone (16), where the establishment of a multi-storey residential and commercial complex is planned; this part was divided into seven subzones. In each subzone, orthogonal profiles were collected using the seismic refraction method. The overall aim of this project is to investigate the applicability of the seismic refraction method, a commonly used traditional geophysical technique for determining depth to bedrock, the competence of the bedrock, the depth to the water table, or the depth to other seismic velocity boundaries. The purpose of the work is to make engineers and decision makers recognize the importance of planning and executing a pre-investigation programme that includes geophysics and, in particular, the seismic refraction method. This aim is achieved by evaluating the seismic refraction method at different scales, determining the depth and velocity of the base layer (bedrock), and calculating the elastic properties of each layer in the region from the seismic refraction results. The orthogonal profiles were carried out in every subzone of Zone (16). In the seismic refraction layout, the geophones are placed along a straight line with 5 m spacing, and three shot points (at the beginning, middle and end of the layout) are used in order to generate the P and S waves. The first and last shot points are placed about 5 m from the geophones, and the middle shot point is placed between the 12th and 13th geophones. From the time-distance curves, the P- and S-wave velocities were calculated and the thicknesses of up to three layers were estimated. Any change in the physical properties of the medium (shear modulus, bulk modulus, density) changes the velocity of the waves passing through it; the velocity of waves travelling in rocks is therefore closely related to the density (ρ), bulk modulus (κ) and shear modulus (μ) of the medium, and these parameters can be estimated once the primary and secondary velocities (P-wave and S-wave) are known.
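
The standard relations that turn the measured P- and S-wave velocities and an assumed density into elastic moduli can be sketched as follows (illustrative values only, not the survey's results):

```python
def elastic_moduli(vp, vs, rho):
    """Dynamic elastic moduli from P- and S-wave velocities (m/s) and density (kg/m^3).
    Standard relations: mu = rho*Vs^2, K = rho*(Vp^2 - 4/3*Vs^2),
    Poisson's ratio nu = (Vp^2 - 2Vs^2) / (2*(Vp^2 - Vs^2)), E = 2*mu*(1 + nu)."""
    mu = rho * vs**2                          # shear modulus (Pa)
    k = rho * (vp**2 - (4.0 / 3.0) * vs**2)   # bulk modulus (Pa)
    nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))
    e = 2 * mu * (1 + nu)                     # Young's modulus (Pa)
    return mu, k, nu, e

# Illustrative values: Vp = 1800 m/s, Vs = 900 m/s, rho = 2000 kg/m^3.
mu, k, nu, e = elastic_moduli(1800.0, 900.0, 2000.0)
print(f"mu = {mu/1e9:.2f} GPa, K = {k/1e9:.2f} GPa, nu = {nu:.2f}, E = {e/1e9:.2f} GPa")
```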

Keywords: application of seismic, geotechnical study, physical properties, seismic refraction

Procedia PDF Downloads 491
3943 A Novel Probabilistic Spatial Locality of Reference Technique for Automatic Cleansing of Digital Maps

Authors: A. Abdullah, S. Abushalmat, A. Bakshwain, A. Basuhail, A. Aslam

Abstract:

GIS (Geographic Information System) applications require geo-referenced data; these data may be available as databases or in the form of digital or hard-copy agro-meteorological maps. These parameter maps are color-coded, with different regions corresponding to different parameter values, and converting such maps into a database is not very difficult. However, text and various planimetric elements overlaid on the maps make an accurate image-to-database conversion a challenging problem. The reason is that it is almost impossible to exactly restore what was underneath the text or icons, which points to the need for inpainting. In this paper, we propose a probabilistic inpainting approach that uses the probability of spatial locality of colors in the map to replace overlaid elements with the underlying color. We tested the limits of the proposed technique using non-textual simulated data and compared the text removal results with those of a popular image editing tool using public domain data, with promising results.
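
A minimal sketch of the spatial-locality idea (replace each masked pixel with the most probable, i.e. most frequent, colour in its neighbourhood); this is an illustration, not the authors' estimator:

```python
import numpy as np

def inpaint_modal_color(image, mask, radius=3):
    """Replace masked pixels with the most frequent colour among nearby
    unmasked pixels - a simple stand-in for spatial locality of reference."""
    out = image.copy()
    h, w = mask.shape
    ys, xs = np.where(mask)
    for y, x in zip(ys, xs):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        patch = image[y0:y1, x0:x1].reshape(-1, image.shape[2])
        valid = ~mask[y0:y1, x0:x1].reshape(-1)
        if valid.any():
            colors, counts = np.unique(patch[valid], axis=0, return_counts=True)
            out[y, x] = colors[np.argmax(counts)]
    return out

# Tiny synthetic map: two colour regions with one overlaid "text" pixel to remove.
img = np.zeros((6, 6, 3), dtype=np.uint8)
img[:, :3] = [255, 0, 0]
img[:, 3:] = [0, 255, 0]
img[2, 1] = [0, 0, 0]                  # overlaid element
mask = np.zeros((6, 6), dtype=bool)
mask[2, 1] = True
print(inpaint_modal_color(img, mask)[2, 1])   # -> [255   0   0]
```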

Keywords: noise, image, GIS, digital map, inpainting

Procedia PDF Downloads 352
3942 Analysis and Rule Extraction of Coronary Artery Disease Data Using Data Mining

Authors: Rezaei Hachesu Peyman, Oliyaee Azadeh, Salahzadeh Zahra, Alizadeh Somayyeh, Safaei Naser

Abstract:

Coronary artery disease (CAD) is a major cause of disability in adults and a main cause of death in developed countries. In this study, data mining techniques including decision trees, artificial neural networks (ANNs), and support vector machines (SVMs) are used to analyze CAD data. Data from 4948 patients who had suffered from heart disease were included in the analysis. CAD is the target variable, and 24 input or predictor variables are used for the classification. The performance of these techniques is compared in terms of sensitivity, specificity, and accuracy. The most significant factor influencing CAD is chest pain. Elderly males (age > 53) have a high probability of being diagnosed with CAD. The SVM algorithm is the most useful technique for evaluating and predicting CAD patients as compared with non-CAD ones. The application of data mining techniques to the analysis of coronary artery disease is a good method for investigating the existing relationships between variables.
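
A minimal Python sketch of the comparison described (three classifiers scored by sensitivity, specificity and accuracy), run on synthetic data standing in for the 24-predictor clinical records:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

# Synthetic stand-in for the CAD dataset (the real data are clinical records).
X, y = make_classification(n_samples=4948, n_features=24, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("decision tree", DecisionTreeClassifier(random_state=0)),
                    ("ANN", MLPClassifier(max_iter=1000, random_state=0)),
                    ("SVM", SVC(kernel="rbf"))]:
    tn, fp, fn, tp = confusion_matrix(y_te, model.fit(X_tr, y_tr).predict(X_te)).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    print(f"{name}: sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, accuracy={accuracy:.2f}")
```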

Keywords: classification, coronary artery disease, data mining, knowledge discovery, rule extraction

Procedia PDF Downloads 657
3941 Solid Polymer Electrolyte Membranes Based on Siloxane Matrix

Authors: Natia Jalagonia, Tinatin Kuchukhidze

Abstract:

Polymer electrolytes (PEs) play an important part in electrochemical devices such as batteries and fuel cells. To achieve optimal performance, the PE must maintain high ionic conductivity and mechanical stability at both high and low relative humidity. The polymer electrolyte also needs excellent long-term chemical stability and robustness. According to the prevailing theory, ionic conduction in polymer electrolytes is facilitated by the large-scale segmental motion of the polymer backbone and primarily occurs in the amorphous regions of the polymer electrolyte. Crystallinity restricts segmental motion of the polymer backbone and significantly reduces conductivity. Consequently, polymer electrolytes with high conductivity at room temperature have been sought among polymers that have highly flexible backbones and a largely amorphous morphology. Interest in polymer electrolytes has also been increased by the potential applications of solid polymer electrolytes in high-energy-density solid-state batteries, gas sensors and electrochromic windows. A conductivity of 10⁻³ S/cm is commonly regarded as the necessary minimum for practical applications in batteries. At present, poly(ethylene oxide) (PEO)-based systems are the most thoroughly investigated, reaching room-temperature conductivities of 10⁻⁷ S/cm in some cross-linked salt-in-polymer systems based on amorphous PEO-polypropylene oxide copolymers. It is widely accepted that amorphous polymers with low glass transition temperatures Tg and high segmental mobility are important prerequisites for high ionic conductivity. Another necessary condition for high ionic conductivity is high salt solubility in the polymer, which is most often achieved by donors such as ether oxygen or imide groups on the main chain or on the side groups of the PE. It is also well established that lithium-ion coordination takes place predominantly in the amorphous domain and that the segmental mobility of the polymer is an important factor in determining the ionic mobility. Great attention has been paid to PEO-based amorphous electrolytes obtained by synthesizing comb-like polymers, attaching short ethylene oxide unit sequences to an existing amorphous polymer backbone. The aim of the present work is to obtain solid polymer electrolyte membranes using PMHS as a matrix. For this purpose, the hydrosilylation reactions of α,ω-bis(trimethylsiloxy)methylhydrosiloxane with allyl triethylene glycol monomethyl ether and vinyltriethoxysilane, at a 1:28:7 ratio of the initial compounds, have been studied in the presence of Karstedt's catalyst, platinum hydrochloric acid (0.1 M solution in THF) and platinum-on-carbon catalyst, in 50% solution in anhydrous toluene. The synthesized oligomers are vitreous liquid products that are well soluble in organic solvents, with specific viscosity ηsp ≈ 0.05-0.06. The synthesized oligomers were analysed by FTIR and 1H, 13C and 29Si NMR spectroscopy. The polysiloxanes were also investigated by wide-angle X-ray, gel-permeation chromatography, and DSC analyses. Via sol-gel processes, solid polymer electrolyte membranes have been obtained from the polymer systems doped with lithium trifluoromethylsulfonate (triflate) or lithium bis(trifluoromethylsulfonyl)imide. The dependence of ionic conductivity on temperature and salt concentration was investigated, and the activation energies of conductivity for all obtained compounds were calculated.

Keywords: synthesis, PMHS, membrane, electrolyte

Procedia PDF Downloads 257
3939 Green Synthesis (Using Environment-Friendly Bacteria) of Silver Nanoparticles and Their Application as Drug Delivery Agents

Authors: Sutapa Mondal Roy, Suban K. Sahoo

Abstract:

The primary aim of this work is to synthesize silver nanoparticles (AgNPs) through environmentally benign routes so as to avoid the undesired side effects associated with chemical toxicity. The nanoparticles were stabilized with the drug ciprofloxacin (Cp) and studied for their effectiveness as a drug delivery agent. Targeted drug delivery improves the therapeutic potential of drugs at the diseased site and lowers the overall dose and undesired side effects. The small size of nanoparticles greatly facilitates the transport of active agents (drugs) across biological membranes and allows them to pass through the smallest capillaries in the body, which are 5-6 μm in diameter, while minimizing possible undesired side effects. AgNPs are non-toxic, inert and stable, have a high binding capacity, and can thus be considered biomaterials. The AgNPs were synthesized from the nutrient broth supernatant after culturing the environment-friendly bacterium Bacillus subtilis, and were found to show a surface plasmon resonance (SPR) band at 425 nm. The formation of the Cp-capped Ag nanoparticles was complete within 30 minutes, as confirmed by absorbance spectroscopy. The physico-chemical nature of the AgNPs-Cp system was characterized by dynamic light scattering (DLS), transmission electron microscopy (TEM), etc., and its size was found to be in the range of 30-40 nm. To monitor the kinetics of drug release from the nanoparticle surface, the release of Cp was followed by careful dialysis, keeping the AgNPs-Cp system inside the dialysis bag at pH 7.4 over time; drug release was almost complete after 30 h. To understand the AgNPs-Cp system better during the drug delivery process, a thorough theoretical investigation was performed employing density functional theory. Electronic charge transfer, electron density and binding energy, as well as thermodynamic properties such as enthalpy, entropy and Gibbs free energy, have been predicted. The electronic and thermodynamic properties governed by the AgNPs-Cp interactions indicate that the formation of the AgNPs-Cp system is exothermic, i.e., a thermodynamically favorable process. The binding energy and charge transfer analysis imply optimum stability of the AgNPs-Cp system. Thus, the synthesized Cp-Ag nanoparticles can be used effectively for biological purposes owing to their environmentally benign synthesis route, which is clean, biocompatible, non-toxic, safe, cost-effective, sustainable and eco-friendly. As biomaterials, the Cp-AgNPs can be used successfully for drug delivery because of the slow release of the drug from the nanoparticles over a considerable period of time. The release kinetics show that this drug-nanoparticle assembly can be used effectively as a potential tool for therapeutic applications. The ease of the synthetic procedure, the lack of chemical toxicity and the biological activity, together with the excellent performance as a drug delivery agent, open up new vistas for using nanoparticles as effective and successful drug delivery agents.

Keywords: silver nanoparticles, ciprofloxacin, density functional theory, drug delivery

Procedia PDF Downloads 384
3939 Analysis of the Interests, Conflicts and Power Resources in the Urban Development in the Megacity of Sao Paulo

Authors: A. G. Back

Abstract:

Urban planning is a relevant tool to address, in a systemic way, several sectoral policies capable of linking the urban agenda with the reduction of socio-environmental risks. Sao Paulo's master plan (2014) presents innovations capable of promoting the transition to sustainability in urban space, in view of its regulatory instruments related to i) the promotion of density along mass transport axes, involving a mixture of commercial, residential, services, and leisure uses (principles related to the compact city); ii) vulnerability reduction based on housing policies, including regular sources of funds for social housing and land reserves in urbanized areas; and iii) the reservation of green areas in the city to create parks, together with environmental regulations for new buildings focused on reducing heat island effects and improving urban drainage. However, its long-term implementation involves distributive conflicts and can undergo changes in different political, economic, and social contexts over time. Thus, the main objective of this paper is to identify and analyze the dynamics of conflicts of interest between social groups in the implementation of Sao Paulo's urban development policy, particularly in relation to recent attempts at a (re)interpretation of the master plan guidelines in view of the proposals to revise the urban zoning law. In this sense, we seek to identify the demands and narratives of urban actors, including the real estate market, middle-class neighborhood associations ('not in my backyard' movements), and social housing rights movements, and we seek to analyze the power resources that these actors mobilize to influence the decision-making process, involving five categories: social capital, political access, discursive resources, media, and juridical resources. The major findings of this research suggest that the interests and demands of the real estate market do not always prevail in urban regulation. After all, other actors also press for urban law to be defined according to interests opposite to those of the real estate market. This is the case of middle-class neighborhood associations, which work to protect the characteristics of their localities and act, in general, to prevent building and population densification in well-located neighborhoods near the center of Sao Paulo. One of the main demands of these 'not in my backyard' movements is the delimitation of exclusively residential areas in the central region of the city, which is contrary not only to the interests of the real estate market but also to the principles of the compact city. On the other hand, social housing rights movements have also made progress in delimiting special areas of social interest in well-located and valued areas of the city dedicated to building social housing, likewise contrary to the interests of the real estate market. Urban development that follows the principles of the compact city must take into account the insertion of low-income populations in well-located regions; otherwise, such a development model may continue to push the less favored to the peripheries, towards preservation areas and/or risk areas.

Keywords: interest groups, Sao Paulo, sustainable urban development, urban policies implementation

Procedia PDF Downloads 110
3938 Developing Offshore Energy Grids in Norway as Capability Platforms

Authors: Vidar Hepsø

Abstract:

The energy and oil companies on the Norwegian Continental Shelf come from a situation where each asset controls and manages its own energy supply (island mode) and are moving towards a situation where assets need to collaborate and coordinate energy use with others, sharing the energy that is provided, due to the increased cost and scarcity of electric energy. Currently, several areas are electrified either with an onshore grid cable or by receiving intermittent energy from offshore wind parks. While the onshore grid in Norway is well regulated, the offshore grid is still in the making, with several oil and gas electrification projects and offshore wind developments only just started. The paper describes the shift in mindset that comes with operating this new offshore grid. This transition heralds an increase in collaboration across boundaries and the integration of energy management across companies, businesses, technical disciplines, and engagement with stakeholders in the larger society. The transition is described as a function of the new challenges arising from the increased complexity of the energy mix (wind, oil/gas, hydrogen and others) coupled with increased technical and organizational complexity in energy management. Organizational complexity denotes increasing integration across boundaries, whether these boundaries are companies, vendors, professional disciplines, regulatory regimes/bodies, businesses, or the numerous societal stakeholders. New practices must be developed, legitimated and institutionalized across these boundaries. Only part of this complexity can be mitigated technically, e.g., by the use of batteries, mixed energy systems and simulation/forecasting tools; many challenges must be mitigated with legitimate societal and institutionalized governance practices at many levels. Offshore electrification supports Norway's 2030 climate targets but is also controversial, since it draws on the larger society's energy resources. This means that new systems and practices must be transparent, not only for the industry and the authorities, but also acceptable and just for the larger society. The paper reports on ongoing work in Norway, based on participant observation and interviews in projects and with people working on offshore grid development. One case presented is the development of an offshore floating wind farm connected to two offshore installations; the second case is an offshore grid development initiative providing six installations with electric energy via an onshore cable. The development of the offshore grid is analyzed using a capability platform framework that describes the technical, competence, work process and governance capabilities under development in Norway. A capability platform is a 'stack' with the following layers: intelligent infrastructure; information and collaboration; knowledge sharing and analytics; and, finally, business operations. The need for better collaboration and energy forecasting tools/capabilities in this stack is given special attention in the two use cases presented.

Keywords: capability platform, electrification, carbon footprint, control rooms, energy forecasting, operational model

Procedia PDF Downloads 67
3937 3D Microscopy, Image Processing, and Analysis of Lymphangiogenesis in Biological Models

Authors: Thomas Louis, Irina Primac, Florent Morfoisse, Tania Durre, Silvia Blacher, Agnes Noel

Abstract:

In vitro and in vivo lymphangiogenesis assays are essential for the identification of potential lymphangiogenic agents and the screening of pharmacological inhibitors. In the present study, we analyse three biological models: in vitro lymphatic endothelial cell spheroids, the in vivo ear sponge assay, and in vivo lymph node colonisation by tumour cells. These assays provide suitable 3D models to test pro- and anti-lymphangiogenic factors or drugs. 3D images were acquired by confocal laser scanning and light sheet fluorescence microscopy. Virtual scan microscopy followed by 3D reconstruction with image alignment methods was also used to obtain 3D images of whole large sponge and ganglion samples. 3D reconstruction, image segmentation, skeletonisation, and other image processing algorithms are described. Fixed and time-lapse imaging techniques are used to analyse the behaviour of lymphatic endothelial cell spheroids. The study of the spatial distribution of cells in spheroid models enables interactions between cells to be detected and invasion hierarchy and guidance patterns to be identified. Global measurements such as the volume, length, and density of lymphatic vessels are made in both in vivo models. Branching density and tortuosity evaluations are also proposed to determine structural complexity. These properties, combined with the spatial distribution of the vessels, are evaluated in order to determine the extent of lymphangiogenesis. Lymphatic endothelial cell invasion and lymphangiogenesis were evaluated under various experimental conditions, and the comparison of these conditions enables lymphangiogenic agents to be identified and their roles in the lymphangiogenesis process to be better understood. The proposed methodology is validated by its application to the three presented models.
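
A minimal 2D sketch of the skeletonisation-based vessel measurements mentioned above (total skeleton length and branch points), using scikit-image and SciPy on a toy mask rather than the study's 3D data:

```python
import numpy as np
from skimage.morphology import skeletonize
from scipy.ndimage import convolve

def vessel_metrics(binary_mask, pixel_size_um=1.0):
    """Skeletonize a segmented vessel mask and report total length and branch-point count.
    A 2D stand-in for the 3D pipeline described (skimage also offers skeletonize_3d)."""
    skel = skeletonize(binary_mask.astype(bool))
    # Each skeleton pixel contributes roughly one pixel of length; neighbour counts define branching.
    neighbours = convolve(skel.astype(int), np.ones((3, 3), int), mode="constant") - skel
    branch_points = int(np.sum(skel & (neighbours >= 3)))
    total_length_um = skel.sum() * pixel_size_um
    return total_length_um, branch_points

# Toy cross-shaped "vessel".
mask = np.zeros((21, 21), dtype=bool)
mask[10, :] = True
mask[:, 10] = True
print(vessel_metrics(mask, pixel_size_um=2.0))   # -> (~82.0, 1 branch point)
```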

Keywords: 3D image segmentation, 3D image skeletonisation, cell invasion, confocal microscopy, ear sponges, light sheet microscopy, lymph nodes, lymphangiogenesis, spheroids

Procedia PDF Downloads 377
3936 Implication of Built-Up Area, Vegetation, and Motorized Vehicles to Urban Microclimate in Bandung City Center

Authors: Ira Irawati, Muhammad Rangga Sururi

Abstract:

The expansion of built-up areas in many cities, particularly as a consequence of the urbanization process, is a common phenomenon in our contemporary world. As in many cities of the developing world, this horizontal expansion leaves only a small share of land for green open spaces, creating an extreme imbalance between built-up and green spaces. Combined with the high density and variety of human activities and their transportation modes, an urban heat island process occurs, resulting in an increase in air temperature; this is one of the indicators of a declining quality of the urban microclimate. This paper explores the effect of several built-up area and open space variables on the increase in air temperature using multiple linear regression analysis. We selected 11 zones within a 1 km radius in the inner Bandung city center, and each zone was measured within a 300 m radius to represent the variety of land use as well as the composition of buildings and green open spaces. Using multiple linear regression as the quantitative method, six explanatory variables, a) tree density (x1), b) tree shade level (x2), c) surface area of building sides facing west and east (x3), d) surface area of building side material (x4), e) surface area of pathway material (x5), and f) number of motorized vehicles (x6), were measured to determine their influence on air temperature as the response variable (y). Finally, the relationship between these variables is given by the fitted equation: y = 30.316 - 3.689 x1 - 6.563 x2 + 0.002 x3 - 2.517E6 x4 + 1.919E-9 x5 + 1.952E-4 x6. The results show that the existence of vegetation has a great impact on lowering the temperature, whereas built-up area and motorized vehicles increase it. However, one component of the built-up area, the surface area of building sides facing west and east, shows a different result because the building material is classified as having a low-to-middle heat capacity.

Keywords: built-up area, microclimate, vehicles, urban heat island, vegetation

Procedia PDF Downloads 258
3935 Influence of Structured Capillary-Porous Coatings on Cryogenic Quenching Efficiency

Authors: Irina P. Starodubtseva, Aleksandr N. Pavlenko

Abstract:

Quenching is the generally accepted term for the rapid cooling of a solid that has been overheated above the thermodynamic limit of liquid superheat. The main objective of many previous studies on quenching has been to find ways of reducing the total duration of the transient process. Computational experiments were performed to simulate the quenching, by a falling liquid nitrogen film, of an extremely overheated vertical copper plate with a structured capillary-porous coating. The coating was produced by directed plasma spraying. Owing to the complexity of the physical pattern of quenching, from chaotic processes to phase transition, the mechanism of heat transfer during quenching is still not sufficiently understood. To the best of our knowledge, no information exists on when and how the first stable liquid-solid contact occurs and how the local contact area begins to expand; here we have more models and hypotheses than firmly established facts. The peculiarities of the quench front dynamics and of heat transfer in the transient process are studied. The numerical model created determines the quench front velocity and the temperature fields in the heater, varying in space and time. The dynamic pattern of the running quench front obtained numerically correlates satisfactorily with the pattern observed in experiments. Capillary-porous coatings with straight and reverse orientation of the crests are investigated. The results show that the cooling rate is influenced by the thermal properties of the coating as well as by the structure and geometry of the protrusions. The presence of a capillary-porous coating significantly affects the dynamics of quenching and reduces the total quenching time more than threefold. This effect arises because the initialization of a quench front on a plate with a capillary-porous coating occurs at a temperature significantly higher than the thermodynamic limit of liquid superheat, at which a stable solid-liquid contact is thermodynamically impossible. Waves present on the liquid-vapor interface and protrusions on the complex micro-structured surface destabilize the vapor film and cause local liquid-solid micro-contacts to appear even though the average integral surface temperature is much higher than the liquid superheat limit. The reliability of the results is confirmed by direct comparison with experimental data on the quench front velocity, the quench front geometry, and the change in surface temperature over time. Knowledge of the quench front velocity and of the total duration of the transient process is required for solving practically important problems of nuclear reactor safety.

Keywords: capillary-porous coating, heat transfer, Leidenfrost phenomenon, numerical simulation, quenching

Procedia PDF Downloads 130
3934 Structural Reliability Analysis Using Extreme Learning Machine

Authors: Mehul Srivastava, Sharma Tushar Ravikant, Mridul Krishn Mishra

Abstract:

In structural design, the evaluation of safety and of the probability of failure of a structure is of significant importance, especially when the variables are random. For real structures, structural reliability can be evaluated by obtaining an implicit limit state function. The structural reliability limit state function is formulated in terms of statistically independent variables. In this reliability analysis, the statistically independent random variables considered are the applied load intensity and the depth (height) of the beam member. There are many approaches to structural reliability problems. In this paper, the extreme learning machine (ELM) technique and the first-order second-moment (FOSM) method are used to determine the reliability indices for the same set of variables. The reliability index obtained using ELM is compared with that obtained using FOSM. The higher the reliability index, the more feasible the method is for determining reliability.
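
For a linear limit state with independent resistance and load effect, the FOSM reliability index reduces to a closed form; a minimal Python sketch (with illustrative parameter values, not the beam data of the study) is:

```python
import math

def fosm_beta(mu_r, sigma_r, mu_s, sigma_s):
    """First-order second-moment reliability index for the linear limit state
    g = R - S with independent resistance R and load effect S:
    beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2)."""
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

# Illustrative values only.
beta = fosm_beta(mu_r=50.0, sigma_r=5.0, mu_s=30.0, sigma_s=4.0)
pf = 0.5 * math.erfc(beta / math.sqrt(2))   # Pf = Phi(-beta) for normal variables
print(f"beta = {beta:.2f}, Pf ≈ {pf:.2e}")
```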

Keywords: reliability, reliability index, statistically independent, extreme learning machine

Procedia PDF Downloads 682
3933 A Thermosensitive Polypeptide Hydrogel for Biomedical Application

Authors: Chih-Chi Cheng, Ji-Yu Lin, I-Ming Chu

Abstract:

In this study, we synthesized a thermosensitive polypeptide hydrogel by copolymerizing poloxamer (PLX) and poly(ʟ-alanine) with ʟ-lysine segments at both ends to form PLX-b-poly(ʟ-alanine-lysine) (Lys-Ala-PLX-Ala-Lys) copolymers. Poly(ʟ-alanine) is the hydrophobic block of the Lys-Ala-PLX-Ala-Lys copolymers and was designed to capture hydrophobic agents. The synthesis was examined by 1H NMR, which showed that the Lys-Ala-PLX-Ala-Lys copolymers were successfully synthesized. In the concentration range of 3-7 wt%, the aqueous copolymer solution underwent a sol-gel transition near physiological temperature and exhibited changes in its secondary structure content, as evidenced by FTIR. Excellent viability of cells cultured within the scaffold was observed after 72 h of incubation. In addition, negatively charged bovine serum albumin was incorporated into the hydrogel without diminishing the material integrity and showed a good release profile. The results of the animal study also indicated that the Lys-Ala-PLX-Ala-Lys hydrogel has high potential for use in wound dressing.

Keywords: polypeptide thermosensitive hydrogel, tacrolimus, vascularized composite allotransplantation, sustain release

Procedia PDF Downloads 291
3932 The Role of Disturbed Dry Afromontane Forest of Ethiopia for Biodiversity Conservation and Carbon Storage

Authors: Mindaye Teshome, Nesibu Yahya, Carlos Moreira Miquelino Eleto Torres, Pedro Manuel Villaa, Mehari Alebachew

Abstract:

The Arbagugu forest is one of the remnant dry Afromontane forests under severe anthropogenic disturbance in central Ethiopia. Despite this, up-to-date information about the status of the forest and its role in climate change mitigation is lacking. In this study, we evaluated the woody species composition, structure, biomass, and carbon stock of this forest. We employed a systematic random sampling design and established fifty-three sample plots (20 × 100 m) to collect the vegetation data. A total of 37 woody species belonging to 25 families were recorded. The densities of seedlings, saplings, and mature trees were 1174, 101, and 84 stems ha⁻¹, respectively. The total basal area of trees with DBH (diameter at breast height) ≥ 2 cm was 21.3 m² ha⁻¹. The characteristic trees of dry Afromontane forest, such as Podocarpus falcatus, Juniperus procera, and Olea europaea subsp. cuspidata, exhibited a fair regeneration status. In contrast, the least abundant species, Lepidotrichilia volkensii, Canthium oligocarpum, Dovyalis verrucosa, Calpurnia aurea, and Maesa lanceolata, exhibited good regeneration status. Some tree species, such as Polyscias fulva, Schefflera abyssinica, Erythrina brucei, and Apodytes dimidiata, lacked regeneration. The total carbon stored in the forest ranged between 6.3 Mg C ha⁻¹ and 835.6 Mg C ha⁻¹, equivalent to 639.6 Mg C ha⁻¹. The forest had a very low woody species composition and diversity. The regeneration study also revealed that a significant number of tree species had an unsatisfactory regeneration status. Moreover, the forest had a lower carbon stock density compared with other dry Afromontane forests. This implies an urgent need for forest conservation and restoration activities by the local government, conservation practitioners, and other concerned bodies to maintain the forest and sustain the various ecosystem goods and services provided by the Arbagugu forest.

Keywords: aboveground biomass, forest regeneration, climate change, biodiversity conservation, restoration

Procedia PDF Downloads 110
3931 Transport of Inertial Finite-Size Floating Plastic Pollution by Ocean Surface Waves

Authors: Ross Calvert, Colin Whittaker, Alison Raby, Alistair G. L. Borthwick, Ton S. van den Bremer

Abstract:

Large concentrations of plastic have polluted the seas in the last half century, with harmful effects on marine wildlife and potentially on human health. Plastic pollution will have lasting effects, because plastic is expected to take hundreds or thousands of years to decay in the ocean. The question arises of how waves transport plastic in the ocean. The predominant motion induced by waves creates elliptical orbits; however, these orbits do not close, resulting in a drift, which is defined as Stokes drift. If a particle is infinitesimally small and of the same density as water, it behaves exactly as the water does, i.e., as a purely Lagrangian tracer. However, as the particle grows in size or changes density, it behaves differently: it then has its own inertia, the fluid exerts drag on it because there is a relative velocity, and it rises or sinks depending on its density and on whether it is at the free surface. Previously, plastic pollution has been treated as purely Lagrangian. However, the steepness of waves in the ocean is small, typically about α = k₀a = 0.1 (where k₀ is the wavenumber and a is the wave amplitude); this means that the mean drift flows are of the order of ten times smaller than the oscillatory velocities (Stokes drift is proportional to the steepness squared, whilst the oscillatory velocities are proportional to the steepness). Thus, to determine whether inertia is important, the particle motion must include the forces of the full motion, oscillatory and mean flow, as well as a dynamic buoyancy term to account for the free surface. Tracking the motion of a floating inertial particle under wave action requires the fluid velocities, which form the forcing, and the full equations of motion of the particle to be solved. Starting from the equation of motion of a sphere in unsteady flow with viscous drag, terms can then be added to better model floating plastic: a dynamic buoyancy to model a particle floating on the free surface, quadratic drag for larger particles, and a slope-sliding term. Using perturbation methods to order the equation of motion into sequentially solvable parts allows a parametric equation for the transport of inertial finite-size floating particles to be derived. This parametric equation can then be validated using numerical simulations of the equation of motion and flume experiments. This paper presents a parametric equation for the transport of inertial floating finite-size particles by ocean waves. The equation shows an increase in Stokes drift for larger, less dense particles. It has been validated using numerical solutions of the equation of motion and laboratory flume experiments. The difference between the particle transport equation and a purely Lagrangian tracer is illustrated using world maps of the induced transport. This parametric transport equation would allow ocean-scale numerical models to include the inertial effects of floating plastic when predicting or tracing the transport of pollutants.
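
For reference, the purely Lagrangian baseline that the inertial corrections modify is the classical deep-water Stokes drift; a minimal Python sketch of the standard formula, with illustrative wave parameters, is:

```python
import numpy as np

def stokes_drift(amplitude, wavelength, depth_below_surface=0.0, g=9.81):
    """Deep-water Stokes drift for a purely Lagrangian tracer,
    u_s(z) = sigma * k * a^2 * exp(2*k*z), with z <= 0 measured upward from the surface."""
    k = 2 * np.pi / wavelength             # wavenumber k0
    sigma = np.sqrt(g * k)                 # deep-water angular frequency
    z = -abs(depth_below_surface)
    return sigma * k * amplitude**2 * np.exp(2 * k * z)

# Steepness k0*a = 0.1 for a 100 m wave implies a ~ 1.6 m.
a, L = 1.6, 100.0
print(f"surface Stokes drift ≈ {stokes_drift(a, L):.3f} m/s")
```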

Keywords: perturbation methods, plastic pollution transport, Stokes drift, wave flume experiments, wave-induced mean flow

Procedia PDF Downloads 121
3930 Effect of the Mould Rotational Speed on the Quality of Centrifugal Castings

Authors: M. A. El-Sayed, S. A. Aziz

Abstract:

Centrifugal casting is a standard casting technique for the manufacture of hollow, intricate and sound castings without the use of cores. The molten metal or alloy poured into the rotating mould forms a hollow casting as the centrifugal forces lift the liquid along the inner surface of the mould. The rotational speed of the die has been suggested to greatly affect the manner in which the molten metal flows within the mould and, consequently, the probability of forming a uniform cylinder. In this work, the flow of the liquid metal at various speeds and its effect on the casting were studied. The results suggest that there is a critical range of speed within which the produced castings exhibit the best uniformity and maximum mechanical properties. When a mould was rotated at speeds below or beyond the critical range, defects were found in the final castings, which affected their uniformity and significantly lowered their mechanical properties.

Keywords: centrifugal casting, rotational speed, critical speed range, mechanical properties

Procedia PDF Downloads 445
3929 Backward-Facing Step Measurements at Different Reynolds Numbers Using Acoustic Doppler Velocimetry

Authors: Maria Amelia V. C. Araujo, Billy J. Araujo, Brian Greenwood

Abstract:

The flow over a backward-facing step is characterized by the presence of flow separation, recirculation and reattachment for a simple geometry. This type of fluid behaviour occurs in many practical engineering applications, hence the reason for investigating it. Historically, fluid flows over a backward-facing step have been examined in many experiments using a variety of measuring techniques, such as laser Doppler velocimetry (LDV), hot-wire anemometry, particle image velocimetry and hot-film sensors. However, some of these techniques cannot conveniently be used in separated flows or are too complicated and expensive. In this work, the applicability of the acoustic Doppler velocimetry (ADV) technique to this type of flow is investigated at various Reynolds numbers corresponding to different flow regimes. The use of this measuring technique in separated flows is rarely reported in the literature, and most situations in which the Reynolds number effect is evaluated in separated flows involve numerical modelling. The ADV technique has the advantage of providing nearly non-invasive measurements, which is important in resolving turbulence. A Nortek Vectrino+ ADV was used to characterize the flow in a recirculating laboratory flume at various Reynolds numbers (Reh = 3738, 5452, 7908 and 17388) based on the step height (h), in order to capture different flow regimes, and the results were compared with those obtained using other measuring techniques. To compare results with other researchers, the step height, the expansion ratio and the positions upstream and downstream of the step were reproduced. The post-processing of the ADV records was performed using a customized numerical code that implements several filtering techniques. Subsequently, the Vectrino noise level was evaluated by computing the power spectral density of the stream-wise horizontal velocity component. The normalized mean stream-wise velocity profiles, skin-friction coefficients and reattachment lengths were obtained for each Reh. Turbulent kinetic energy, Reynolds shear stresses and normal Reynolds stresses were determined for Reh = 7908. An uncertainty analysis was carried out for the measured variables using the moving block bootstrap technique. Low noise levels were obtained after implementing the post-processing techniques, showing their effectiveness, and the errors obtained in the uncertainty analysis were, in general, relatively low. For Reh = 7908, the normalized mean stream-wise velocity and turbulence profiles were compared directly with those acquired by other researchers using the LDV technique, and good agreement was found. The ADV technique proved able to characterize the flow over a backward-facing step properly, although additional caution should be taken for measurements very close to the bottom. The ADV measurements showed reliable results regarding: a) the stream-wise velocity profiles; b) the turbulent shear stress; c) the reattachment length; d) the identification of the transition from transitional to turbulent flow. Despite being a relatively inexpensive technique, acoustic Doppler velocimetry can be used with confidence in separated flows and is thus very useful for numerical model validation. However, it is very important to perform adequate post-processing of the acquired data to obtain low noise levels and thus decrease the uncertainty.
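
Two of the post-processing steps named above, the power-spectral-density noise check and the moving block bootstrap for the uncertainty of the mean, can be sketched as follows (synthetic velocity record and illustrative parameters, not the flume data):

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
fs = 200.0                                   # sampling rate (Hz), illustrative value
u = 0.3 + 0.05 * rng.standard_normal(20000)  # stand-in for a despiked streamwise velocity record (m/s)

# Noise-floor check: power spectral density of the streamwise component (Welch's method).
f, Puu = welch(u, fs=fs, nperseg=2048)
print(f"high-frequency spectral level ≈ {Puu[-10:].mean():.2e} (m/s)^2/Hz")

def moving_block_bootstrap_mean(x, block=200, n_boot=1000, rng=rng):
    """Resample contiguous blocks (preserving short-range correlation) and
    return the bootstrap standard error of the mean."""
    n = len(x)
    starts = np.arange(n - block + 1)
    n_blocks = int(np.ceil(n / block))
    means = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.choice(starts, size=n_blocks)
        sample = np.concatenate([x[s:s + block] for s in idx])[:n]
        means[b] = sample.mean()
    return means.std(ddof=1)

print(f"mean u = {u.mean():.3f} m/s +/- {moving_block_bootstrap_mean(u):.4f} m/s")
```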

Keywords: ADV, experimental data, multiple Reynolds number, post-processing

Procedia PDF Downloads 147
3928 Institutionalizing Peace in Iraqi Kurdistan Post-civil War, 1998 to Present

Authors: Hawre Hasan Hama, Choman Mahmood H. Rashid

Abstract:

The four-year armed conflict between the Kurdistan Democratic Party (KDP) and the Patriotic Union of Kurdistan (PUK) ended in September 1998 under the terms of the Washington Agreement. Since then, there has been a quarter-century of durable peace between the two combatant parties, though they have often been at odds politically. Based on interviews with Kurdish political leaders from both parties, this paper argues that sharing or dividing power across all four dimensions of state power (political, military, territorial, and economic) has played a vital role in ensuring the durability of the peace settlement. The paper traces the KDP-PUK power-sharing system through three stages: the transition stage (1998-2006), the 'golden' period (2006-2013), and the 'weakening' period (2013 to present).

Keywords: peace settlement, enduring peace, power-sharing and power dividing, Iraqi Kurdistan

Procedia PDF Downloads 92
3927 Effect of Hydroxyl Functionalization on the Mechanical and Fracture Behaviour of Monolayer Graphene

Authors: Akarsh Verma, Avinash Parashar

Abstract:

The aim of this article is to study the effects of the hydroxyl functional group on the mechanical strength and fracture toughness of graphene. This functional group forms the backbone of the intrinsic atomic structure of graphene oxide (GO). Molecular dynamics simulations were performed in conjunction with reactive force field (ReaxFF) parameters to capture the mode-I fracture toughness of hydroxyl-functionalised graphene. The simulations lead to the conclusion that the spatial distribution and concentration of the hydroxyl functional group significantly affect the fracture morphology of the graphene nanosheet. In contrast to previous investigations in the literature, the atomistic simulations predicted a transition in the failure morphology of hydroxyl-functionalised graphene from brittle to ductile as a function of its spatial distribution on the graphene sheet.

Keywords: graphene, graphene oxide, ReaxFF, molecular dynamics

Procedia PDF Downloads 179
3926 The Effect of Sumatra Fault Earthquakes on West Malaysia

Authors: Noushin Naraghi Araghi, M. Nawawi, Syed Mustafizur Rahman

Abstract:

This paper presents the effect of Sumatra fault earthquakes on west Malaysia by calculating the peak horizontal ground acceleration (PGA). The PGA is calculated through a probabilistic seismic hazard assessment (PSHA). A uniform earthquake catalog for the region of interest was compiled, and empirical relations were used to convert all magnitudes to moment magnitude. After eliminating foreshocks and aftershocks in order to achieve more reliable results, the completeness of the catalog and the uncertainty of the magnitudes were estimated, and the seismicity parameters were calculated. Our seismic source model considers the Sumatran strike-slip fault, which is known historically to generate large earthquakes. The calculations were performed using the logic-tree method, with four attenuation relationships and slip rates for different parts of this fault. The seismic hazard assessment was carried out for 48 grid points. Finally, two PGA-based seismic hazard maps for 5% and 10% probabilities of exceedance in 50 years are presented.
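
One of the seismicity parameters mentioned above is the Gutenberg-Richter b-value; a minimal Python sketch of its maximum-likelihood estimate from a declustered catalog (synthetic magnitudes, not the study's catalog) is:

```python
import numpy as np

def gutenberg_richter_b(magnitudes, completeness_mag, bin_width=0.1):
    """Aki (1965) maximum-likelihood b-value for events at or above the
    magnitude of completeness Mc, with Utsu's half-bin correction."""
    m = np.asarray(magnitudes, float)
    m = m[m >= completeness_mag]
    return np.log10(np.e) / (m.mean() - (completeness_mag - bin_width / 2.0))

# Synthetic catalog drawn with a true b-value of ~1 above Mc = 4.5.
rng = np.random.default_rng(2)
mags = 4.5 + rng.exponential(scale=1 / (1.0 * np.log(10)), size=5000)
print(f"estimated b-value: {gutenberg_richter_b(mags, completeness_mag=4.5, bin_width=0.0):.2f}")
```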

Keywords: Sumatra fault, west Malaysia, PGA, seismic parameters

Procedia PDF Downloads 404
3925 A Systematic Review of Situational Awareness and Cognitive Load Measurement in Driving

Authors: Aly Elshafei, Daniela Romano

Abstract:

With the development of autonomous vehicles, a human-machine interaction (HMI) system is needed for a safe transition of control when a takeover request (TOR) is required. An important part of the HMI system is the ability to monitor the level of situational awareness (SA) of any driver in real time, in different scenarios, and without any pre-calibration. The purpose of this systematic review is to present the state-of-the-art machine learning models used to measure SA, investigating the limitations of each type of sensor, the gaps, and the sensor and computational model best suited to driving applications. To the authors' best knowledge, this is the first literature review identifying the online and offline classification methods used to measure SA, explaining which measurements are subject- or session-specific, and how many classifications can be made with each classification model. This information can be very useful for researchers measuring SA in identifying the model best suited to measuring SA for different applications.

Keywords: situational awareness, autonomous driving, gaze metrics, EEG, ECG

Procedia PDF Downloads 119
3924 Fault Tree Analysis and Bayesian Network for Fire and Explosion of Crude Oil Tanks: Case Study

Authors: B. Zerouali, M. Kara, B. Hamaidi, H. Mahdjoub, S. Rouabhia

Abstract:

In this paper, a safety analysis of crude oil tanks is carried out to prevent undesirable events that may cause catastrophic accidents. The probability of damage to industrial systems is estimated through a series of steps, in accordance with a specific methodology. In this context, this work develops an assessment and risk-analysis tool at the level of the crude oil tank system, based primarily on the identification of the potential causes of crude oil tank fire and explosion using Fault Tree Analysis (FTA), followed by improved risk modelling with Bayesian Networks (BNs). The Bayesian approach to evaluating failure and quantifying risk is a dynamic analysis approach, and for this reason it was selected as the analytical tool in this study. The research concludes that Bayesian networks provide a distinct and effective method for safety analysis because of the flexibility of their structure, which makes them suitable for a wide variety of accident scenarios.
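A minimal sketch of the FTA step is given below, assuming independent basic events with hypothetical probabilities; it is not the paper's fault tree, but it shows how gate logic turns basic-event probabilities into a top-event probability, and why a Bayesian network is attractive when the independence assumption does not hold.

```python
# Minimal fault-tree sketch (hypothetical basic-event probabilities, not the
# paper's model): the top event "tank fire/explosion" is an AND of an ignition
# source and a flammable atmosphere, each of which is an OR of basic events.
from math import prod

def or_gate(probs):   # at least one event occurs (independent events)
    return 1.0 - prod(1.0 - p for p in probs)

def and_gate(probs):  # all events occur (independent events)
    return prod(probs)

ignition   = or_gate([1e-2, 5e-3, 2e-3])   # e.g. static, hot work, lightning
atmosphere = or_gate([3e-2, 1e-2])         # e.g. overfill vapours, seal leak
top_event  = and_gate([ignition, atmosphere])
print(f"P(top event) = {top_event:.2e}")
# A Bayesian network relaxes these independence assumptions by conditioning
# each node on its parents, which is what motivates the BN model in the paper.
```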

Keywords: bayesian networks, crude oil tank, fault tree, prediction, safety

Procedia PDF Downloads 660
3923 Importance of Prostate Volume, Prostate Specific Antigen Density and Free/Total Prostate Specific Antigen Ratio for Prediction of Prostate Cancer

Authors: Aliseydi Bozkurt

Abstract:

Objectives: Benign prostatic hyperplasia (BPH) is the most common benign disease of the prostate gland, and prostate cancer (PC) is a malignant disease of the prostate gland. Transrectal ultrasound-guided biopsy (TRUS-bx) is one of the most important diagnostic tools in PC diagnosis. Identifying men at increased risk of having biopsy-detectable prostate cancer should take into account prostate-specific antigen density (PSAD), the free/total (f/t) PSA ratio, and an estimate of prostate volume. Method: We retrospectively studied 269 patients who had a prostate-specific antigen (PSA) level of 4 ng/mL or higher, or a suspicious rectal examination at any PSA level, and who underwent TRUS-bx in our clinic between January 2015 and June 2018. TRUS-bx was performed by 12 experienced urologists with 12 quadrants sampled. Prostate volume was calculated prior to biopsy using TRUS. Patients were classified as malignant or benign according to the final pathology. Age, PSA value, prostate volume on transrectal ultrasonography, number of biopsy cores, biopsy pathology result, number of cancer-positive cores, and Gleason score were evaluated in the study. The success rates of PV, PSAD, and f/t PSA in predicting prostate cancer were compared in all patients and in those with PSA 2.5-10 ng/mL and 10.1-30 ng/mL. Results: In patients with PSA 2.5-10 ng/mL, the prostate volume (PV) cut-off value was 43.5 mL (n=42 < 43.5 mL and n=102 > 43.5 mL), while in those with PSA 10.1-30 ng/mL the PV cut-off value was 61.5 mL (n=31 < 61.5 mL and n=36 > 61.5 mL). In the group with PSA 2.5-10 ng/mL, total PSA was lower in patients with PV < 43.5 mL than in those with PV > 43.5 mL (6.0 ± 1.3 vs 6.7 ± 1.7); this difference was of borderline significance (p=0.043). In the group with PSA 10.1-30 ng/mL, no significant difference in total PSA was found between patients with PV < 61.5 mL and those with PV > 61.5 mL (p=0.117). In the group with PSA 2.5-10 ng/mL, the f/t PSA ratio was significantly lower in patients with PV < 43.5 mL than in those with PV > 43.5 mL (0.21 ± 0.09 vs 0.26 ± 0.09, p < 0.001). Similarly, in the group with PSA 10.1-30 ng/mL, the f/t PSA ratio was significantly lower in patients with PV < 61.5 mL (0.16 ± 0.08 vs 0.23 ± 0.10, p=0.003). In the group with PSA 2.5-10 ng/mL, PSAD was significantly higher in patients with PV < 43.5 mL than in those with PV > 43.5 mL (0.17 ± 0.06 vs 0.10 ± 0.03, p < 0.001). Similarly, in the group with PSA 10.1-30 ng/mL, PSAD was significantly higher in patients with PV < 61.5 mL (0.47 ± 0.23 vs 0.17 ± 0.08, p < 0.001). According to the biopsy results, in the group with PSA 2.5-10 ng/mL, cancer was detected in 29 of the patients with PV < 43.5 mL (69%) and not detected in 13 (31%), whereas cancer was found in 19 of the patients with PV > 43.5 mL (18.6%) and not found in 83 (81.4%) (p < 0.001). In the group with PSA 10.1-30 ng/mL, cancer was observed in 21 of the patients with PV < 61.5 mL (67.7%) and not observed in 10 (32.3%), whereas cancer was found in 5 of the patients with PV > 61.5 mL (13.9%) and not found in 31 (86.1%) (p < 0.001). Conclusions: Identifying men at increased risk of having biopsy-detectable prostate cancer should take into account PSA, the f/t PSA ratio, and an estimate of prostate volume. Prostate volume was found to be lower in patients with PC.
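For reference, the two screening quantities compared above can be computed directly from the measured values. The sketch below uses the standard definitions (PSAD as total PSA divided by prostate volume, f/t PSA as free over total PSA) together with the 43.5 mL cut-off reported for the PSA 2.5-10 ng/mL group; the patient values are hypothetical.

```python
# Simple sketch of the screening quantities discussed above. PSAD is defined
# as total PSA divided by prostate volume; the 43.5 mL volume cut-off is the
# one reported for the PSA 2.5-10 ng/mL group in this study.

def psa_density(total_psa_ng_ml, prostate_volume_ml):
    return total_psa_ng_ml / prostate_volume_ml

def free_total_ratio(free_psa_ng_ml, total_psa_ng_ml):
    return free_psa_ng_ml / total_psa_ng_ml

# Hypothetical patient in the PSA 2.5-10 ng/mL band:
total_psa, free_psa, volume = 6.0, 1.2, 40.0
print("PSAD  :", round(psa_density(total_psa, volume), 3))   # ng/mL per mL
print("f/tPSA:", round(free_total_ratio(free_psa, total_psa), 2))
print("PV below 43.5 mL cut-off:", volume < 43.5)
```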

Keywords: prostate cancer, prostate volume, prostate specific antigen, free/total PSA ratio

Procedia PDF Downloads 149
3922 Controlling Excitons Complexes in Two Dimensional MoS₂ Monolayers

Authors: Arslan Usman, Abdul Sattar, Hamid Latif, Afshan Ashfaq, Muhammad Rafique, Martin Koch

Abstract:

Two-dimensional materials have promising applications in optoelectronics and photonics; MoS₂ is the pioneering 2D material in the family of transition metal dichalcogenides. Its optical, optoelectronic, and structural properties are of practical importance, along with its exciton dynamics. Excitons, along with exciton complexes, play a vital role in realizing quantum devices. MoS₂ monolayers were synthesized using the chemical vapour deposition (CVD) technique on SiO₂ and hBN substrates. Photoluminescence (PL) spectroscopy was used to identify the monolayers, and it also reveals substrate-dependent peak broadening due to screening effects. The characteristic in-plane and out-of-plane vibrational modes, E¹₂g and A₁g respectively, were detected in different configurations on the substrates. The B-excitons and trions showed dominant features at low temperatures due to electron-phonon coupling effects, with their energies separated by 100 meV.
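As an illustrative, entirely synthetic example of how an exciton-trion energy separation such as the ~100 meV quoted above can be extracted from a PL spectrum, the sketch below fits two Lorentzian peaks with scipy; the peak positions, widths, and noise level are assumptions, not the measured data.

```python
# Synthetic two-peak PL fit (not the measured spectra): the separation of the
# fitted peak centres plays the role of the exciton-trion energy splitting.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(e, amp, e0, gamma):
    return amp * gamma**2 / ((e - e0)**2 + gamma**2)

def two_peaks(e, a1, e1, g1, a2, e2, g2):
    return lorentzian(e, a1, e1, g1) + lorentzian(e, a2, e2, g2)

energy = np.linspace(1.7, 2.1, 400)                    # eV, synthetic axis
spectrum = two_peaks(energy, 1.0, 1.88, 0.02, 0.6, 1.98, 0.03)
spectrum += np.random.default_rng(1).normal(0, 0.02, energy.size)

p0 = [1.0, 1.87, 0.02, 0.5, 1.99, 0.03]                # rough initial guess
popt, _ = curve_fit(two_peaks, energy, spectrum, p0=p0)
print("peak separation: %.0f meV" % (abs(popt[4] - popt[1]) * 1000))
```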

Keywords: 2D materials, photoluminescence, AFM, excitons

Procedia PDF Downloads 144
3921 Consumer Attitude and Purchase Intention towards Organic Food: Insights from Pakistan

Authors: Muneshia Maheshwar, Kanwal Gul, Shakira Fareed, Ume-Amama Areeb Gul

Abstract:

Organic food is commonly known for its healthier content, being produced without pesticides, herbicides, inorganic fertilizers, antibiotics, or growth hormones. The aim of this research is to examine the effects of health consciousness, environmental concern, and organic food knowledge on both the intention to buy organic food and the attitude towards organic food, as well as the effect of attitude towards organic food on the intention to buy organic food in Pakistan. Primary data were collected through a questionnaire adopted from previous research. Non-probability convenience sampling was used to select a sample of 200 consumers based in Karachi. The data were analyzed using descriptive statistics and multiple regression. The findings of the study showed that both the attitude towards organic food and the intention to buy it were affected by health consciousness, environmental concern, and organic food knowledge. The results also revealed that attitude affects the intention to buy organic food.
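A minimal sketch of the multiple-regression step is shown below, using synthetic data; the variable names mirror the constructs listed in the abstract, but the coefficients and scores are assumptions, not the study's results.

```python
# Minimal multiple-regression sketch (synthetic data; variable names are
# assumptions based on the constructs listed in the abstract).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200                                    # sample size reported in the study
health    = rng.normal(size=n)             # health consciousness score
environ   = rng.normal(size=n)             # environmental concern score
knowledge = rng.normal(size=n)             # organic food knowledge score
attitude  = 0.4*health + 0.3*environ + 0.2*knowledge + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([health, environ, knowledge]))
model = sm.OLS(attitude, X).fit()          # attitude regressed on predictors
print(model.params)                        # intercept and three coefficients
```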

Keywords: health consciousness, attitude, intention to purchase, environmental concern, organic food knowledge

Procedia PDF Downloads 247