Search results for: interpolation scheme
418 Finite Volume Method for Flow Prediction Using Unstructured Meshes
Authors: Juhee Lee, Yongjun Lee
Abstract:
In designing low-energy-consuming buildings, the heat transfer through a large glass or wall becomes critical. Multiple layers of window glass and wall are employed for high insulation. The gravity-driven air flow between window glasses or wall layers is a natural heat convection phenomenon that is a key part of the heat transfer. As a first step of the natural heat transfer analysis, this study presents the development and application of a finite volume method for the numerical computation of viscous incompressible flows. It will become a part of the natural convection analysis with a high-order scheme, multi-grid method, and dual-time step in the future. A finite volume method based on a fully implicit second-order scheme is used to discretize and solve the fluid flow on unstructured grids composed of arbitrary-shaped cells. The integrals of the governing equations are discretized in the finite volume manner using a collocated arrangement of variables. The convergence of the SIMPLE segregated algorithm for the solution of the coupled nonlinear algebraic equations is accelerated by using a sparse matrix solver such as BiCGSTAB. The method used in the present study is verified by applying it to flows for which either the numerical solution is known or the solution can be obtained using another numerical technique available in other studies. The accuracy of the method is assessed through grid refinement.
Keywords: finite volume method, fluid flow, laminar flow, unstructured grid
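As a rough illustration of the inner linear-solve step mentioned above (one pressure-correction system inside a SIMPLE outer iteration, handed to BiCGSTAB), a minimal sketch follows; the 1D Poisson-like matrix and the random right-hand side are placeholders, not the unstructured-mesh coefficients of the study.

    # Minimal sketch: solve a pressure-correction system A p' = b with BiCGSTAB,
    # as would be done inside each SIMPLE outer iteration (illustrative matrix only).
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import bicgstab

    n = 100                                  # number of control volumes (illustrative)
    main = 2.0 * np.ones(n)
    off = -1.0 * np.ones(n - 1)
    A = sp.diags([off, main, off], [-1, 0, 1], format="csr")   # stand-in FV coefficients
    b = np.random.rand(n)                    # stand-in mass-imbalance (continuity) residuals

    p_corr, info = bicgstab(A, b)            # info == 0 means the solver converged
    print("converged:", info == 0, "max correction:", np.abs(p_corr).max())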
Procedia PDF Downloads 286
417 Safe School Program in Indonesia: Questioning Whether It Is Too Hard to Succeed
Authors: Ida Ngurah
Abstract:
Indonesia is one of the most disaster-prone countries, experiencing earthquakes, tsunamis and high waves, floods and landslides, as well as volcanic eruptions and droughts. Disaster risk reduction has been developing extensively and comprehensively, particularly after the tsunami hit in 2004. Yet, saving people's lives, including those of children and youth, from disaster risk is still far from successful. Poor management of the environment, poor policy development and a high level of corruption have become challenges for Indonesia in saving its people from disaster impact. Indonesia is struggling to ensure that its best future investment, children and youth, have better protection when disaster strikes during school hours and have basic knowledge of disaster risk reduction. The safe school program has been initiated and developed by Plan Indonesia since 2010, yet this effort still needs to be elaborated. This paper reviews sporadic safe school programs that have been implemented or are currently being implemented by Plan Indonesia in a few areas of Indonesia, including both rural and urban settings. Methods used are in-depth interviews with dedicated program staff from Plan Indonesia and its implementing partners and analysis of project documents. The review includes the program's goals and objectives, implementation activities, results and achievements, as well as its monitoring and evaluation scheme. Moreover, the paper shows challenges, lessons learned and best practices of the program. Eventually, the paper comes up with recommendations on strategies for better implementation of the safe school program in Indonesia.
Keywords: disaster impact, safe school, programs, children, youth
Procedia PDF Downloads 367
416 Heat Transfer Process Parameter Optimization in SI/Ge Using TAGUCHI Method
Authors: Evln Ranga Charyulu, S. P. Venu Madhavarao, S. Udaya kumar, S. V. S. S. N. V. G. Krishna Murthy
Abstract:
With the advent of new nanometer process technologies, it is possible to integrate a billion transistors on a single substrate. When more and more functionality is included, there is the possibility of multi-million transistors switching simultaneously, consuming more power and dissipating more power along with more leakage current into the substrate of porous silicon or germanium material. This results in substrate heating and thermal noise generation coupled to signals of interest. The heating process is represented by coupled nonlinear partial differential equations in porous silicon and germanium. Identifying heat sources and heat fluxes may help in designing ultra-low-power circuits. The PDEs are solved by a finite difference scheme assuming boundary layer equations in porous silicon and germanium. Local heat fluxes along the vertical isothermal surface immersed in porous Si/Ge are considered. The parameters considered for optimization are thermal diffusivity, thermal expansion coefficient, thermal diffusion ratio, permeability, specific heat at constant temperature, Rayleigh number, amplitude of the wavy surface, and mass expansion coefficient. The diffusion of heat is caused by the concentration gradient. Thermal physical properties are homogeneous and isotropic. By using the Taguchi L8 orthogonal array, the parameters are optimized.
Keywords: heat transfer, pde, taguchi optimization, SI/Ge
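As a rough illustration of that last step, the sketch below runs a Taguchi-style main-effects analysis on a standard L8(2^7) orthogonal array; the response values and the factor-to-column mapping are made up for illustration, not the study's data.

    # Illustrative Taguchi L8(2^7) analysis with hypothetical responses.
    # Factor columns could map to thermal diffusivity, expansion coefficient, etc.
    import numpy as np

    L8 = np.array([      # standard L8 orthogonal array, levels coded 0/1
        [0,0,0,0,0,0,0],
        [0,0,0,1,1,1,1],
        [0,1,1,0,0,1,1],
        [0,1,1,1,1,0,0],
        [1,0,1,0,1,0,1],
        [1,0,1,1,0,1,0],
        [1,1,0,0,1,1,0],
        [1,1,0,1,0,0,1],
    ])
    y = np.array([3.1, 2.8, 3.5, 2.9, 3.3, 3.0, 2.7, 3.2])   # hypothetical heat-flux responses

    sn = -10.0 * np.log10(y ** 2)        # smaller-the-better S/N ratio (single replicate)
    for j in range(L8.shape[1]):         # main effect of each factor on the S/N ratio
        effect = sn[L8[:, j] == 1].mean() - sn[L8[:, j] == 0].mean()
        print(f"factor {j + 1}: S/N effect = {effect:+.3f} dB")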
Procedia PDF Downloads 339
415 Passive and Active Spatial Pendulum Tuned Mass Damper with Two Tuning Frequencies
Authors: W. T. A. Mohammed, M. Eltaeb, R. Kashani
Abstract:
The first bending modes of tall asymmetric structures in the two lateral X and Y-directions have two different natural frequencies. To add tuned damping to these bending modes, one needs to either a) use two pendulum-tuned mass dampers (PTMDs), each with one tuning frequency and each targeting one of the bending modes, or b) use one PTMD with two tuning frequencies (one in each lateral direction). Option (a), being more massive, requiring more space, and being more expensive, is less attractive than option (b). Considering that the tuning frequency of a pendulum depends mainly on the pendulum length, one way of realizing option (b) is by constraining the swinging length of the pendulum in one direction but not in the other; such a PTMD is dubbed a passive Bi-PTMD. Alternatively, option (b) can be realized by actively setting the tuning frequencies of the PTMD in the two directions. In this work, accurate physical models of the passive Bi-PTMD and the active PTMD are developed and incorporated into the numerical model of a tall asymmetric structure. The model of PTMDs plus structure is used for a) synthesizing such PTMDs for particular applications and b) evaluating their damping effectiveness in mitigating the dynamic lateral responses of their target asymmetric structures, perturbed by wind load in the X and Y-directions. Depending on how elaborate the control scheme is, the active PTMD can be made to yield the same damping effectiveness as either a passive Bi-PTMD of the same size or a passive Bi-PTMD twice as massive as the active PTMD.
Keywords: active tuned mass damper, high-rise building, multi-frequency tuning, vibration control
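Because the pendulum tuning frequency depends mainly on its length, a back-of-the-envelope sketch of the swing length needed for each target frequency follows; the two modal frequencies are assumed values for illustration, not those of the studied structure.

    # Rough illustration: swing length for each tuning frequency of a pendulum TMD,
    # using the small-angle relation f = (1/2*pi) * sqrt(g/L).
    import math

    g = 9.81                        # m/s^2
    f_x, f_y = 0.20, 0.26           # assumed first bending-mode frequencies in X and Y, Hz

    def pendulum_length(f_hz):
        return g / (2.0 * math.pi * f_hz) ** 2

    L_x, L_y = pendulum_length(f_x), pendulum_length(f_y)
    print(f"required swing length: X = {L_x:.2f} m, Y = {L_y:.2f} m")
    # A Bi-PTMD constrains the swing to L_y in one direction while allowing L_x in the other.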
Procedia PDF Downloads 105
414 Computationally Efficient Stacking Sequence Blending for Composite Structures with a Large Number of Design Regions Using Cellular Automata
Authors: Ellen Van Den Oord, Julien Marie Jan Ferdinand Van Campen
Abstract:
This article introduces a computationally efficient method for stacking sequence blending of composite structures. The computational efficiency makes the presented method especially interesting for composite structures with a large number of design regions. Optimization of composite structures with an unequal load distribution may lead to locally optimized thicknesses and ply orientations that are incompatible with one another. Blending constraints can be enforced to achieve structural continuity. In the literature, many methods can be found to implement structural continuity by means of stacking sequence blending in one way or another. The complexity of the problem makes the blending of a structure with a large number of adjacent design regions, and thus stacking sequences, prohibitive. In this work, the local stacking sequence optimization is preconditioned using a method found in the literature that couples the mechanical behavior of the laminate, in the form of lamination parameters, to blending constraints, yielding near-optimal, easy-to-blend designs. The preconditioned design is then fed to the scheme using cellular automata that has been developed by the authors. The method is applied to the benchmark 18-panel horseshoe blending problem to demonstrate its performance. The computational efficiency of the proposed method makes it especially suited for composite structures with a large number of design regions.
Keywords: composite, blending, optimization, lamination parameters
Procedia PDF Downloads 228
413 Development of Membrane Reactor for Auto Thermal Reforming of Dimethyl Ether for Hydrogen Production
Authors: Tie-Qing Zhang, Seunghun Jung, Young-Bae Kim
Abstract:
This research is devoted to developing a membrane reactor to flexibly meet the hydrogen demand of onboard fuel cells, which is an important part of green energy development. Among many renewable chemical products, dimethyl ether (DME) has the advantages of a low reaction temperature (400 °C in this study), high hydrogen atom content, low toxicity, and easy preparation. Autothermal reforming, on the other hand, has a high hydrogen recovery rate and exhibits thermal neutrality during the reaction process, so an additional heat source in the hydrogen production process can be omitted. Therefore, the DME autothermal reforming process was adopted in this study. To control the temperature of the reaction catalyst bed and the hydrogen production rate, a Model Predictive Control (MPC) scheme was designed. Taking the above two variables as the control objectives, stable operation of the reformer can be achieved by controlling the flow rates of DME, steam, and high-purity air in real time. To prevent catalyst poisoning in the fuel cell, the hydrogen needs to be purified to reduce the carbon monoxide content to below 50 ppm. Therefore, a Pd-Ag hydrogen semi-permeable membrane with a thickness of 3-5 μm was inserted into the autothermal reactor, and the permeation efficiency of hydrogen was improved by steam purging on the permeate side. Finally, hydrogen with a purity of 99.99% was obtained.
Keywords: hydrogen production, auto thermal reforming, membrane, fuel cell
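To illustrate the receding-horizon idea behind the MPC scheme mentioned above, a minimal sketch is given under assumed dynamics: a toy first-order linear model of catalyst-bed temperature driven by a single feed rate, which is not the reactor model of the study.

    # Minimal receding-horizon (MPC-like) sketch on an assumed toy model:
    # T[k+1] = a*T[k] + b*u[k]; only the first optimized move is applied each step.
    import numpy as np
    from scipy.optimize import minimize

    a, b = 0.92, 4.0                  # assumed discrete-time model coefficients
    T_set, N = 400.0, 10              # temperature setpoint (deg C) and prediction horizon

    def cost(u_seq, T0):
        T, J = T0, 0.0
        for u in u_seq:
            T = a * T + b * u
            J += (T - T_set) ** 2 + 0.1 * u ** 2   # tracking error + control effort
        return J

    T = 350.0                         # current measured bed temperature (assumed)
    for step in range(5):             # receding-horizon loop
        res = minimize(cost, np.zeros(N), args=(T,), bounds=[(0.0, 10.0)] * N)
        u_now = res.x[0]              # apply only the first move, then re-optimize
        T = a * T + b * u_now
        print(f"step {step}: u = {u_now:.2f}, T = {T:.1f}")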
Procedia PDF Downloads 105
412 Opacity Synthesis with Orwellian Observers
Authors: Moez Yeddes
Abstract:
The property of opacity is widely used in the formal verification of security in computer systems and protocols. Opacity is a general language-theoretic scheme covering many security properties of systems. Opacity is parametrized with a framework in which several security properties of a system can be expressed. A secret behaviour of a system is opaque if a passive attacker can never deduce its occurrence from the system observation. Instead of considering the case of static observability, where the set of observable events is fixed off-line, or dynamic observability, where the set of observable events changes over time depending on the history of the trace, we introduce Orwellian partial observability, where unobservable events are not revealed provided that downgrading events never occur in the future of the trace. Orwellian partial observability is needed to model intransitive information flow. This Orwellian observability is known as the ipurge function. We showed in previous work that verifying whether a regular secret is opaque for a regular language L w.r.t. an Orwellian projection is PSPACE-complete, while it has been proved undecidable even for a regular language L w.r.t. a general Orwellian observation function. In this paper, we address two problems of opacification of a regular secret ϕ for a regular language L w.r.t. an Orwellian projection: given L and a secret ϕ ∈ L, the first problem consists in computing some minimal regular super-language M of L, if it exists, such that ϕ is opaque for M, and the second consists in computing the supremal sub-language M′ of L such that ϕ is opaque for M′. We derive both language-theoretic characterizations and algorithms to solve these two dual problems.
Keywords: security policies, opacity, formal verification, orwellian observation
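A small sketch of an Orwellian-style projection as described above may help fix the idea (the event names are assumed): an unobservable event stays hidden only if no downgrading event occurs later in the trace; otherwise it is revealed retroactively.

    # Toy Orwellian-style projection of a single trace (illustrative, not the formal ipurge).
    def orwellian_projection(trace, observable, downgrading):
        out = []
        for i, event in enumerate(trace):
            if event in observable:
                out.append(event)
            elif any(later in downgrading for later in trace[i + 1:]):
                out.append(event)        # revealed because a downgrading event follows
            # otherwise the event remains hidden (contributes nothing to the observation)
        return out

    trace = ["h1", "a", "h2", "d", "b", "h3"]
    print(orwellian_projection(trace, observable={"a", "b", "d"}, downgrading={"d"}))
    # -> ['h1', 'a', 'h2', 'd', 'b']   (h3 stays hidden: no downgrade occurs after it)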
Procedia PDF Downloads 225
411 Grey Wolf Optimization Technique for Predictive Analysis of Products in E-Commerce: An Adaptive Approach
Authors: Shital Suresh Borse, Vijayalaxmi Kadroli
Abstract:
E-commerce industries nowadays implement the latest AI and ML techniques to improve their performance and prediction accuracy. This helps to gain a huge profit from the online market. Ant Colony Optimization, Genetic Algorithms, Particle Swarm Optimization, Neural Networks and GWO help many e-commerce industries upgrade their predictive performance. These algorithms provide optimum results in various applications, such as stock price prediction, prediction of drug-target interaction and user ratings of similar products on e-commerce sites, etc. In this study, customer reviews play an important role in prediction analysis. People show much interest in buying services and products suggested by other customers. This ultimately increases net profit. In this work, a convolutional neural network (CNN) is proposed, which is further useful to optimize the prediction accuracy of an e-commerce website. This method shows that the CNN is used to optimize the hyperparameters of the GWO algorithm using an appropriate coding scheme. Accurate model results are verified by comparing them to PSO results, whose hyperparameters have also been optimized by the CNN, on Amazon's customer review dataset. Here, the experimental outcome proves that the proposed system using the GWO algorithm achieves superior execution in terms of accuracy, precision, recall, etc. in prediction analysis compared to existing systems.
Keywords: prediction analysis, e-commerce, machine learning, grey wolf optimization, particle swarm optimization, CNN
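For readers unfamiliar with GWO, a compact sketch of the standard Grey Wolf Optimizer update equations on a toy objective is given below; the hyperparameters and the sphere objective are illustrative only and unrelated to the e-commerce model above.

    # Standard GWO update equations applied to a toy sphere objective (illustrative).
    import numpy as np

    def gwo(obj, dim=5, n_wolves=20, iters=100, lb=-10.0, ub=10.0, seed=0):
        rng = np.random.default_rng(seed)
        X = rng.uniform(lb, ub, (n_wolves, dim))
        for t in range(iters):
            fitness = np.apply_along_axis(obj, 1, X)
            order = np.argsort(fitness)
            alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
            a = 2.0 - 2.0 * t / iters            # coefficient decreasing linearly from 2 to 0
            X_new = np.empty_like(X)
            for i in range(n_wolves):
                candidates = []
                for leader in (alpha, beta, delta):
                    A = 2.0 * a * rng.random(dim) - a
                    C = 2.0 * rng.random(dim)
                    D = np.abs(C * leader - X[i])
                    candidates.append(leader - A * D)
                X_new[i] = np.clip(np.mean(candidates, axis=0), lb, ub)
            X = X_new
        best = X[np.argmin(np.apply_along_axis(obj, 1, X))]
        return best, obj(best)

    best, val = gwo(lambda x: float(np.sum(x ** 2)))   # toy sphere function
    print("best objective value found:", val)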
Procedia PDF Downloads 113
410 Fault Analysis of Ship Power System Comprising of Parallel Generators and Variable Frequency Drive
Authors: Umair Ashraf, Kjetil Uhlen, Sverre Eriksen, Nadeem Jelani
Abstract:
Although advancements in technology have increased the reliability and ease of operation of ship power systems, these advancements also add complexity. Ever-increasing nonlinear loads, such as power electronics (PE) devices, affect the stability of the system. Frequent load variations and complex load dynamics are due to the frequency converters and motor drives; these problems are more prominent when the system is connected to a weak grid. In the ship power system, the major consumers are thruster motors for propulsion. For the control of these motors, variable frequency drives (VFDs) are used; mostly, VFDs operate at the nominal voltage of the system. Some of the consumers in the ship operate at a lower voltage than nominal; these consumers are supplied through step-down transformers. In this paper, the vector control scheme is used for the control of both the rectifier and the inverter, and parallel operation of the synchronous generators is also demonstrated. The simulations have been performed with an induction motor as a load on the VFD and a parallel RLC load. Fault analysis has been performed first for the system without the VFD and then for the system with the VFD. Three-phase-to-ground and single-phase-to-ground faults were implemented, and the behavior of the system in both cases was observed.
Keywords: non-linear load, power electronics, parallel operating generators, pulse width modulation, variable frequency drives, voltage source converters, weak grid
Procedia PDF Downloads 569
409 Using Authentic and Instructional Materials to Support Intercultural Communicative Competence in ELT
Authors: Jana Beresova
Abstract:
The paper presents a study carried out in 2015-2016 within the national research scheme VEGA 1/0106/15, based on theoretical research and empirical verification of the concept of intercultural communicative competence. It focuses on the current conception of target language teaching compatible with the Common European Framework of Reference for Languages: Learning, teaching, assessment. Our research revealed how the concept of intercultural communicative competence had been perceived by secondary-school teachers of English in Slovakia before they were intensively trained. Intensive workshops were based on the use of both authentic and instructional materials with the goal of supporting interculturally oriented language teaching aimed at challenging thinking. The former concept, which supported the development of the students' linguistic knowledge and the use of a target language to obtain information about the culture of the country whose language the learners were learning, was expanded by the meaning-making framework, which views language as a typical means by which culture is mediated. The goal of the workshops was to help English teachers better understand the concept of intercultural communicative competence, combining theory and practice optimally. The results of the study will be presented and analysed, providing particular recommendations for language teachers and suggesting some changes in the National Educational Programme from which English learners should benefit in their future studies or professional careers.
Keywords: authentic materials, English language teaching, instructional materials, intercultural communicative competence
Procedia PDF Downloads 270
408 Hybrid Temporal Correlation Based on Gaussian Mixture Model Framework for View Synthesis
Authors: Deng Zengming, Wang Mingjiang
Abstract:
As 3D video has been explored as a hot research topic in the last few decades, free-viewpoint TV (FTV) is no doubt a promising field for its better visual experience and incomparable interactivity. View synthesis is obviously a crucial technology for FTV; it enables rendering images at unlimited numbers of virtual viewpoints with the information from limited numbers of reference views. In this paper, a novel hybrid synthesis framework is proposed and blending priority is explored. In contrast to the commonly used View Synthesis Reference Software (VSRS), the presented synthesis process is driven in consideration of the temporal correlation of image sequences. The temporal correlations are exploited to produce fine synthesis results even near the foreground boundaries. As for the blending priority, this scheme proposes that one of the two reference views is selected to be the main reference view based on the distance between the reference views and the virtual view, while the other view is chosen as the auxiliary viewpoint, which merely assists in filling hole pixels with the help of background information. Significant improvement of the proposed approach over the state-of-the-art pixel-based virtual view synthesis method is presented; the results of the experiments show that subjective gains can be observed, and objective PSNR average gains range from 0.5 to 1.3 dB, while SSIM average gains range from 0.01 to 0.05.
Keywords: fusion method, Gaussian mixture model, hybrid framework, view synthesis
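A toy sketch of the blending-priority idea described above follows; the warped views, hole marker and distances are all synthetic placeholders, so this only illustrates the selection-and-fill logic, not the full synthesis pipeline.

    # Toy blending priority: the reference view closer to the virtual viewpoint is the
    # main view; the auxiliary view only fills pixels left as holes after warping.
    import numpy as np

    def blend(warped_left, warped_right, dist_left, dist_right, hole_value=-1):
        if dist_left <= dist_right:
            main, aux = warped_left, warped_right
        else:
            main, aux = warped_right, warped_left
        out = main.copy()
        holes = out == hole_value          # disoccluded pixels after warping
        out[holes] = aux[holes]            # fill holes from the auxiliary view
        return out

    left = np.array([[10, -1, 12], [13, 14, -1]])     # -1 marks holes (synthetic data)
    right = np.array([[11, 11, 12], [13, 15, 16]])
    print(blend(left, right, dist_left=0.4, dist_right=0.9))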
Procedia PDF Downloads 250
407 Boundary Layer Flow of a Casson Nanofluid Past a Vertical Exponentially Stretching Cylinder in the Presence of a Transverse Magnetic Field with Internal Heat Generation/Absorption
Authors: G. Sarojamma, K. Vendabai
Abstract:
An analysis is carried out to investigate the effect of a magnetic field and heat source on the steady boundary layer flow and heat transfer of a Casson nanofluid over a vertical cylinder stretching exponentially along its radial direction. Using a similarity transformation, the governing mathematical equations with the boundary conditions are reduced to a system of coupled, non-linear ordinary differential equations. The resulting system is solved numerically by the fourth-order Runge-Kutta scheme with a shooting technique. The influence of various physical parameters such as the Reynolds number, Prandtl number, magnetic field, Brownian motion parameter, thermophoresis parameter, Lewis number and the natural convection parameter is presented graphically and discussed for the non-dimensional velocity, temperature and nanoparticle volume fraction. Numerical data for the skin-friction coefficient, local Nusselt number and the local Sherwood number have been tabulated for various parametric conditions. It is found that the local Nusselt number is a decreasing function of the Brownian motion parameter Nb and the thermophoresis parameter Nt.
Keywords: casson nanofluid, boundary layer flow, internal heat generation/absorption, exponentially stretching cylinder, heat transfer, brownian motion, thermophoresis
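As a generic illustration of the Runge-Kutta/shooting approach used above (not the Casson nanofluid equations themselves), the sketch below solves the classical Blasius boundary-layer problem f''' + 0.5·f·f'' = 0 with f(0) = f'(0) = 0 and f'(∞) = 1 by shooting on the unknown f''(0).

    # Shooting method with an adaptive Runge-Kutta integrator on the Blasius problem.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import brentq

    def rhs(eta, y):                  # y = [f, f', f'']
        return [y[1], y[2], -0.5 * y[0] * y[2]]

    def residual(fpp0, eta_max=10.0):
        sol = solve_ivp(rhs, [0.0, eta_max], [0.0, 0.0, fpp0], method="RK45", rtol=1e-8)
        return sol.y[1, -1] - 1.0     # enforce the far-field condition f'(eta_max) -> 1

    fpp0 = brentq(residual, 0.1, 1.0)
    print(f"shooting result f''(0) = {fpp0:.4f}  (classical value ~0.3321)")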
Procedia PDF Downloads 389
406 Deep Learning-Based Classification of 3D CT Scans with Real Clinical Data; Impact of Image Format
Authors: Maryam Fallahpoor, Biswajeet Pradhan
Abstract:
Background: Artificial intelligence (AI) serves as a valuable tool in mitigating the scarcity of human resources required for the evaluation and categorization of vast quantities of medical imaging data. When AI operates with optimal precision, it minimizes the demand for human interpretations and, thereby, reduces the burden on radiologists. Among various AI approaches, deep learning (DL) stands out as it obviates the need for feature extraction, a process that can impede classification, especially with intricate datasets. The advent of DL models has ushered in a new era in medical imaging, particularly in the context of COVID-19 detection. Traditional 2D imaging techniques exhibit limitations when applied to volumetric data, such as Computed Tomography (CT) scans. Medical images predominantly exist in one of two formats: neuroimaging informatics technology initiative (NIfTI) and digital imaging and communications in medicine (DICOM). Purpose: This study aims to employ DL for the classification of COVID-19-infected pulmonary patients and normal cases based on 3D CT scans while investigating the impact of image format. Material and Methods: The dataset used for model training and testing consisted of 1245 patients from IranMehr Hospital. All scans shared a matrix size of 512 × 512, although they exhibited varying slice numbers. Consequently, after loading the DICOM CT scans, image resampling and interpolation were performed to standardize the slice count. All images underwent cropping and resampling, resulting in uniform dimensions of 128 × 128 × 60. Resolution uniformity was achieved through resampling to 1 mm × 1 mm × 1 mm, and image intensities were confined to the range of (−1000, 400) Hounsfield units (HU). For classification purposes, positive pulmonary COVID-19 involvement was designated as 1, while normal images were assigned a value of 0. Subsequently, a U-net-based lung segmentation module was applied to obtain 3D segmented lung regions. The pre-processing stage included normalization, zero-centering, and shuffling. Four distinct 3D CNN models (ResNet152, ResNet50, DenseNet169, and DenseNet201) were employed in this study. Results: The findings revealed that the segmentation technique yielded superior results for DICOM images, which could be attributed to the potential loss of information during the conversion of original DICOM images to NIfTI format. Notably, ResNet152 and ResNet50 exhibited the highest accuracy at 90.0%, and the same models achieved the best F1 score at 87%. ResNet152 also secured the highest Area Under the Curve (AUC) at 0.932. Regarding sensitivity and specificity, DenseNet201 achieved the highest values at 93% and 96%, respectively. Conclusion: This study underscores the capacity of deep learning to classify COVID-19 pulmonary involvement using real 3D hospital data. The results underscore the significance of employing DICOM-format 3D CT images alongside appropriate pre-processing techniques when training DL models for COVID-19 detection. This approach enhances the accuracy and reliability of diagnostic systems for COVID-19 detection.
Keywords: deep learning, COVID-19 detection, NIFTI format, DICOM format
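A short sketch of the kind of pre-processing pipeline described in the Methods section follows; the placeholder volume, interpolation order and normalization choices are assumptions made for illustration, not the study's exact implementation.

    # Sketch: clip intensities to (-1000, 400) HU, resample to 128 x 128 x 60,
    # then normalize and zero-center before feeding a 3D CNN.
    import numpy as np
    from scipy.ndimage import zoom

    def preprocess(volume_hu):
        vol = np.clip(volume_hu, -1000.0, 400.0)
        factors = (128 / vol.shape[0], 128 / vol.shape[1], 60 / vol.shape[2])
        vol = zoom(vol, factors, order=1)                    # linear interpolation
        vol = (vol - vol.min()) / (vol.max() - vol.min())    # normalize to [0, 1]
        return vol - vol.mean()                              # zero-center

    scan = np.random.uniform(-1200, 600, (256, 256, 75))     # placeholder CT volume
    x = preprocess(scan)
    print(x.shape, round(float(x.mean()), 6))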
Procedia PDF Downloads 88
405 A Postcolonial View Analysis on the Structural Rationalism Influence in Indonesian Modern Architecture
Authors: Ryadi Adityavarman
Abstract:
The study is an analysis, using a postcolonial theoretical lens, of the search for a distinctive architectural identity by architect Maclaine Pont in Indonesia in the early twentieth century. Influenced by progressive architectural thinking and enlightened humanism at the time, Pont applied the fundamental principles of Structural Rationalism by using a creative combination of traditional Indonesian architectural typology and innovative structural application. The interpretive design strategy also celebrated the creative use of local building materials with a sensible tropical climate design response. Moreover, his holistic architectural scheme, including the inclusion of local customs of building construction, represents the notion of Gesamtkunstwerk. By using such a hybrid strategy, Maclaine Pont intended to preserve the essential cultural identity and vernacular architecture of the indigenous people. The study chronologically investigates the evolution of the Structural Rationalism architectural philosophy from Viollet-le-Duc to Hendrik Berlage's influential design thinking in Dutch modern architecture, and subsequently to Maclaine Pont's innovative design in Indonesia. Consequently, the morphological analysis of his exemplary design works of the ITB campus (1923) and Pohsarang Church (1936) serves to understand the evolutionary influence of Structural Rationalism theory. The postmodern analysis method highlights the validity of Pont's ideas in contemporary Indonesian architecture within the culture of the globalism era.
Keywords: Indonesian modern architecture, postcolonial, structural rationalism, critical regionalism
Procedia PDF Downloads 339
404 Coding and Decoding versus Space Diversity for Rayleigh Fading Radio Frequency Channels
Authors: Ahmed Mahmoud Ahmed Abouelmagd
Abstract:
Diversity is the usual remedy for transmitted signal level variations (fading phenomena) in radio frequency channels. Diversity techniques utilize two or more copies of a signal and combine those signals to combat fading. The basic concept of diversity is to transmit the signal via several independent diversity branches to get independent signal replicas via the time, frequency, space and polarization diversity domains. Coding and decoding processes can be an alternative remedy for fading phenomena; they cannot increase the channel capacity, but they can improve the error performance. In this paper, we propose the use of replication decoding with the BCH code class, and the Viterbi decoding algorithm with convolutional coding, as examples of coding and decoding processes. The results are compared to those obtained from two optimized selection space diversity techniques. The performance of the Rayleigh fading channel, as the model considered for radio frequency channels, is evaluated for each case. The evaluation results show that the coding and decoding approaches, especially the BCH coding approach with the replication decoding scheme, give better performance compared to that of the selection space diversity optimization approaches. Also, an approach combining the coding and decoding diversity as well as the space diversity is considered; the main disadvantage of this approach is its complexity, but it yields good performance results.
Keywords: Rayleigh fading, diversity, BCH codes, Replication decoding, convolution coding, viterbi decoding, space diversity
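As a hedged Monte Carlo sketch of the selection-diversity baseline mentioned above (not the paper's exact simulation), the snippet below estimates the BER of BPSK over flat Rayleigh fading when the branch with the largest instantaneous channel gain is selected before detection.

    # BPSK over Rayleigh fading with L-branch selection combining (illustrative parameters).
    import numpy as np

    rng = np.random.default_rng(1)
    n_bits, snr_db, L = 200_000, 10.0, 2
    snr = 10 ** (snr_db / 10)

    bits = rng.integers(0, 2, n_bits)
    s = 2 * bits - 1                                            # BPSK symbols
    h = (rng.normal(size=(L, n_bits)) + 1j * rng.normal(size=(L, n_bits))) / np.sqrt(2)
    noise = (rng.normal(size=(L, n_bits)) + 1j * rng.normal(size=(L, n_bits))) / np.sqrt(2 * snr)
    r = h * s + noise

    best = np.argmax(np.abs(h), axis=0)                         # selection combining
    idx = np.arange(n_bits)
    detected = (np.real(np.conj(h[best, idx]) * r[best, idx]) > 0).astype(int)
    print("BER with selection diversity:", np.mean(detected != bits))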
Procedia PDF Downloads 443
403 Application of Powder Metallurgy Technologies for Gas Turbine Engine Wheel Production
Authors: Liubov Magerramova, Eugene Kratt, Pavel Presniakov
Abstract:
A detailed analysis has been performed for several schemes of gas turbine wheel production based on additive and powder technologies, including metal, ceramic, and stereolithography 3D printing. During the process of development and debugging of gas turbine engine components, different versions of these components must be manufactured and tested. Cooled turbine blades are among these components. They are usually produced by traditional casting methods. This method requires long and costly design and manufacture of casting molds. Moreover, traditional manufacturing methods limit the design possibilities of complex critical parts of the engine, so the capabilities of Powder Metallurgy Techniques (PMT) were analyzed to manufacture the turbine wheel with air-cooled blades. PMT dramatically reduce the time needed for such production and allow creating new complex design solutions aimed at improving the technical characteristics of the engine: improving fuel efficiency and environmental performance, increasing reliability, and reducing weight. To accelerate and simplify the blade manufacturing process, several options based on additive technologies were used. The options were implemented in the form of various casting equipment for the manufacturing of blades. Methods of powder metallurgy were applied for connecting the blades with the disc. The optimal production scheme and a set of technologies for the manufacturing of blades, the turbine wheel and other parts of the engine can be selected on the basis of the options considered.
Keywords: additive technologies, gas turbine engine, powder technology, turbine wheel
Procedia PDF Downloads 320
402 The Intersection/Union Region Computation for Drosophila Brain Images Using Encoding Schemes Based on Multi-Core CPUs
Authors: Ming-Yang Guo, Cheng-Xian Wu, Wei-Xiang Chen, Chun-Yuan Lin, Yen-Jen Lin, Ann-Shyn Chiang
Abstract:
With more and more Drosophila Driver and Neuron images, it is important to find the similarity relationships among them for functional inference. A general problem is how to find a Drosophila Driver image that can cover a set of Drosophila Driver/Neuron images. In order to solve this problem, the intersection/union region for a set of images should be computed first, and then a comparison is used to calculate the similarities between the region and other images. In this paper, three encoding schemes, namely Integer, Boolean and Decimal, are proposed to encode each image as a one-dimensional structure. Then, the intersection/union region from these images can be computed by using compare operations, Boolean operators and a lookup table method. Finally, the comparison is done as the union region computation, and the similarity score can be calculated by the definition of the Tanimoto coefficient. The above methods for the region computation are also implemented in a multi-core CPU environment with OpenMP. From the experimental results, in the encoding phase, the performance of the Boolean scheme is better than that of the others; in the region computation phase, the performance of the Decimal scheme is the best when the number of images is large. A speedup ratio of 12 can be achieved with 16 CPUs. This work was supported by the Ministry of Science and Technology under the grant MOST 106-2221-E-182-070.
Keywords: Drosophila driver image, Drosophila neuron images, intersection/union computation, parallel processing, OpenMP
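A small sketch of the Boolean-encoding idea with a Tanimoto similarity score is shown below; each image is flattened to a one-dimensional boolean mask, and the arrays are synthetic stand-ins for Drosophila Driver/Neuron images (the Integer and Decimal encodings and the OpenMP parallelization are not reproduced here).

    # Boolean encoding, union/intersection over an image set, and a Tanimoto score.
    import numpy as np

    imgs = np.random.default_rng(0).random((4, 1024)) > 0.7     # 4 boolean-encoded images

    union = np.logical_or.reduce(imgs)           # union region over the image set
    inter = np.logical_and.reduce(imgs)          # intersection region over the set

    query = np.random.default_rng(1).random(1024) > 0.7
    def tanimoto(a, b):
        return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

    print("Tanimoto(query, union) =", round(float(tanimoto(query, union)), 3))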
Procedia PDF Downloads 239
401 An Islamic Microfinance Business Model in Bangladesh and Its Role in Poverty Alleviation
Authors: Abul Hassan
Abstract:
The present socio-economic context and women's wellbeing in Bangladesh impose many constraints on women's involvement in income-generating activities. Different studies have shown that the implementation of World Bank structural adjustment policies has had mixed impacts on women and their wellbeing. Islamic microfinance programmes in Bangladesh, by involving poor people and especially women, are used as a tool to combat poverty. Women are specifically targeted by Islamic microfinance under the rural development scheme of Islami Bank Bangladesh, which provides interest-free loans to women's groups. The programme has a multiplier effect since women invest largely in their households. The aim of this research is twofold: firstly, to confirm or refute a positive link between Islamic microfinance and the socio-economic wellbeing of women in Bangladesh and, secondly, to explore the context in which Islamic microfinance programs function in Bangladesh and the way their performance can be improved. Based on a structured questionnaire survey, this study addressed two research questions: (1) what can be expected from the offer of Islamic microfinance on the welfare of recipients, and (2) under what conditions would such an offer be more beneficial. The main result of this study shows that the increase in women's income and assets played a very important role in enhancing women's economic independence and sense of self-confidence. An important policy recommendation is that it is necessary to redirect Islamic microfinance towards diversified developmental activities that will contribute to the improvement, in the long run, of the wellbeing of the recipients.
Keywords: business model, Islamic microfinance, women's wellbeing
Procedia PDF Downloads 388
400 Soret and Dufour's Effects on Mixed Convection Unsteady MHD Boundary Layer Flow over a Stretching Sheet Embedded in a Porous Medium with Chemically Reactive Species
Authors: Deva Kanta Phukan
Abstract:
An investigation is carried out to study the thermal-diffusion and diffusion-thermo effects in hydromagnetic unsteady flow by a mixed convection boundary layer past an impermeable vertical stretching sheet embedded in a conducting fluid-saturated porous medium in the presence of a chemical reaction effect. The velocity of the stretching surface, the surface temperature and the concentration are assumed to vary linearly with the distance along the surface. The governing partial differential equations are transformed into self-similar unsteady equations using similarity transformations and solved numerically by the Runge-Kutta fourth-order scheme in association with the shooting method for the whole transient domain from the initial state to the final steady-state flow. Numerical results for the velocity, temperature, concentration, skin friction, and the Nusselt and Sherwood numbers are shown graphically for various flow parameters. The results reveal that there is a smooth transition of flow from the unsteady state to the final steady state. A special case of our results is in good agreement with an earlier published work.
Keywords: heat and mass transfer, boundary layer flow, porous media, magnetic field, Soret number, Dufour's number
Procedia PDF Downloads 445
399 Fatty Acid Metabolism in Hypertension
Authors: Yin Hua Zhang
Abstract:
Cardiac metabolism is essential in myocardial contraction. In addition to glucose, fatty acids (FA) are essential in producing energy in the myocardium, since FA-dependent beta-oxidation accounts for >70-90% of cellular ATP under resting conditions. However, metabolism shifts from FAs to glucose utilization during disease progression (e.g., hypertrophy and ischemic myocardium), where glucose oxidation and glycolysis become the predominant sources of cellular ATP. At the advanced failing stage, both glycolysis and beta-oxidation are dysregulated, resulting in an insufficient supply of intracellular ATP and weakened myocardial contractility. Undeniably, our understanding of myocyte function in healthy and diseased hearts is based on glucose (10 mM)-dependent metabolism because glucose is the “sole” metabolic substrate in most physiological experiments. In view of the importance of FAs in cardiovascular health and disease, we aimed to elucidate the impacts of FA supplementation on myocyte contractility and evaluate the cellular mechanisms that mediate these functions in the normal heart and under pathological stress. In particular, we have investigated cardiac excitation-contraction (E-C) coupling in the presence and absence of FAs in normal and hypertensive rat left ventricular (LV) myocytes. Our results reveal that FAs increase mitochondrial activity, intracellular [Ca²+]i, and LV myocyte contraction in healthy LV myocytes, whereas FA-dependent cardiac inotropy is attenuated in hypertension. FA-dependent myofilament Ca²+ desensitization could be fundamental in regulating [Ca²+]i. Collectively, FA supplementation resets the cardiac E-C coupling scheme in healthy and diseased hearts.
Keywords: hypertension, fatty acid, heart, calcium
Procedia PDF Downloads 109
398 Performance of BLDC Motor under Kalman Filter Sensorless Drive
Authors: Yuri Boiko, Ci Lin, Iluju Kiringa, Tet Yeap
Abstract:
The performance of a BLDC motor controlled by a Kalman filter-based position-sensorless drive is studied in terms of its dependence on variations of the system's parameters. The effects of changes in the system's parameters on the dynamic behavior of the state variables are verified. A closed-loop control scheme with a Kalman filter in the feedback line is simulated. Two separate data sampling modes are distinguished in analyzing the feedback output from the BLDC motor: (1) equal angular separation and (2) equal time intervals. In case (1), the data are collected at equal intervals Δθ of the rotor's angular position θᵢ, i.e., keeping Δθ=const. In case (2), the data collection time points tᵢ are separated by equal sampling time intervals Δt=const. The effects of the parameter changes on the sensorless control flow are demonstrated, in particular, reduction of the torque ripples, switching spikes, and torque load balancing. It is specifically shown that efficient suppression of commutation-induced torque ripples is achievable by selection of the sampling rate in the Kalman filter settings above a certain critical value. The computational cost of such suppression is shown to be higher for motors with lower inductance values of the windings.
Keywords: BLDC motor, Kalman filter, sensorless drive, state variables, torque ripples reduction, sampling rate
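For readers unfamiliar with the estimator in the feedback path, a generic linear Kalman filter predict/update step in textbook form is sketched below; the matrices are placeholders, not the BLDC motor model of the study.

    # One predict/update cycle of a discrete linear Kalman filter (illustrative model).
    import numpy as np

    def kalman_step(x, P, z, A, H, Q, R):
        # predict
        x_pred = A @ x
        P_pred = A @ P @ A.T + Q
        # update with measurement z
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

    A = np.array([[1.0, 0.01], [0.0, 1.0]])    # assumed [angle; speed] propagation model
    H = np.array([[1.0, 0.0]])                 # only the angle-like state is measured
    x, P = np.zeros(2), np.eye(2)
    x, P = kalman_step(x, P, z=np.array([0.05]), A=A, H=H,
                       Q=1e-4 * np.eye(2), R=np.array([[1e-2]]))
    print(x)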
Procedia PDF Downloads 148
397 External Noise Distillation in Quantum Holography with Undetected Light
Authors: Sebastian Töpfer, Jorge Fuenzalida, Marta Gilaberte Basset, Juan P. Torres, Markus Gräfe
Abstract:
This work presents an experimental and theoretical study of the noise resilience of quantum holography with undetected photons. Quantum imaging has become an important research topic in recent years, after its first publication in 2014. Following this research, advances towards different spectral ranges in detection and different optical geometries have been made. In particular, interest in the field of near-infrared to mid-infrared measurements has developed because of the unique characteristic that allows sampling a probe with photons of a different wavelength than the photons arriving at the detector. This promising effect can be used for medical applications, to measure in the so-called molecular fingerprint region, while using broadly available detectors for the visible spectral range. Further advances in the development of quantum imaging methods have been made through new measurement and detection schemes. One of these is quantum holography with undetected light. It combines digital phase-shifting holography with quantum imaging to extend the obtainable sample information, by measuring not only the object transmission but also its influence on the phase shift experienced by the transmitted light. This work presents extended research for the quantum holography with undetected light scheme regarding the influence of external noise. It is shown experimentally and theoretically that the sample information can still be retrieved at noise levels 250 times higher than the signal level, because the information is transmitted by the interferometric pattern. A detailed theoretical explanation is also provided.
Keywords: distillation, quantum holography, quantum imaging, quantum metrology
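To illustrate the digital phase-shifting ingredient mentioned above, the sketch below applies standard four-step phase-shifting retrieval to synthetic intensities; it is a generic interferometry example, not the undetected-photon setup itself.

    # Four-step phase-shifting: recover the object phase from intensities at 0/90/180/270 deg.
    import numpy as np

    phase_true = np.linspace(-np.pi, np.pi, 256).reshape(16, 16)   # hypothetical object phase
    A, B = 1.0, 0.8                                                # background and modulation
    I = [A + B * np.cos(phase_true + k * np.pi / 2) for k in range(4)]

    phase_rec = np.arctan2(I[3] - I[1], I[0] - I[2])               # recovered wrapped phase
    err = np.angle(np.exp(1j * (phase_rec - phase_true)))          # wrap-aware error
    print("max reconstruction error:", float(np.max(np.abs(err))))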
Procedia PDF Downloads 75
396 Performance Gap and near Zero Energy Buildings Compliance of Monitored Passivhaus in Northern Ireland, the Republic of Ireland and Italy
Authors: S. Colclough, V. Costanzo, K. Fabbri, S. Piraccini, P. Griffiths
Abstract:
The near Zero Energy Building (nZEB) standard is required for all buildings from 2020. The Passive House (PH) standard is a well-established low-energy building standard, having been designed over 25 years ago, and could potentially be used to achieve the nZEB standard in combination with renewables. By comparing measured performance with design predictions, this paper considers whether there is a performance gap for a number of monitored properties and assesses whether the nZEB standard can be achieved by following the well-established PH scheme. The analysis is carried out based on monitoring results from real buildings located in Northern Ireland, the Republic of Ireland and Italy, respectively, with particular focus on indoor air quality, including the assumed and measured indoor temperatures and heating periods for both standards as recorded during a full annual cycle. An analysis is also carried out on the energy performance certificates of each of the dwellings to determine if they meet the near Zero Energy Buildings primary energy consumption targets set in the respective jurisdictions. Each of the dwellings is certified as complying with the Passive House standard, and accordingly has very good insulation levels, heat recovery and ventilation systems of greater than 75% efficiency and an airtightness of less than 0.6 air changes per hour at 50 Pa. It is found that indoor temperature and relative humidity were within the comfort boundaries set in the design stage, while carbon dioxide concentrations were sometimes higher than the values suggested by the EN 15251 Standard for comfort class I, especially in bedrooms.
Keywords: monitoring campaign, nZEB (near zero energy buildings), Passivhaus, performance gap
Procedia PDF Downloads 152
395 Constructions of Linear and Robust Codes Based on Wavelet Decompositions
Authors: Alla Levina, Sergey Taranov
Abstract:
The classical approach to providing noise immunity and integrity of information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient algorithms for encoding and decoding information, but these codes concentrate their detection and correction abilities on certain error configurations. Robust codes can protect against any configuration of errors with a predetermined probability. This is accomplished by the use of perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. The wavelet transform is applied in various fields of science. Some of the wavelet applications are signal denoising, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling function coefficients of the wavelet. Based on linear wavelet codes, we develop robust codes that provide uniform protection against all errors. In this article, we propose two constructions of robust codes. The first class of robust codes is based on the multiplicative inverse in a finite field. In the second robust code construction, the redundancy part is the cube of the information part. Also, this paper investigates the characteristics of the proposed robust and linear codes.
Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability
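A toy sketch of the second construction mentioned above follows: the redundancy symbol is the cube of the information symbol, computed here in GF(2^8) with the irreducible polynomial x^8 + x^4 + x^3 + x + 1 as an assumed field choice (the wavelet-based part of the scheme is not reproduced).

    # Toy "cube" redundancy in GF(2^8): codeword = (x, x^3); a nonzero syndrome flags an error.
    def gf_mul(a, b, poly=0x11B):
        r = 0
        while b:
            if b & 1:
                r ^= a
            a <<= 1
            if a & 0x100:
                a ^= poly
            b >>= 1
        return r

    def encode(x):
        return x, gf_mul(x, gf_mul(x, x))        # (information, redundancy = x^3)

    def check(x, r):
        return r ^ gf_mul(x, gf_mul(x, x))       # 0 means consistent, nonzero flags an error

    x, r = encode(0x57)
    print(hex(r), check(x, r), check(x ^ 0x01, r))   # last value is nonzero -> error detected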
Procedia PDF Downloads 489
394 Water Diffusivity in Amorphous Epoxy Resins: An Autonomous Basin Climbing-Based Simulation Method
Authors: Betim Bahtiri, B. Arash, R. Rolfes
Abstract:
Epoxy-based materials are frequently exposed to high-humidity environments in many engineering applications. As a result, their material properties are degraded by water absorption. A full characterization of the material properties under hygrothermal conditions requires time-consuming and costly experimental tests. To gain insights into the physics of diffusion mechanisms, atomistic simulations have been shown to be effective tools. Concerning the diffusion of water in polymers, spatial trajectories of water molecules are obtained from molecular dynamics (MD) simulations, allowing the interpretation of diffusion pathways at the nanoscale in a polymer network. Conventional MD simulations of water diffusion in amorphous polymers lead to discrepancies at low temperatures due to the short timescales of the simulations. In the proposed model, this issue is solved by using a combined scheme of autonomous basin climbing (ABC) with kinetic Monte Carlo and reactive MD simulations to investigate the diffusivity of water molecules in epoxy resins across a wide range of temperatures. It is shown that the proposed simulation framework estimates kinetic properties of water diffusion in epoxy resins that are consistent with experimental observations and provides a predictive tool for investigating the diffusion of small molecules in other amorphous polymers.
Keywords: epoxy resins, water diffusion, autonomous basin climbing, kinetic Monte Carlo, reactive molecular dynamics
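As a short illustration of how a diffusion coefficient is typically extracted from molecular trajectories via the Einstein relation D = MSD/(6t), a sketch follows; the random-walk trajectories and units are synthetic placeholders, not output of the ABC/kMC/reactive-MD framework described above.

    # Estimate D from the slope of the mean-squared displacement (synthetic trajectories).
    import numpy as np

    rng = np.random.default_rng(2)
    n_mol, n_steps, dt = 50, 2000, 1e-3                         # assumed units: nm, ns
    steps = rng.normal(scale=0.05, size=(n_steps, n_mol, 3))    # per-step displacements
    traj = np.cumsum(steps, axis=0)                             # positions over time

    msd = np.mean(np.sum((traj - traj[0]) ** 2, axis=2), axis=1)
    t = np.arange(1, n_steps + 1) * dt
    D = np.polyfit(t[n_steps // 2:], msd[n_steps // 2:], 1)[0] / 6.0   # fit the linear regime
    print(f"estimated D ~ {D:.3f} nm^2/ns")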
Procedia PDF Downloads 67
393 Performance Comparison of Resource Allocation without Feedback in Wireless Body Area Networks by Various Pseudo Orthogonal Sequences
Authors: Ojin Kwon, Yong-Jin Yoon, Liu Xin, Zhang Hongbao
Abstract:
A Wireless Body Area Network (WBAN) is a short-range wireless communication network around the human body for various applications such as wearable devices, entertainment, military, and especially medical devices. WBANs attract attention for continuous health monitoring systems, including diagnostic procedures, early detection of abnormal conditions, and prevention of emergency situations. Compared to cellular networks, it is more difficult to control inter- and intra-cell interference in WBAN systems due to the limited power, limited computational capability, mobility of the patient, and non-cooperation among WBANs. In this paper, we compare the performance of resource allocation schemes based on several Pseudo Orthogonal Codewords (POCs) to mitigate inter-WBAN interference. Previously, POCs have been widely exploited for protocol sequences and optical orthogonal codes. Each POC has different auto- and cross-correlation properties and spectral efficiency according to its construction. To identify different WBANs, several different pseudo orthogonal patterns based on POCs are exploited for resource allocation of WBANs. By simulating these pseudo orthogonal resource allocations of WBANs in MATLAB, we obtain the performance of WBANs according to different POCs and can analyze and evaluate the suitability of POCs for resource allocation in WBAN systems.
Keywords: wireless body area network, body sensor network, resource allocation without feedback, interference mitigation, pseudo orthogonal pattern
Procedia PDF Downloads 353
392 Efficacy of Deep Learning for Below-Canopy Reconstruction of Satellite and Aerial Sensing Point Clouds through Fractal Tree Symmetry
Authors: Dhanuj M. Gandikota
Abstract:
Sensor-derived three-dimensional (3D) point clouds of trees are invaluable in remote sensing analysis for the accurate measurement of key structural metrics, bio-inventory values, spatial planning/visualization, and ecological modeling. Machine learning (ML) holds potential for addressing the restrictive tradeoffs in cost, spatial coverage, resolution, and information gain that exist in current point cloud sensing methods. Terrestrial laser scanning (TLS) remains the highest-fidelity source of both canopy and below-canopy structural features, but usage is limited in both coverage and cost, requiring manual deployment to map out large, forested areas. While aerial laser scanning (ALS) remains a reliable avenue of active LIDAR remote sensing, ALS is also cost-restrictive in deployment methods. Space-borne photogrammetry from high-resolution satellite constellations is an avenue of passive remote sensing with promising viability in research for the accurate construction of vegetation 3D point clouds. It provides both the lowest comparative cost and the largest spatial coverage across remote sensing methods. However, both space-borne photogrammetry and ALS demonstrate technical limitations in the capture of valuable below-canopy point cloud data. Looking to minimize these tradeoffs, we explored a class of powerful ML algorithms called deep learning (DL) that shows promise in recent research on 3D point cloud reconstruction and interpolation. Our research details the efficacy of applying these DL techniques to reconstruct accurate below-canopy point clouds from space-borne and aerial remote sensing through learned patterns of tree species fractal symmetry properties and the supplementation of locally sourced bio-inventory metrics. From our dataset, consisting of tree point clouds obtained from TLS, we deconstructed the point clouds of each tree into those that would be obtained through ALS and satellite photogrammetry of varying resolutions. We fed this ALS/satellite point cloud dataset, along with the simulated local bio-inventory metrics, into the DL point cloud reconstruction architectures to generate the full 3D tree point clouds (the truth values are denoted by the full TLS tree point clouds containing the below-canopy information). Point cloud reconstruction accuracy was validated both through the measurement of error from the original TLS point clouds and through the error of extraction of key structural metrics, such as crown base height, diameter above root crown, and leaf/wood volume. The results of this research additionally demonstrate the supplemental performance gain of using minimal locally sourced bio-inventory metric information as an input in ML systems to reach specified accuracy thresholds of tree point cloud reconstruction. This research provides insight into methods for the rapid, cost-effective, and accurate construction of below-canopy tree 3D point clouds, as well as the supported potential of ML and DL to learn complex, unmodeled patterns of fractal tree growth symmetry.
Keywords: deep learning, machine learning, satellite, photogrammetry, aerial laser scanning, terrestrial laser scanning, point cloud, fractal symmetry
Procedia PDF Downloads 103
391 An Efficient Traceability Mechanism in the Audited Cloud Data Storage
Authors: Ramya P, Lino Abraham Varghese, S. Bose
Abstract:
With cloud storage services, data can be stored in the cloud and shared across multiple users. Due to unexpected hardware/software failures and human errors, the data stored in the cloud can be lost or corrupted easily, which affects the integrity of data in the cloud. Some mechanisms have been designed to allow both data owners and public verifiers to efficiently audit cloud data integrity without retrieving the entire data from the cloud server. But public auditing on the integrity of shared data with the existing mechanisms will unavoidably reveal confidential information, such as the identity of the person, to public verifiers. Here, a privacy-preserving mechanism is proposed to support public auditing on shared data stored in the cloud. It uses group signatures to compute the verification metadata needed to audit the correctness of shared data. The identity of the signer of each block in the shared data is kept confidential from public verifiers, who can easily verify shared data integrity without retrieving the entire file. But on demand, the signer of each block is revealed to the owner alone. The group private key is generated once by the owner in the static group, whereas in the dynamic group, the group private key changes when users revoke from the group. When users leave the group, the already signed blocks are re-signed by the cloud service provider instead of the owner, which is efficiently handled by an efficient proxy re-signature scheme.
Keywords: data integrity, dynamic group, group signature, public auditing
Procedia PDF Downloads 392
390 Smart Water Main Inspection and Condition Assessment Using a Systematic Approach for Pipes Selection
Authors: Reza Moslemi, Sebastien Perrier
Abstract:
Water infrastructure deterioration can result in increased operational costs owing to increased repair needs and non-revenue water, and consequently cause a reduced level of service and customer satisfaction. Various water main condition assessment technologies have been introduced to the market in order to evaluate the level of pipe deterioration and to develop appropriate asset management and pipe renewal plans. One of the challenges for any condition assessment and inspection program is to determine the percentage of the water network and the combination of pipe segments to be inspected in order to obtain a meaningful representation of the status of the entire water network with a desirable level of accuracy. Traditionally, condition assessment has been conducted by selecting pipes based on age or location. However, this may not necessarily offer the best approach, and it is believed that by using a smart sampling methodology, a better and more reliable estimate of the condition of a water network can be achieved. This research investigates three different sampling methodologies, including random, stratified, and systematic. It is demonstrated that selecting pipes based on the proposed clustering and sampling scheme can considerably improve the ability of the inspected subset to represent the condition of the wider network. With a smart sampling methodology, a smaller data sample can provide the same insight as a larger sample. This methodology offers increased efficiency and cost savings for condition assessment processes and projects.
Keywords: condition assessment, pipe degradation, sampling, water main
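A small sketch of the three sampling strategies compared above, applied to a made-up inventory of pipe segments, is given below; grouping by material for the stratified case is an assumption chosen only for illustration.

    # Random, systematic, and stratified selection of pipe segments for inspection.
    import random

    random.seed(3)
    pipes = [{"id": i, "material": random.choice(["CI", "DI", "PVC"])} for i in range(500)]
    k = 50    # number of segments to inspect

    random_sample = random.sample(pipes, k)

    step = len(pipes) // k
    systematic_sample = pipes[random.randrange(step)::step][:k]   # every step-th segment

    stratified_sample = []
    for mat in {"CI", "DI", "PVC"}:                               # proportional allocation
        group = [p for p in pipes if p["material"] == mat]
        stratified_sample += random.sample(group, round(k * len(group) / len(pipes)))

    print(len(random_sample), len(systematic_sample), len(stratified_sample))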
Procedia PDF Downloads 150
389 Technical, Environmental and Financial Assessment for Optimal Sizing of Run-of-River Small Hydropower Project: Case Study in Colombia
Authors: David Calderon Villegas, Thomas Kaltizky
Abstract:
Run-of-river (RoR) hydropower projects represent a viable, clean, and cost-effective alternative to dam-based plants and provide decentralized power production. However, the cost-effectiveness of RoR schemes depends on the proper selection of site and design flow, which is a challenging task because it requires multivariate analysis. In this respect, this study presents the development of an investment decision support tool for assessing the optimal size of an RoR scheme considering the technical, environmental, and cost constraints. The net present value (NPV) from a project perspective is used as the objective function for supporting the investment decision. The tool has been tested by applying it to an actual RoR project recently proposed in Colombia. The obtained results show that the optimum point in financial terms does not match the flow that maximizes energy generation from exploiting the river's available flow. For the case study, the flow that maximizes energy corresponds to a value of 5.1 m3/s, while a flow of 2.1 m3/s maximizes the investor's NPV. Finally, a sensitivity analysis is performed to determine the NPV as a function of changes in the debt rate, the electricity prices, and the CapEx. Even for the worst-case scenario, the optimal size represents a positive business case, with an NPV of 2.2 million USD and an IRR 1.5 times higher than the discount rate.
Keywords: small hydropower, renewable energy, RoR schemes, optimal sizing, objective function
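A minimal sketch of the NPV objective used for sizing, together with an IRR check against the discount rate, follows; all cash-flow figures are placeholders, not the Colombian case-study numbers.

    # NPV of a candidate design flow and its IRR (illustrative cash flows only).
    from scipy.optimize import brentq

    def npv(rate, cashflows):                 # cashflows[0] is the (negative) CapEx at t = 0
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

    capex = -6.0e6                            # USD, assumed
    annual_net_revenue = 0.9e6                # USD/year for an assumed design flow
    cashflows = [capex] + [annual_net_revenue] * 20

    discount_rate = 0.10
    print("NPV =", round(npv(discount_rate, cashflows), 0))
    irr = brentq(lambda r: npv(r, cashflows), 1e-6, 1.0)
    print("IRR =", round(irr, 4))             # compare against the discount rate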
Procedia PDF Downloads 132