Search results for: application analysis
30518 X-Ray Crystallographic, Hirshfeld Surface Analysis and Docking Study of Phthalyl Sulfacetamide
Authors: Sanjay M. Tailor, Urmila H. Patel
Abstract:
Phthalyl Sulfacetamide is a well-known member of the antimicrobial sulfonamide family and a potent antitumor drug. The structural characteristics of 4-amino-N-(2-quinoxalinyl) benzene-sulfonamide (Phthalyl Sulfacetamide), C14H12N4O2S, have been studied by X-ray crystallography. The compound crystallizes in the monoclinic space group P21/n with unit cell parameters a = 7.9841 Å, b = 12.8208 Å, c = 16.6607 Å, α = 90˚, β = 93.23˚, γ = 90˚ and Z = 4. The X-ray based three-dimensional structure analysis has been carried out by direct methods and refined to an R-value of 0.0419. The crystal structure is stabilized by intermolecular N-H…N, N-H…O and π-π interactions. Hirshfeld surface and fingerprint analyses have been performed to study the nature of the intermolecular interactions and their quantitative contributions to the crystal packing. An analysis of Hirshfeld surfaces and fingerprint plots facilitates a comparison of intermolecular interactions, which are the key elements in building different supramolecular architectures. Docking is used for virtual screening to predict the strongest binders based on various scoring functions. Docking studies are carried out on Phthalyl Sulfacetamide to screen for better activity, which is important for the development of a new class of inhibitors.
Keywords: phthalyl sulfacetamide, crystal structure, Hirshfeld surface analysis, docking
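As a small worked example, the unit-cell volume implied by the reported monoclinic parameters can be computed as V = a·b·c·sin β (since α = γ = 90˚). The sketch below is illustrative only; the function name is invented:

```python
import math

def monoclinic_cell_volume(a, b, c, beta_deg):
    # For a monoclinic lattice (alpha = gamma = 90 deg), the unit-cell
    # volume reduces to V = a * b * c * sin(beta).
    return a * b * c * math.sin(math.radians(beta_deg))

# Reported parameters for Phthalyl Sulfacetamide (angstroms, degrees)
volume = monoclinic_cell_volume(7.9841, 12.8208, 16.6607, 93.23)
```

With these values the volume comes to roughly 1.7 × 10³ Å³.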
Procedia PDF Downloads 347
30517 Static Light Scattering Method for the Analysis of Raw Cow's Milk
Authors: V. Villa-Cruz, H. Pérez-Ladron de Guevara, J. E. Diaz-Díaz
Abstract:
Static Light Scattering (SLS) was used as a method to analyse raw cow's milk from the town of Lagos de Moreno, Jalisco, Mexico. The method is based on the analysis of the scattering of laser light produced by a set of particles in solution. On this basis, raw milk, which contains fat globules with a diameter of about 2000 nm and protein micelles about 300 nm in diameter, was analyzed. Dilutions of commercial milk were made (1.0%, 2.0% and 3.3%) to obtain a reference pattern of laser light scattering, and measurements of raw cow's milk were also made. Readings were taken in an angular sweep from 10° to 170°, and the results were analyzed with the program OriginPro 7. The SLS method gives an estimate of the percentage of fat content in milk samples. It can be concluded that SLS is a quick analysis method for detecting adulteration in raw cow's milk.
Keywords: light scattering, milk analysis, adulteration in milk, micelles, OriginPro
Procedia PDF Downloads 375
30516 Dynamic Environmental Impact Study during the Construction of the French Nuclear Power Plants
Authors: A. Er-Raki, D. Hartmann, J. P. Belaud, S. Negny
Abstract:
This paper has a double purpose: firstly, a literature review of life cycle analysis (LCA) and, secondly, a comparison between conventional (static) LCA and multi-level dynamic LCA on the following items: (i) the evolution of inventories with time and (ii) the temporal evolution of the databases. The first part of the paper summarizes the state of the art of the static LCA approach. The limits of static LCA have been identified, especially the non-consideration of spatial and temporal evolution in the inventory, in the characterization factors (CFs) and in the databases. A description of the different levels of integration of the notion of temporality in life cycle analysis studies is then given. In the second part, the dynamic inventory has been evaluated, firstly for a single nuclear plant and secondly for the entire French nuclear power fleet, by taking into account the construction durations of all the plants. In addition, the databases have been adapted by integrating the temporal variability of the French energy mix. Several iterations were used to converge towards the real environmental impact of the energy mix. The databases were further adapted to take into account the temporal evolution of the market data for raw materials. The energy mix of the period studied was identified by extrapolating the production reference values of each means of production. An application to the construction of the French nuclear power plants from 1971 to 2000 has been performed, in which a dynamic inventory of raw materials has been evaluated. The impacts were then characterized with the ILCD 2011 characterization method. In order to compare with a purely static approach, a static impact assessment was made with the V 3.4 Ecoinvent data sheets without adaptation, together with a static inventory assuming that all the power stations had been built at the same time.
Finally, a comparison between the static and dynamic LCA approaches was set up to determine the gap between them for each of the two levels of integration. The results were analyzed to identify the contribution of the evolving nuclear power fleet construction to the total environmental impacts of the French energy mix during the same period. An equivalent strategy using a dynamic approach will further be applied to identify the environmental impacts of different energy transition scenarios, making it possible to choose the best energy mix from an environmental viewpoint.
Keywords: LCA, static, dynamic, inventory, construction, nuclear energy, energy mix, energy transition
Procedia PDF Downloads 105
30515 A Correlation Analysis of an Effective Music Education with Students’ Mathematical Performance
Authors: Yoon Suh Song
Abstract:
Though music education can broaden one’s capacity for mathematical performance, many countries lag behind in music education, and little empirical evidence has been found to identify the connection between math and music. Therefore, this research set out to explore which music-related variables are associated with mathematical performance. The result of our analysis is as follows: a Pearson correlation analysis revealed that the PISA math score is strongly correlated with students' Intelligence Quotient (IQ). This lays the foundation for further research into which factors in students’ IQ lead to better performance in math.
Keywords: music education, mathematical performance, education, IQ
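The Pearson correlation mentioned above can be sketched in a few lines of Python. This is a generic illustration; the sample values below are invented and are not the PISA data:

```python
import math

def pearson_r(x, y):
    # Pearson product-moment correlation coefficient between two samples.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical IQ / math-score pairs, for illustration only
iq = [95, 100, 105, 110, 120, 130]
math_score = [430, 460, 470, 500, 540, 580]
r = pearson_r(iq, math_score)
```

A value of r near +1 indicates a strong positive linear association between the two variables.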
Procedia PDF Downloads 213
30514 Developing a Knowledge-Based Lean Six Sigma Model to Improve Healthcare Leadership Performance
Authors: Yousuf N. Al Khamisi, Eduardo M. Hernandez, Khurshid M. Khan
Abstract:
Purpose: This paper presents a Knowledge-Based (KB) model using Lean Six Sigma (L6σ) principles to enhance the performance of healthcare leadership. Design/methodology/approach: Using L6σ principles to enhance healthcare leaders’ performance requires a pre-assessment of the healthcare organisation’s capabilities. The model will be developed using a rule-based KB system approach. The KB system thus embeds Gauging Absence of Pre-requisite (GAP) for benchmarking and the Analytical Hierarchy Process (AHP) for prioritization. A comprehensive literature review covers the main contents of the model, with typical outputs of the GAP analysis and AHP. Findings: The proposed KB system benchmarks the current position of healthcare leadership against an ideal benchmark (resulting from extensive evaluation, by the KB/GAP/AHP system, of international leadership concepts in healthcare environments). Research limitations/implications: Future work includes validating the implementation model in healthcare environments around the world. Originality/value: This paper presents a novel application of a hybrid KB system combining the GAP and AHP methodologies. It implements L6σ principles to enhance healthcare performance. This approach assists healthcare leaders’ decision making to reach performance improvement against a best-practice benchmark.
Keywords: Lean Six Sigma (L6σ), Knowledge-Based System (KBS), healthcare leadership, Gauge Absence Prerequisites (GAP), Analytical Hierarchy Process (AHP)
Procedia PDF Downloads 166
30513 Application of Microparticulated Whey Proteins in Reduced-Fat Yogurt through Hot-Extrusion: Influence on Physicochemical and Sensory Properties
Authors: M. K. Hossain, J. Keidel, O. Hensel, M. Diakite
Abstract:
Fat-reduced dairy products hold a potential market for health reasons. Being less creamy and pleasant, reduced- and/or low-fat dairy products receive less consumer acceptance, because fat provides a smooth, creamy and pleasant mouthfeel in dairy products, especially yogurt and ice cream. This study aimed to investigate whether, with the application of microparticulated whey proteins (MWPs) processed by extrusion cooking, reduced-fat yogurt can achieve creaminess similar to or higher than that of whole milk (3.8% fat) and skimmed milk (0.5% fat) yogurt. Full cream and skimmed milk were used to prepare natural stirred yogurt, and the dry matter content was adjusted up to 16% with skimmed milk powder. Whey protein concentrate (WPC80) was used to produce MWPs with particle sizes of d50 > 5 µm, d50 3-5 µm and d50 < 3 µm through the hot-extrusion process with screw speeds of 400, 600 and 1000 rpm, respectively. Furthermore, the commercially available microparticulated whey protein Simplesse® was also applied for comparison with the extruded MWPs. The rheological and sensory properties of the yogurts were assessed, and the data were analyzed statistically. The extruded MWPs produced at 600 and 1000 rpm achieved significantly (p < 0.05) higher creaminess and preference compared to the whole and skimmed milk yogurts, whereas the 400 rpm product received a lower preference. On the other hand, Simplesse® obtained the lowest creaminess and preference compared to the other yogurts, although its contribution to the dry matter in the yogurt was the same as that of the extruded MWPs. Creaminess and viscosity were strongly (r = 0.62) correlated; furthermore, the viscosity from the sensory evaluation and the dynamic viscosity of the yogurt were also significantly (r = 0.72) correlated, which confirms the performance of the sensory panelists as well as the quality of the products.
Keywords: microparticulation, hot-extrusion, reduced-fat yogurt, whey protein concentrate
Procedia PDF Downloads 130
30512 FRATSAN: A New Software for Fractal Analysis of Signals
Authors: Hamidreza Namazi
Abstract:
Fractal analysis is the assessment of the fractal characteristics of data. It consists of several methods to assign fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, networks, etc. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. For this purpose, a Visual C++ based software called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: the fractal dimension, Jeffrey’s measure and the Hurst exponent. After computing these measures, the software plots the graphs for each measure. Besides computing the three measures, the software can classify whether or not the signal is fractal. In fact, the software uses a dynamic method of analysis for all the measures: a sliding window is selected with a length equal to 10% of the total number of data entries, and this window is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, thereby giving the user an acute analysis of the data. In order to test the performance of the software, a set of EEG signals was given as input and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but also for other purposes. For instance, by analyzing the Hurst exponent plot of a given EEG signal in patients with epilepsy, the onset of a seizure can be predicted by noticing sudden changes in the plot.
Keywords: EEG signals, fractal analysis, fractal dimension, Hurst exponent, Jeffrey’s measure
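The sliding-window scheme described above (window length = 10% of the data, advanced one sample at a time) can be sketched as follows. The single-window rescaled-range (R/S) Hurst estimator used here is a common rough approximation, not necessarily the estimator implemented in FRATSAN:

```python
import math

def hurst_rs(x):
    # Rough single-window rescaled-range (R/S) estimate of the Hurst
    # exponent: H ~ log(R/S) / log(n). A proper estimate fits the slope
    # of log(R/S) against log(n) over several window sizes.
    n = len(x)
    mean = sum(x) / n
    dev = [xi - mean for xi in x]
    z, s = [], 0.0
    for d in dev:          # cumulative deviate series
        s += d
        z.append(s)
    r = max(z) - min(z)    # range of cumulative deviations
    sd = math.sqrt(sum(d * d for d in dev) / n)
    if r == 0 or sd == 0:  # constant window: no information
        return 0.5
    return math.log(r / sd) / math.log(n)

def sliding_hurst(signal, frac=0.10):
    # Window of 10% of the data, moved one entry at a time, as in the paper.
    w = max(8, int(len(signal) * frac))
    return [hurst_rs(signal[i:i + w]) for i in range(len(signal) - w + 1)]
```

Plotting the returned sequence against time gives the kind of Hurst-exponent trace in which sudden changes can be inspected.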
Procedia PDF Downloads 467
30511 Investigating the Body Paragraphs of English as a Second Language Students' English Academic Essays: Genre Analysis and Needs Analysis
Authors: Chek K. Loi
Abstract:
The present study has two objectives. Firstly, it investigates the rhetorical strategies employed in the body paragraphs of ESL (English as a Second Language) undergraduate students’ English academic essays. Peacock’s (2002) model of the discussion section was used as the starting point to investigate the rhetorical moves employed in the data. Secondly, it investigates the writing problems as perceived by these ESL students through an interview. The interview responses serve as accompanying data to the move analysis. Apart from this, the students’ English academic writing problems are diagnosed. The findings have pedagogical implications for an EAP (English for Academic Purposes) classroom.
Keywords: academic essays, move analysis, pedagogical implication, rhetorical strategies
Procedia PDF Downloads 276
30510 Applying Critical Realism to Qualitative Social Work Research: A Critical Realist Approach for Social Work Thematic Analysis Method
Authors: Lynne Soon-Chean Park
Abstract:
Critical Realism (CR) has emerged as an alternative to both the positivist and constructivist perspectives that have long dominated social work research. By unpacking the epistemic weaknesses of these two dogmatic perspectives, CR provides a useful philosophical approach that incorporates both the ontological objectivist and the subjectivist stance. The CR perspective suggests an alternative approach for social work researchers who have long been looking to engage with the complex interplay between perceived reality at the empirical level and the objective reality that lies behind the empirical event as a causal mechanism. However, despite the usefulness of CR in informing social work research, little practical guidance is available on how CR can inform methodological considerations in social work research studies. This presentation aims to provide a detailed description of CR-informed thematic analysis by drawing examples from doctoral social work research on Korean migrants’ experiences and understanding of trust associated with their settlement in New Zealand. Because of its theoretical flexibility and accessibility as a qualitative analysis method, thematic analysis can be applied as a method that works both to search for the demi-regularities of the collected data and to identify the causal mechanisms that lie behind the empirical data. In so doing, this presentation seeks to provide a concrete and detailed exemplar for social work researchers wishing to employ CR in their qualitative thematic analysis process.
Keywords: critical realism, data analysis, epistemology, research methodology, social work research, thematic analysis
Procedia PDF Downloads 212
30509 Performance Analysis with the Combination of Visualization and Classification Technique for Medical Chatbot
Authors: Shajida M., Sakthiyadharshini N. P., Kamalesh S., Aswitha B.
Abstract:
Natural Language Processing (NLP) continues to play a strategic part in complaint discovery and medicine discovery during the current epidemic. This abstract provides an overview of a performance analysis combining visualization and classification techniques of NLP for a medical chatbot. Sentiment analysis is an important aspect of NLP that is used to determine the emotional tone behind a piece of text. This technique has been applied to various domains, including medical chatbots. In this work, we have compared the combination of a decision tree with a heatmap and Naïve Bayes with a word cloud. The performance of the chatbot was evaluated using accuracy, and the results indicate that the combination of visualization and classification techniques significantly improves the chatbot's performance.
Keywords: sentiment analysis, NLP, medical chatbot, decision tree, heatmap, Naïve Bayes, word cloud
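As an illustration of the classification side, a minimal multinomial Naïve Bayes text classifier with Laplace smoothing can be sketched from scratch. The training snippets and labels below are invented; the paper's actual dataset and feature pipeline are not specified here:

```python
import math
from collections import Counter

def train_nb(docs):
    # docs: list of (token_list, label) pairs. Returns per-class log priors
    # and Laplace-smoothed per-word log likelihoods.
    vocab = {w for tokens, _ in docs for w in tokens}
    labels = {lbl for _, lbl in docs}
    word_counts = {lbl: Counter() for lbl in labels}
    doc_counts = Counter(lbl for _, lbl in docs)
    for tokens, lbl in docs:
        word_counts[lbl].update(tokens)
    model = {}
    for lbl in labels:
        total = sum(word_counts[lbl].values())
        log_prior = math.log(doc_counts[lbl] / len(docs))
        log_like = {w: math.log((word_counts[lbl][w] + 1) / (total + len(vocab)))
                    for w in vocab}
        model[lbl] = (log_prior, log_like)
    return model, vocab

def classify(model, vocab, tokens):
    # Score each class in log space and return the argmax;
    # words outside the training vocabulary are ignored.
    def score(lbl):
        log_prior, log_like = model[lbl]
        return log_prior + sum(log_like[w] for w in tokens if w in vocab)
    return max(model, key=score)
```

For example, a model trained on a few "pos"/"neg" snippets will assign an unseen message to whichever class its words make more likely.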
Procedia PDF Downloads 74
30508 Multi-Agent TeleRobotic Security Control System: Requirements Definitions of Multi-Agent System Using The Behavioral Patterns Analysis (BPA) Approach
Authors: Assem El-Ansary
Abstract:
This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing a Multi-Agent TeleRobotic Security Control System (MTSCS). The event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.
Keywords: analysis, multi-agent, TeleRobotics control, security, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases
Procedia PDF Downloads 438
30507 Simulation Analysis of Wavelength/Time/Space Codes Using CSRZ and DPSK-RZ Formats for Fiber-Optic CDMA Systems
Authors: Jaswinder Singh
Abstract:
In this paper, a comparative analysis is carried out to study the performance of wavelength/time/space optical CDMA codes using two well-known formats, CSRZ and DPSK-RZ, in RSoft’s OptSIM. The analysis is carried out under a realistic scenario considering the presence of various non-linear effects such as XPM, SPM, SRS, SBS and FWM. Fiber dispersion and multiple access interference are also considered. The codes used in this analysis are 3-D wavelength/time/space codes, which are converted into 2-D wavelength-time codes so that their requirement for space couplers and fiber ribbons is eliminated. Under the conditions simulated, it is found that CSRZ performs better than DPSK-RZ for fiber-optic CDMA applications.
Keywords: optical CDMA, multiple access interference (MAI), CSRZ, DPSK-RZ
Procedia PDF Downloads 645
30506 Seismic Performance Evaluation of Existing Building Using Structural Information Modeling
Authors: Byungmin Cho, Dongchul Lee, Taejin Kim, Minhee Lee
Abstract:
The procedure for the seismic retrofit of existing buildings includes a seismic evaluation, in which it is assessed whether the buildings have satisfactory performance against seismic load; based on the results, the buildings are upgraded. Evaluating the seismic performance of a building usually requires a model transformation from elastic analysis to inelastic analysis. However, when the data are not transferred automatically between the models, engineers must input them manually. Since this process leads to inaccuracy and loss of information, the results of the analysis become less accurate. Therefore, in this study, a process for the seismic evaluation of existing buildings using structural information modeling is suggested, which makes the work economical and accurate. To this end, it is determined which parts of the process could be computerized through an investigation of the seismic evaluation process based on ASCE 41. The structural information modeling process is developed for the seismic evaluation using the Perform 3D program, which is commonly used for nonlinear response history analysis. To validate this process, the seismic performance of an existing building is investigated.
Keywords: existing building, nonlinear analysis, seismic performance, structural information modeling
Procedia PDF Downloads 386
30505 Solution of Hybrid Fuzzy Differential Equations
Authors: Mahmood Otadi, Maryam Mosleh
Abstract:
Hybrid differential equations have a wide range of applications in science and engineering. In this paper, the homotopy analysis method (HAM) is applied to obtain series solutions of hybrid differential equations. Using the homotopy analysis method, it is possible to find the exact solution or an approximate solution of the problem. Comparisons are made between the improved predictor-corrector method, the homotopy analysis method and the exact solution. Finally, we illustrate our approach with some numerical examples.
Keywords: fuzzy number, fuzzy ODE, HAM, approximate method
Procedia PDF Downloads 511
30504 Statistical Analysis for Overdispersed Medical Count Data
Authors: Y. N. Phang, E. F. Loh
Abstract:
Many researchers have suggested the use of zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models in modeling over-dispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. These studies indicate that ZIP and ZINB always provide a better fit than the standard Poisson and negative binomial models for over-dispersed medical count data. In this study, we propose the use of zero-inflated inverse trinomial (ZIIT), zero-inflated Poisson inverse Gaussian (ZIPIG) and zero-inflated strict arcsine (ZISA) models for over-dispersed medical count data. These proposed models are not widely used by researchers, especially in the medical field. The results show that these three models can serve as alternatives in modeling over-dispersed medical count data, as supported by their application to a real-life medical data set. The inverse trinomial, Poisson inverse Gaussian and strict arcsine distributions are discrete distributions with a cubic variance function of the mean; therefore, ZIIT, ZIPIG and ZISA are able to accommodate data with excess zeros and very heavy tails. They are recommended for modeling over-dispersed medical count data when ZIP and ZINB are inadequate.
Keywords: zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit
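The kind of overdispersion these models target can be illustrated by simulating a zero-inflated Poisson sample and checking that its variance exceeds its mean. This is a minimal sketch with arbitrary parameter values, not the paper's data:

```python
import math
import random

def simulate_zip(n, pi_zero, lam, seed=42):
    # Zero-inflated Poisson: with probability pi_zero emit a structural
    # zero, otherwise draw from Poisson(lam) via Knuth's method.
    rng = random.Random(seed)
    sample = []
    for _ in range(n):
        if rng.random() < pi_zero:
            sample.append(0)
            continue
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                break
            k += 1
        sample.append(k)
    return sample

data = simulate_zip(5000, 0.3, 4.0)
mean = sum(data) / len(data)
var = sum((d - mean) ** 2 for d in data) / len(data)
# For a ZIP model, E[X] = (1 - pi) * lam and
# Var[X] = (1 - pi) * lam * (1 + pi * lam) > E[X] whenever pi > 0,
# so the sample variance should clearly exceed the sample mean.
```

A plain Poisson fit to such data would understate both the zero count and the tail, which is exactly why zero-inflated families are preferred.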
Procedia PDF Downloads 544
30503 From Madrassah to Elite Schools; The Political Economy of Pluralistic Educational Systems in Pakistan
Authors: Ahmad Zia
Abstract:
This study problematizes the notion that the pluralistic educational system in Pakistan fosters equality. Instead, it argues that this system not only reflects but also sustains existing class divisions, with implications for the future economic and social mobility of children. The primary goal of this study is to explore unequal access to educational opportunities in Pakistan. By examining the intersection between education and socioeconomic status, it explores the implications of key disparities between the different tiers of the education system in Pakistan, such as madrassahs, public schools and private schools, with an emphasis on how these institutions contribute to the maintenance of class hierarchies. This is a case study based on primary data, gathered directly from the units of data collection (UDCs) using qualitative methods. Bourdieu’s theory has been used as the leading framework; its application in the context of a country like Pakistan is very productive. The thematic analysis method was chosen to analyse the data; this process helped to identify the relevant main themes and subthemes emerging from the data. Findings reveal that the educational landscape in Pakistan is deeply divided, with far-reaching implications for social mobility and access to opportunities. The study found profound disparities among the various educational institutions with respect to widening socioeconomic divides. Every kind of educational institution operates in a distinct socio-cultural and economic environment; therefore, access to quality education is highly stratified and remains a privilege for those who can afford it, widening the socioeconomic gap that already exists. The relationship between pluralistic education and class stratification has not been extensively investigated in the literature so far.
This study adds to a multifaceted understanding of educational disparities in Pakistan by analysing the intersections between socioeconomic divisions and educational access. It offers valuable theoretical and practical insights into the subject, providing theoretical concepts and empirical data to enhance scholars' understanding of socioeconomic inequality, specifically in relation to education systems.
Keywords: social inequality, pluralism, class divide, capitalism, globalisation, elitism, education
Procedia PDF Downloads 10
30502 Spectral Coherence Analysis between Grinding Interaction Forces and the Relative Motion of the Workpiece and the Cutting Tool
Authors: Abdulhamit Donder, Erhan Ilhan Konukseven
Abstract:
The grinding operation is performed to obtain precise surfaces in the machining process. The needed relative motion between the cutting tool and the workpiece is generally created by the movement of the cutting tool, by the movement of the workpiece, or by the movement of both, as in our case. In all of these cases, the coherence level between the movements and the interaction forces is a key influential parameter for efficient grinding. Therefore, in this work, a spectral coherence analysis has been performed to investigate the coherence level between the grinding interaction forces and the movement of the workpiece on our robotic-grinding experimental setup in the METU Mechatronics Laboratory.
Keywords: coherence analysis, correlation, FFT, grinding, Hanning window, machining, piezo actuator, reverse arrangements test, spectral analysis
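Magnitude-squared coherence between a force signal and a motion signal can be estimated by Welch-style averaging of Hanning-windowed segments. The sketch below is a generic illustration in Python/NumPy; the segment length and overlap are arbitrary choices, not the authors' settings:

```python
import numpy as np

def msc(x, y, nperseg=128):
    # Magnitude-squared coherence C_xy(f) = |S_xy|^2 / (S_xx * S_yy),
    # estimated by averaging Hanning-windowed, half-overlapping segments.
    # Averaging over several segments is essential: with a single segment
    # the estimate is identically 1 at every frequency.
    x, y = np.asarray(x, float), np.asarray(y, float)
    win = np.hanning(nperseg)
    step = nperseg // 2
    n_seg = (len(x) - nperseg) // step + 1
    sxx = syy = 0.0
    sxy = 0.0 + 0.0j
    for i in range(n_seg):
        s = i * step
        fx = np.fft.rfft(win * x[s:s + nperseg])
        fy = np.fft.rfft(win * y[s:s + nperseg])
        sxx = sxx + np.abs(fx) ** 2
        syy = syy + np.abs(fy) ** 2
        sxy = sxy + fx * np.conj(fy)
    return np.abs(sxy) ** 2 / (sxx * syy)
```

A coherence value near 1 at a given frequency indicates that the force and the motion are linearly related there; values near 0 indicate no linear relationship at that frequency.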
Procedia PDF Downloads 405
30501 Application of the Electrical Resistivity Tomography and Tunnel Seismic Prediction 303 Methods for Detection Fracture Zones Ahead of Tunnel: A Case Study
Authors: Nima Dastanboo, Xiao-Qing Li, Hamed Gharibdoost
Abstract:
The purpose of this study is to investigate the geological properties ahead of a tunnel face using the Electrical Resistivity Tomography (ERT) and Tunnel Seismic Prediction (TSP303) methods. In deep tunnels with hydro-geological conditions, it is important to study the geological structures of the region before excavating; otherwise, unexpected accidents may occur that impose serious damage on the project. For the construction of the Nosoud tunnel in the west of Iran, the ERT and TSP303 methods were employed to predict the geological conditions dynamically during excavation. In this paper, based on the engineering background of the Nosoud tunnel, the important results of applying these methods are discussed. This work demonstrates seismic and electrical tomography as two geophysical techniques able to detect geological structures ahead of a tunnel face. The results of the two methods were in agreement with each other, but the results of TSP303 were more accurate and of higher quality. In this case, the TSP303 method was a useful tool for predicting unstable geological structures ahead of the tunnel face during excavation. Thus, using another geophysical method together with TSP303 could be helpful for decision support during excavation, especially in complicated geological conditions.
Keywords: tunnel seismic prediction (TSP303), electrical resistivity tomography (ERT), seismic wave, velocity analysis, low-velocity zones
Procedia PDF Downloads 149
30500 Characterization of Mg/Sc System for X-Ray Spectroscopy in the Water Window Range
Authors: Hina Verma, Karine Le Guen, Mohammed H. Modi, Rajnish Dhawan, Philippe Jonnard
Abstract:
Periodic multilayer mirrors have potential application as optical components in X-ray microscopy, particularly in the water window region. The water window range, located between the absorption edges of carbon (285 eV) and oxygen (530 eV) and containing the nitrogen K absorption edge (395 eV), makes this spectral range a powerful one for imaging biological samples due to the natural optical contrast between water and carbon. We characterized bilayer, trilayer, quadrilayer and multilayer Mg/Sc systems, prepared by ion beam sputtering, with thin ZrC layers introduced as barrier and capping layers. The introduction of ZrC as a barrier layer is expected to improve the structure of the Mg/Sc system, while the ZrC capping layer prevents the stack from oxidation. The structural analysis of the Mg/Sc systems was carried out using grazing incidence X-ray reflectivity (GIXRR) to obtain, non-destructively, a first description of the structural parameters: thickness, roughness and density of the layers. Resonant soft X-ray reflectivity measurements in the vicinity of the Sc L-absorption edge were performed to investigate and quantify the atomic distribution of the deposited layers. Near an absorption edge, the atomic scattering factor of an element changes sharply depending on its chemical environment inside the structure.
Keywords: buried interfaces, resonant soft X-ray reflectivity, X-ray optics, X-ray reflectivity
Procedia PDF Downloads 178
30499 Crossing Multi-Source Climate Data to Estimate the Effects of Climate Change on Evapotranspiration Data: Application to the French Central Region
Authors: Bensaid A., Mostephaoui T., Nedjai R.
Abstract:
Climatic factors are the subject of considerable research, both methodologically and instrumentally. Under the effect of climate change, approaching climate parameters with precision remains one of the main objectives of the scientific community, from the perspective of assessing climate change and its repercussions on humans and the environment. However, many regions of the world suffer from a severe lack of reliable instruments; in such cases, the use of empirical methods becomes the only way to assess certain parameters that can act as climate indicators. Several scientific methods are used for the evaluation of evapotranspiration, either directly at the climate stations or by empirical methods; all of these methods give a point estimate and in no case capture the spatial variation of this parameter. We therefore propose in this paper the use of three sources of information (the Meteo France network of weather stations, world databases, and MODIS satellite images) to evaluate spatial evapotranspiration (ETP) using the Turc method. This first step will reflect the degree of relevance of the indirect (satellite) methods and their generalization to sites without stations. Representing the spatial variation of this parameter in a geographical information system (GIS) accounts for the heterogeneity of its behaviour. This heterogeneity is due to the influence of site morphological factors and will make it possible to appreciate the role of certain topographic and hydrological parameters. A phase of predicting the medium- and long-term evolution of evapotranspiration under the effect of climate change, by applying the Intergovernmental Panel on Climate Change (IPCC) scenarios, gives a realistic overview of the contribution of aquatic systems at the scale of the region.
Keywords: climate change, ETP, MODIS, IPCC scenarios
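The Turc estimate itself is a simple empirical formula. A commonly quoted form is assumed below (mean air temperature in °C, global solar radiation in cal·cm⁻²·day⁻¹, output in mm/day, with a humidity correction below 50% relative humidity); the exact variant used by the authors is not specified, so treat this as a sketch:

```python
def turc_etp(t_mean_c, rg_cal_cm2_day, rh_percent=60.0):
    # Turc (1961) potential evapotranspiration, in mm/day.
    # Commonly quoted humid-condition form (RH >= 50%):
    #   ETP = 0.013 * T / (T + 15) * (Rg + 50)
    # Below 50% RH, a correction factor (1 + (50 - RH) / 70) is applied.
    etp = 0.013 * (t_mean_c / (t_mean_c + 15.0)) * (rg_cal_cm2_day + 50.0)
    if rh_percent < 50.0:
        etp *= 1.0 + (50.0 - rh_percent) / 70.0
    return etp
```

Applied per grid cell to station-interpolated temperature and satellite-derived radiation, this yields the kind of spatial ETP layer that can then be mapped in a GIS.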
Procedia PDF Downloads 100
30498 Research on Urban Thermal Environment Climate Map Based on GIS: Taking Shapingba District, Chongqing as an Example
Authors: Zhao Haoyue
Abstract:
Due to the combined effects of climate change, urban expansion, and population growth, various environmental issues arise, such as urban heat islands and pollution. Reliable information on the urban environmental climate is therefore needed to address and mitigate these negative effects. The emergence of urban climate maps provides a practical basis for urban climate regulation and improvement. This article takes Shapingba District, Chongqing, as an example to study the construction of urban thermal environment climate maps based on GIS spatial analysis technology. A thermal load map, a ventilation potential analysis map, and a comprehensive thermal environment analysis map were obtained. Based on the classification criteria derived from the climate map, corresponding protection and planning mitigation measures have been proposed.
Keywords: urban climate, GIS, heat island analysis, urban thermal environment
Procedia PDF Downloads 113
30497 Allelopathic Potential of Canola and Wheat to Control Weeds in Soybean (Glycine max)
Authors: Alireza Dadkhah
Abstract:
A field experiment was conducted to develop management practices that reduce the use of synthetic herbicides in the arid and semi-arid agricultural ecosystems of north-east Iran. Five treatments were compared: I, chopped residues of canola (Brassica vulgaris); II, chopped residues of wheat (Triticum aestivum), both separately incorporated into the soil to a depth of 25 cm, 20 days before sowing; III, shoot aqueous extract of canola; IV, shoot aqueous extract of wheat, each sprayed separately at the post-emergence stage; and V, a control without any residues or spraying. The weed control treatments reduced total weed cover, weed density, and weed biomass. The reductions in weed density with canola and wheat residue incorporation were up to 67.5% and 62.2%, respectively, at 40 days after sowing, and 65.3% and 75.6%, respectively, at 90 days after sowing, compared to the control. Post-emergence spraying of canola and wheat shoot aqueous extracts suppressed weed density by up to 41.8% and 36.6% at 40 days after sowing and 54.2% and 52.7% at 90 days after sowing, respectively, compared to the control. Weed control treatments also reduced weed cover, weed biomass, and weed stem length. Incorporation of canola and wheat residues in the soil reduced weed cover by 62.5% and 63%, respectively, while spraying of canola and wheat shoot aqueous extracts suppressed weed cover by 39.6% and 40.4%, respectively, at 90 days after sowing. Application of canola and wheat residues increased soybean yield by 45.4% and 69.5%, respectively, compared to the control, while post-emergence application of canola and wheat shoot aqueous extracts increased soybean yield by 22% and 29.8%, respectively. Keywords: allelopathy, bio-herbicide, Brassica oleracea, plant residues, Triticum aestivum
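The percentage reductions reported above all follow the same arithmetic relative to the untreated control. A small helper makes the calculation explicit; the plot counts in the usage note are hypothetical, chosen only to reproduce one of the abstract's figures.

```python
def reduction_percent(control, treatment):
    """Percent reduction of a treatment value relative to the control."""
    return 100.0 * (control - treatment) / control
```

For instance, if a control plot held 200 weeds per square metre and a canola-residue plot held 65, the reduction would be 67.5%, matching the maximum reduction reported at 40 days after sowing.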
Procedia PDF Downloads 684
30496 Traditional Drawing, BIM and Erudite Design Process
Authors: Maryam Kalkatechi
Abstract:
Nowadays, parametric design, scientific analysis, and digital fabrication are dominant, and many architectural practices increasingly seek to incorporate advanced digital software and fabrication in their projects. The erudite design process proposed here, which combines digital and practical aspects in a strong frame within the method, resulted from the author's dissertation research. The digital aspects are the progressive advancements in algorithm design and simulation software; these have helped firms develop more holistic concepts at the early stage and maintain collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to implement construction and architecture knowledge within the algorithm to produce successful design processes. It also involves ongoing improvements in applying the new method of 3D printing in construction, achieved through 'data-sketches'. The term 'data-sketch', developed by the author in the recently completed dissertation, accommodates the architect's decisions on the algorithm. This paper introduces the erudite design process and its components and summarizes the application of this process in the development of the '3D printed construction unit'. The paper contributes to bridging academia and practice through advanced technology by presenting a design process that transfers the dominance of the tool to the learned architect and encourages innovation in design processes. Keywords: erudite, data-sketch, algorithm design in architecture, design process
Procedia PDF Downloads 276
30495 Synthesising Smart City and Smart Port Concepts: A Conceptualization for Small and Medium-Sized Port City Ecosystems
Authors: Christopher Meyer, Laima Gerlitz
Abstract:
European ports are about to take an important step towards their future economic development. Existing legislation such as the European Green Deal is changing the perspective on ports as individual logistic institutions and demands a more holistic view of ports as ecosystems involving several different actors in an interdisciplinary and multilevel approach. A special role is taken by small and medium-sized ports, which face the same political restrictions and future goals - such as reducing environmental impacts, with 2030 and 2050 as targets - while suffering from low financing capacity, outdated infrastructure, limited innovation measures, and missing political support. At the same time, they play a key role in regional economic development and cross-border logistics and act as facilitators for their regional hinterland. In comparison to their big counterparts, small and medium-sized ports are also often located within or close to city areas. This not only poses challenges, especially for environmental performance, but can also unlock growth potential by making the city a key actor in the port ecosystem. For city development, the Smart City concept is one of the key strategies currently applied, mostly at the demonstration level in selected cities; the basic idea behind it parallels the Smart Port concept. This paper therefore analyses the potential synergetic effects of applying the Smart City and Smart Port concepts to the ecosystems of small and medium-sized ports located close to cities, focusing on innovation application, greening measures, and economic performance, as well as the strategic positioning of ports in Smart City initiatives. Keywords: port-city ecosystems, regional development, sustainability transition, innovation policy
Procedia PDF Downloads 78
30494 Methods for Restricting Unwanted Access on the Networks Using Firewall
Authors: Bhagwant Singh, Sikander Singh Cheema
Abstract:
This paper examines firewall mechanisms routinely implemented for network security in depth. A firewall cannot protect against all the hazards of untrusted networks; consequently, many kinds of infrastructure are employed to establish a secure network. Firewall strategies have already been the subject of significant analysis. This study's primary purpose is to block unwanted connections by combining the capability of the firewall with additional firewall mechanisms, including packet filtering, NAT, VPNs, and backdoor solutions. Studies on firewall potential and combined approaches remain insufficient. The research team's goal is to build a safe network by integrating firewall strengths and firewall methods. The study's findings indicate that the recommended concept can form a reliable network. This study examines the characteristics of network security and its primary dangers, synthesizes existing domestic and foreign firewall technologies, and discusses the theories, benefits, and disadvantages of different firewalls. Through synthesis and comparison of various techniques, as well as an in-depth examination of the primary factors that affect firewall effectiveness, this study investigates the current application of firewall technology in computer network security and introduces a new technique named 'tight coupling firewall'. Finally, the article discusses the current state of firewall technology and the direction in which it is developing. Keywords: firewall strategies, firewall potential, packet filtering, NAT, VPN, proxy services, firewall techniques
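Packet filtering, the first mechanism the abstract names, can be sketched as a first-match rule table. This is a generic illustration of the technique, not the paper's 'tight coupling firewall'; the rule set and addresses are hypothetical.

```python
from ipaddress import ip_address, ip_network

# Hypothetical first-match rule table: (action, source network, dest port).
# A port of None matches any port; the last rule is the default deny.
RULES = [
    ("allow", ip_network("10.0.0.0/8"), 443),   # internal HTTPS traffic
    ("deny",  ip_network("0.0.0.0/0"), 23),     # block Telnet from anywhere
    ("deny",  ip_network("0.0.0.0/0"), None),   # default deny
]

def filter_packet(src_ip, dst_port):
    """Return the action of the first rule matching the packet."""
    for action, net, port in RULES:
        if ip_address(src_ip) in net and (port is None or port == dst_port):
            return action
    return "deny"  # fail closed if no rule matched
```

Real firewalls (e.g. netfilter/iptables) apply the same first-match logic over richer match criteria such as protocol, interface, and connection state.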
Procedia PDF Downloads 101
30493 Stochastic Pi Calculus in Financial Markets: An Alternate Approach to High Frequency Trading
Authors: Jerome Joshi
Abstract:
The paper presents the modelling of financial markets using the Stochastic Pi Calculus model. Stochastic Pi Calculus is mainly used for biological applications; however, its features promote its use in financial markets, most prominently in high frequency trading. A trading system can be broadly divided into the exchange, market makers or intermediary traders, and fundamental traders. The exchange is where the trade is executed, and the two types of traders act as market participants in the exchange. High frequency trading, with its complex networks and numerous market participants (intermediary and fundamental traders), poses a difficulty for modelling. It involves participants seeking the advantage of complex trading algorithms and high execution speeds to carry out large volumes of trades. To earn profits from each trade, a trader must be at the top of the order book quite frequently, executing or processing multiple trades simultaneously. This requires highly automated systems as well as the right sentiment to outperform other traders. However, always being at the top of the book is not ideal for the trader either, since it was the cause of the outbreak of the 'Hot-Potato Effect', which in turn demands a better and more efficient model. The model should be flexible and have diverse applications; a model that has proven itself in a similar field characterized by such difficulty should therefore be chosen. It should also be flexible in its simulation so that it can be extended and adapted for future research, and equipped with tools so that it can be used effectively in the field of finance. In this case, the Stochastic Pi Calculus model seems an ideal fit for financial applications, owing to its track record in the field of biology.
It is an extension of the original Pi Calculus model and acts as a solution and an alternative to the previously flawed algorithm, provided the application of this model is further extended. The model focuses on solving the problem that led to the 'Flash Crash', namely the 'Hot-Potato Effect'. It consists of small sub-systems that can be integrated to form a large system, and it is designed in such a way that the behaviour of 'noise traders' is treated as a random process, or noise, in the system. While modelling, to obtain a better understanding of the problem, a broader picture is taken into consideration, covering the trader, the system, and the market participants. The paper goes on to explain trading in exchanges, types of traders, high frequency trading, the 'Flash Crash', the 'Hot-Potato Effect', evaluation of orders, and time delay in further detail. Future work should focus on the calibration of the modules so that they interact correctly with one another. This model, with its application extended, would provide a basis for further research in the fields of finance and computing. Keywords: concurrent computing, high frequency trading, financial markets, stochastic pi calculus
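The flavour of a stochastic process calculus can be conveyed with a toy Gillespie-style simulation of the hot-potato effect: two intermediary processes exchange inventory over a shared channel at a high rate, while fundamental traders absorb it at a low rate, with exponentially distributed waiting times between events. All rates and quantities here are illustrative assumptions, not the paper's calibrated model.

```python
import random

def simulate_hot_potato(steps=1000, seed=42):
    """Toy continuous-time simulation of two market makers (A, B) passing
    inventory back and forth, while fundamental traders slowly absorb it.
    Event times are exponential, as in stochastic pi calculus semantics."""
    random.seed(seed)
    inventory = {"A": 10, "B": 0}       # initial unwanted inventory
    rate_pass, rate_fundamental = 5.0, 1.0  # hypothetical channel rates
    t = 0.0
    for _ in range(steps):
        total_rate = rate_pass + rate_fundamental
        t += random.expovariate(total_rate)  # time to next event
        if random.random() < rate_pass / total_rate:
            # intermediary channel fires: one unit changes hands
            src = "A" if inventory["A"] > 0 else "B"
            dst = "B" if src == "A" else "A"
            if inventory[src] > 0:
                inventory[src] -= 1
                inventory[dst] += 1
        else:
            # fundamental trader absorbs one unit from the larger holder
            holder = max(inventory, key=inventory.get)
            if inventory[holder] > 0:
                inventory[holder] -= 1
    return t, inventory
```

Because the pass rate dominates, inventory churns rapidly between A and B (the hot potato) before fundamental traders drain it, which is the qualitative behaviour the model is meant to reproduce.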
Procedia PDF Downloads 77
30492 Evaluation of Low-Global Warming Potential Refrigerants in Vapor Compression Heat Pumps
Authors: Hamed Jafargholi
Abstract:
Global warming presents an immense environmental risk, causing detrimental impacts on ecological systems and putting coastal areas at risk. Implementing efficient measures to minimize greenhouse gas emissions and the use of fossil fuels is essential to limiting global warming. Vapor compression heat pumps provide a practical method for harnessing energy from waste heat sources and reducing energy consumption. However, the traditional working fluids used in these heat pumps generally have a high global warming potential (GWP), which can cause severe greenhouse effects if they are released. The emphasis on low-GWP (below 150) refrigerants aims to advance vapor compression heat pump technology. A classification system for vapor compression heat pumps is offered, with boundaries based on the required heat temperature and advancements in heat pump technology: a heat pump can be classified as a low temperature heat pump (LTHP), medium temperature heat pump (MTHP), high temperature heat pump (HTHP), or ultra-high temperature heat pump (UHTHP). The HTHP/UHTHP border is 160 °C, and the MTHP/HTHP and LTHP/MTHP limits are 100 and 60 °C, respectively. The refrigerant is one of the most important parts of a vapor compression heat pump system. Presently, refrigerants are mainly selected on the basis of ozone depletion potential (ODP) and GWP, with ODP required to be zero and GWP as low as possible. Pure low-GWP refrigerants, such as natural refrigerants (R718 and R744), hydrocarbons (R290, R600), hydrofluorocarbons (R152a and R161), hydrofluoroolefins (R1234yf, R1234ze(E)), and the hydrochlorofluoroolefin R1233zd(E), were selected as candidates for vapor compression heat pump systems based on these selection principles. The performance, characteristics, and potential uses of these low-GWP refrigerants in heat pump systems are investigated in this paper.
As vapor compression heat pumps with pure low-GWP refrigerants become more common, more low-grade heat can be recovered, and energy consumption would therefore decrease. The results show that R718 is appropriate for UHTHP applications; R1233zd(E) for HTHP applications; R600, R152a, R161, and R1234ze(E) for MTHP applications; and R744, R290, and R1234yf for LTHP applications. The selection of an appropriate refrigerant should, in fact, take into consideration both environmental and thermodynamic points of view, and it might be argued that, depending on the situation, a trade-off between these two should always be considered. The environmental approach is now far stronger than it was previously, driven by European Union regulations. This will promote sustainable energy consumption and social development in addition to helping reduce greenhouse gas emissions and limit global warming. Keywords: vapor compression, global warming potential, heat pumps, greenhouse
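The temperature classification described in the abstract (boundaries at 60, 100, and 160 °C) reduces to a simple lookup. This sketch only encodes those stated boundaries; the mapping from class to candidate refrigerants follows the abstract's findings.

```python
def heat_pump_class(sink_temp_c):
    """Classify a vapor compression heat pump by required heat
    sink temperature, using the abstract's boundaries (60/100/160 degC)."""
    if sink_temp_c <= 60:
        return "LTHP"
    elif sink_temp_c <= 100:
        return "MTHP"
    elif sink_temp_c <= 160:
        return "HTHP"
    return "UHTHP"

# Candidate low-GWP refrigerants per class, as reported in the abstract.
CANDIDATES = {
    "LTHP": ["R744", "R290", "R1234yf"],
    "MTHP": ["R600", "R152a", "R161", "R1234ze(E)"],
    "HTHP": ["R1233zd(E)"],
    "UHTHP": ["R718"],
}
```

For example, a process requiring 120 °C heat falls in the HTHP class, for which the study identifies R1233zd(E) as the suitable candidate.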
Procedia PDF Downloads 35
30491 Value-Based Argumentation Frameworks and Judicial Moral Reasoning
Authors: Sonia Anand Knowlton
Abstract:
As Artificial Intelligence becomes increasingly integrated in virtually every area of life, the need and interest to logically formalize the law and judicial reasoning are growing tremendously. The study of argumentation frameworks (AFs) is promising in this respect. AFs provide a way of structuring human reasoning using a formal system of non-monotonic logic. P. M. Dung first introduced this framework and demonstrated that certain arguments must prevail and certain arguments must perish based on whether they are logically 'attacked' by other arguments. Dung labelled the set of prevailing arguments the 'preferred extension' of the given argumentation framework. Trevor Bench-Capon's value-based argumentation frameworks (VAFs) extended Dung's AF system by allowing arguments to derive their force from the promotion of 'preferred' values. In VAF systems, the success of an attack from argument A on argument B (i.e., the triumph of argument A) requires that argument B does not promote a value that is preferred to the value promoted by argument A. The application of VAFs to the law has been thoroughly discussed in the computer science literature, mainly demonstrating that legal cases can be effectively mapped out using VAFs. This article analyses VAFs from a jurisprudential standpoint to provide a philosophical and theoretical analysis of what VAFs tell the legal community about judicial reasoning, specifically distinguishing between legal and moral reasoning. It highlights the limitations of using VAFs to account for judicial moral reasoning in theory and in practice. Keywords: nonmonotonic logic, legal formalization, computer science, artificial intelligence, morality
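The VAF attack-success condition described above is easy to make concrete. The following minimal sketch filters the attack relation of a VAF against an audience's value ordering; the example arguments and values in the usage note are hypothetical.

```python
def successful_attacks(attacks, value_of, preference):
    """Keep only the attacks that succeed in a value-based
    argumentation framework (Bench-Capon).

    attacks    : list of (attacker, target) pairs
    value_of   : dict mapping each argument to the value it promotes
    preference : list of values, most preferred first (the 'audience')

    An attack (a, b) is defeated only if the value promoted by b is
    strictly preferred to the value promoted by a.
    """
    def prefers(v1, v2):
        return preference.index(v1) < preference.index(v2)
    return [(a, b) for (a, b) in attacks
            if not prefers(value_of[b], value_of[a])]
```

For an audience that ranks 'life' above 'property', a mutual attack between an argument promoting life and one promoting property collapses to a one-way attack: only the life-promoting argument's attack survives, so it alone enters the preferred extension.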
Procedia PDF Downloads 74
30490 Studying the Bond Strength of Geo-Polymer Concrete
Authors: Rama Seshu Doguparti
Abstract:
This paper presents an experimental investigation of the bond behavior of geopolymer concrete. The bond behavior of grade M35 geopolymer concrete cubes reinforced with 16 mm TMT rods is analyzed. The results indicate that the bond performance of reinforced geopolymer concrete is good, which supports its application in construction. Keywords: geo-polymer, concrete, bond strength, behaviour
Procedia PDF Downloads 509
30489 Influence of Deposition Temperature on Supercapacitive Properties of Reduced Graphene Oxide on Carbon Cloth: New Generation of Wearable Energy Storage Electrode Material
Authors: Snehal L. Kadam, Shriniwas B. Kulkarni
Abstract:
Flexible electrode materials with high surface area and good electrochemical properties are a current trend captivating researchers across the globe for application in the next-generation energy storage field. In the present work, crumpled-sheet-like reduced graphene oxide was grown on carbon cloth by the hydrothermal method at a series of deposition temperatures for a fixed time. The influence of the deposition temperature on the structural, morphological, optical, and supercapacitive properties of the electrode material was investigated by XRD, Raman, XPS, TEM, FE-SEM, UV-visible, and electrochemical characterization techniques. The results show that the hydrothermally synthesized reduced graphene oxide on carbon cloth has a sheet-like mesoporous structure. The reduced graphene oxide material deposited at 160 °C exhibits the best supercapacitor performance, with a specific capacitance of 443 F/g at a scan rate of 5 mV/s. Moreover, stability studies show 97% capacitance retention over 1000 CV cycles. These results show that hydrothermally synthesized RGO on carbon cloth is a promising electrode material for next-generation wearable energy storage systems. The detailed analysis and results will be presented at the conference. Keywords: graphene oxide, reduced graphene oxide, carbon cloth, deposition temperature, supercapacitor
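Specific capacitance figures such as the 443 F/g quoted above are typically extracted from cyclic voltammetry data. The sketch below uses one common formulation, C = (∫|I| dV) / (2 m v ΔV) (conventions vary in the literature, e.g. over the factor of 2 for half- vs full-cycle integration); the numbers in the usage note are hypothetical, not the paper's measurements.

```python
def specific_capacitance(voltages, currents, mass_g, scan_rate_v_s):
    """Specific capacitance (F/g) from a CV curve by trapezoidal
    integration: C = (integral of |I| dV) / (2 * m * v * dV_window).

    voltages      : list of potentials (V) along the sweep
    currents      : list of measured currents (A) at those potentials
    mass_g        : active material mass (g)
    scan_rate_v_s : scan rate (V/s)
    """
    area = 0.0
    for i in range(1, len(voltages)):
        dv = abs(voltages[i] - voltages[i - 1])
        area += 0.5 * (abs(currents[i]) + abs(currents[i - 1])) * dv
    window = max(voltages) - min(voltages)
    return area / (2.0 * mass_g * scan_rate_v_s * window)
```

As a sanity check, an ideal 1 mg electrode drawing a constant 1 mA across a 1 V window at 5 mV/s evaluates to 100 F/g with this formula.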
Procedia PDF Downloads 191