Search results for: panel data method
34920 A General Iterative Nonlinear Programming Method to Synthesize Heat Exchanger Network
Authors: Rupu Yang, Cong Toan Tran, Assaad Zoughaib
Abstract:
The work provides an iterative nonlinear programming method to synthesize a heat exchanger network by manipulating the trade-offs between the heat load of process heat exchangers (HEs) and utilities. We consider two cases for the synthesis problem: the first without a fixed cost for HEs, and the second with a fixed cost. For the no-fixed-cost problem, the nonlinear programming (NLP) model with all the potential HEs is optimized to obtain the global optimum. For the case with fixed cost, the NLP model is iterated through adding/removing HEs. The method was applied in five case studies and demonstrated its effectiveness well. Among these, the approach reaches the lowest TAC (2,904,026 $/year) compared with the best record for the well-known aromatics plant problem. It also locates a slightly better design than literature records for a 10-stream case without fixed cost, with only 1/9 of the computational time. Moreover, compared to the traditional mixed-integer nonlinear programming approach, the iterative NLP method opens the possibility of considering constraints (such as controllability or dynamic performance) that require the structure of the network to be known before they can be calculated.
Keywords: heat exchanger network, synthesis, NLP, optimization
Procedia PDF Downloads 164
34919 Statistically Accurate Synthetic Data Generation for Enhanced Traffic Predictive Modeling Using Generative Adversarial Networks and Long Short-Term Memory
Authors: Srinivas Peri, Siva Abhishek Sirivella, Tejaswini Kallakuri, Uzair Ahmad
Abstract:
Effective traffic management and infrastructure planning are crucial for the development of smart cities and intelligent transportation systems. This study addresses the challenge of data scarcity by generating realistic synthetic traffic data using the PeMS-Bay dataset, improving the accuracy and reliability of predictive modeling. Advanced synthetic data generation techniques, including TimeGAN, GaussianCopula, and PAR Synthesizer, are employed to produce synthetic data that replicates the statistical and structural characteristics of real-world traffic. Future integration of Spatial-Temporal Generative Adversarial Networks (ST-GAN) is planned to capture both spatial and temporal correlations, further improving data quality and realism. The performance of each synthetic data generation model is evaluated against real-world data to identify the best models for accurately replicating traffic patterns. Long Short-Term Memory (LSTM) networks are utilized to model and predict complex temporal dependencies within traffic patterns. This comprehensive approach aims to pinpoint areas with low vehicle counts, uncover underlying traffic issues, and inform targeted infrastructure interventions. By combining GAN-based synthetic data generation with LSTM-based traffic modeling, this study supports data-driven decision-making that enhances urban mobility, safety, and the overall efficiency of city planning initiatives.
Keywords: GAN, long short-term memory, synthetic data generation, traffic management
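As a rough sketch of the LSTM forecasting step described in this abstract, the snippet below trains a single-layer LSTM on a synthetic stand-in for a PeMS-Bay sensor series; the window length, network size, and the data itself are illustrative assumptions rather than the authors' configuration.

```python
import numpy as np
import tensorflow as tf

def make_windows(series, lookback=12):
    """Slice a 1-D traffic series into (lookback steps -> next step) samples."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])
    return np.array(X)[..., None], np.array(y)

# Hypothetical stand-in for a single PeMS-Bay sensor (vehicles per 5-minute bin).
rng = np.random.default_rng(0)
traffic = 200 + 50 * np.sin(np.linspace(0, 20 * np.pi, 2000)) + rng.normal(0, 10, 2000)

X, y = make_windows(traffic)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(X.shape[1], 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
print("next-step forecast:", model.predict(X[-1:], verbose=0).ravel())
```

The same windowing would be applied to real or GAN-generated series; only the data-loading step changes.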
Procedia PDF Downloads 27
34918 The Meaning Structures of Political Participation of Young Women: Preliminary Findings in a Practical Phenomenology Study
Authors: Amanda Aliende da Matta, Maria del Pilar Fogueiras Bertomeu, Valeria de Ormaechea Otalora, Maria Paz Sandin Esteban, Miriam Comet Donoso
Abstract:
This communication presents the preliminary emerging themes in research on the political participation of young women. The study follows a qualitative methodology, in particular the applied hermeneutic phenomenological method, and the general objective of the research is to give an account of the experience of political participation as young women. The study participants are women aged 18 to 35 who have experience in political participation. The techniques of data collection are the descriptive story and the phenomenological interview. The first methodological steps have been to: 1) collect and select stories of lived experience in political participation, 2) select descriptions of lived experience (DLEs) in political participation from the chosen stories, 3) prepare phenomenological interviews from the selected DLEs, and 4) conduct phenomenological thematic analysis (PTA) of the DLEs. We have so far initiated the PTA on 5 vignettes. Hermeneutic phenomenology as a research approach is based on phenomenological philosophy and applied hermeneutics. Phenomenology is a descriptive philosophy of pure experience and essences, through which we seek to capture an experience at its origins without categorizing, interpreting or theorizing it. Hermeneutics, on the other hand, may be defined as a philosophical current that can be applied to data analysis. Max van Manen wrote that hermeneutic phenomenology is a method of abstemious reflection on the basic structures of the lived experience of human existence. In hermeneutic phenomenology we focus, then, on the way we experience "things" in the first person, seeking to capture the world exactly as we experience it, not as we categorize or conceptualize it. In this study, the empirical methods used were the written lived experience description and the conversational interview. For the short stories, participants were asked: "What was your lived experience of participation in politics as a young woman? Can you tell me any stories or anecdotes that you think exemplify or typify your experience?". The questions were accompanied by a list of guidelines for writing descriptive vignettes. The analytical method was PTA. Among the provisional results, we found preliminary emerging themes which, as the investigation advances, could become meaning structures of political participation of young women. They are the following: complicity may be inherent/essential in political participation as a young woman; feelings may be essential/inherent in political participation as a young woman; hope may be essential in authentic political participation as a young woman; frustration may be essential in authentic political participation as a young woman; satisfaction may be essential in authentic political participation as a young woman; a tension between the individual and the collective may be inherent/essential in political participation as a young woman; and political participation as a young woman may include moments of public demonstration.
Keywords: applied hermeneutic phenomenology, hermeneutics, phenomenology, political participation
Procedia PDF Downloads 99
34917 A Machine Learning Approach for the Leakage Classification in the Hydraulic Final Test
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
The widespread use of machine learning applications in production is significantly accelerated by improved computing power and increasing data availability. Predictive quality enables the assurance of product quality by using machine learning models as a basis for decisions on test results. The use of real Bosch production data based on geometric gauge blocks from machining, mating data from assembly and hydraulic measurement data from final testing of directional valves is a promising approach to classifying the quality characteristics of workpieces.
Keywords: machine learning, classification, predictive quality, hydraulics, supervised learning
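The abstract does not name the learning algorithm, so the sketch below only illustrates the general supervised workflow (train a classifier on merged machining, assembly, and final-test features, then report leakage classification quality) on fabricated data; the feature layout and the random-forest choice are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical feature table: gauge-block geometry, assembly mating data, and
# hydraulic final-test measurements; label 1 marks a leaking valve, 0 a tight one.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 6))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```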
Procedia PDF Downloads 213
34916 Analysis of Cyber Activities of Potential Business Customers Using Neo4j Graph Databases
Authors: Suglo Tohari Luri
Abstract:
Data analysis is an important aspect of business performance. With the application of artificial intelligence within databases, selecting a suitable database engine for an application design is also very crucial for business data analysis. The application of business intelligence (BI) software to graph databases such as Neo4j has proved highly effective in terms of customer data analysis. Yet a major concern is that not all business organizations have the Neo4j business intelligence software applications to implement for customer data analysis. Further, those with the BI software lack personnel with the requisite expertise to use it effectively with the Neo4j database. The purpose of this research is to demonstrate how Neo4j program code alone can be applied for the analysis of e-commerce website customer visits. As the Neo4j database engine is optimized for handling and managing data relationships, with the capability of building high-performance and scalable systems to handle connected data nodes, it enables business owners who advertise their products at websites backed by Neo4j to determine the number of visitors and to know which products are visited at routine intervals for the necessary decision making. It will also help in knowing the best customer segments in relation to specific goods so as to place more emphasis on their advertisement on the said websites.
Keywords: data, engine, intelligence, customer, Neo4j, database
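A minimal sketch of the kind of visit analysis the abstract describes, using the official Neo4j Python driver; the connection details and the (:Customer)-[:VISITED]->(:Product) graph model are assumptions made for illustration, not the paper's schema.

```python
from neo4j import GraphDatabase

# Hypothetical local instance and credentials.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# Count how often each product page was visited, most visited first.
query = """
MATCH (c:Customer)-[v:VISITED]->(p:Product)
RETURN p.name AS product, count(v) AS visits
ORDER BY visits DESC
"""

with driver.session() as session:
    for record in session.run(query):
        print(record["product"], record["visits"])
driver.close()
```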
Procedia PDF Downloads 193
34915 Decision Making System for Clinical Datasets
Authors: P. Bharathiraja
Abstract:
A computer-aided decision-making system is used to enhance the diagnosis and prognosis of diseases and also to assist clinicians and junior doctors in clinical decision making. Medical data used for decision making should be definite and consistent. Data mining and soft computing techniques are used for cleaning the data and for incorporating human reasoning in decision-making systems. A fuzzy rule based inference technique can be used for classification in order to incorporate human reasoning in the decision-making process. In this work, missing values are imputed using the mean or mode of the attribute. The data are normalized using min-max normalization to improve the design and efficiency of the fuzzy inference system. The fuzzy inference system is used to handle the uncertainties that exist in the medical data. Equal-width partitioning is used to partition the attribute values into appropriate fuzzy intervals. Fuzzy rules are generated using a class-based associative rule mining algorithm. The system is trained and tested using the heart disease data set from the University of California at Irvine (UCI) Machine Learning Repository. The data were split into training and testing sets using a hold-out approach. From the experimental results, it can be inferred that classification using the fuzzy inference system performs better than trivial IF-THEN rule based classification approaches. Furthermore, it is observed that the use of fuzzy logic and the fuzzy inference mechanism handles uncertainty and also resembles human decision making. The system can be used in the absence of a clinical expert to assist junior doctors and clinicians in clinical decision making.
Keywords: decision making, data mining, normalization, fuzzy rule, classification
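Two of the preprocessing steps named above, min-max normalization and equal-width partitioning into fuzzy intervals, are simple enough to sketch directly; the sample attribute values below are illustrative, not taken from the UCI heart disease data.

```python
import numpy as np

def min_max_normalize(x):
    """Rescale an attribute to [0, 1] before building the fuzzy inference system."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

def equal_width_partitions(x, n_intervals=3):
    """Equal-width interval edges used to define the fuzzy intervals of an attribute."""
    x = np.asarray(x, dtype=float)
    return np.linspace(x.min(), x.max(), n_intervals + 1)

# Illustrative attribute (e.g., resting blood pressure in mmHg).
bp = np.array([120.0, 140.0, 130.0, 150.0, 110.0, 160.0, 145.0])
print(min_max_normalize(bp))
print(equal_width_partitions(bp, n_intervals=3))
```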
Procedia PDF Downloads 517
34914 Online Topic Model for Broadcasting Contents Using Semantic Correlation Information
Authors: Chang-Uk Kwak, Sun-Joong Kim, Seong-Bae Park, Sang-Jo Lee
Abstract:
This paper proposes a method of learning topics for broadcasting contents. There are two kinds of texts related to broadcasting contents. One is a broadcasting script, which is a series of texts including directions and dialogues. The other is blog posts, which contain relatively abstract content, stories, and diverse information about broadcasting contents. Although the two kinds of text cover similar broadcasting contents, the words in blog posts and broadcasting scripts differ. In order to improve the quality of topics, a method is needed to account for this word difference. In this paper, we introduce a semantic vocabulary expansion method to solve the word difference. We expand topics of the broadcasting script by incorporating the words in blog posts. Each word in the blog posts is added to the most semantically correlated topics. We use word2vec to obtain the semantic correlation between words in blog posts and topics of scripts. The vocabularies of topics are updated, and then posterior inference is performed to rearrange the topics. In experiments, we verified that the proposed method can learn more salient topics for broadcasting contents.
Keywords: broadcasting script analysis, topic expansion, semantic correlation analysis, word2vec
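A toy sketch of the vocabulary expansion idea: train word2vec on both text sources, then attach each blog-post word to the script topic whose seed words it is most similar to. The corpora, seed topics, and gensim usage here are illustrative assumptions, not the authors' setup.

```python
from gensim.models import Word2Vec

# Tiny stand-ins for broadcasting-script and blog-post sentences.
script_sentences = [["detective", "chases", "suspect"], ["suspect", "hides", "warehouse"]]
blog_sentences = [["thriller", "episode", "detective", "mystery"],
                  ["warehouse", "scene", "tense"]]

model = Word2Vec(script_sentences + blog_sentences, vector_size=50, min_count=1, epochs=200)

# Topics learned from the script, each represented here by a few seed words.
topics = {0: ["detective", "suspect"], 1: ["warehouse", "hides"]}

# Add each blog-post word to its most semantically correlated topic.
for word in {w for sent in blog_sentences for w in sent}:
    scores = {t: max(model.wv.similarity(word, seed) for seed in seeds)
              for t, seeds in topics.items()}
    topics[max(scores, key=scores.get)].append(word)

print(topics)
```

In the paper, the expanded vocabularies would then feed a posterior inference step that rearranges the topics; that step is omitted here.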
Procedia PDF Downloads 251
34913 Detecting Potential Geothermal Sites by Using Well Logging, Geophysical and Remote Sensing Data at Siwa Oasis, Western Desert, Egypt
Authors: Amr S. Fahil, Eman Ghoneim
Abstract:
Egypt has made significant efforts during the past few years to discover major renewable energy sources. Regions in Egypt that have been identified for geothermal potential investigation include the Gulf of Suez and the Western Desert. One of the most promising sites for the development of Egypt's northern Western Desert is Siwa Oasis. The geological setting of the oasis, a tectonically generated depression situated in the northernmost region of the Western Desert, supports the potential for substantial geothermal resources. Field data obtained from 27 deep oil wells across the Western Desert, including bottom-hole temperature (BHT) measurements, depth-to-basement data, and geological maps, were utilized in this study. The major lithological units, elevation, surface gradient, lineament density, and remote sensing multispectral and topographic data were mapped together to generate the related physiographic variables. Eleven thematic layers were integrated in a geographic information system (GIS) to create geothermal maps to aid in the detection of significant potential geothermal spots along the Siwa Oasis and its vicinity. Total magnetic intensity data with reduction to the pole (RTP) are applied in this work for the first investigation of the geothermal potential of Siwa Oasis. The integration of geospatial data with magnetic field measurements showed a clear correlation between areas of high heat flow and magnetic anomalies. Such anomalies can be interpreted as related to the existence of high geothermal energy and dense rock, which also has high magnetic susceptibility. The outcomes indicated that the study area has a geothermal gradient ranging from 18 to 42 °C/km, a heat flow ranging from 24.7 to 111.3 mW·m⁻², a thermal conductivity of 1.3–2.65 W·m⁻¹·K⁻¹, and a maximum measured temperature of 100.7 °C. The southeastern part of the Siwa Oasis and some sporadic locations in the eastern section of the oasis were found to have significant geothermal potential; consequently, these locations are suitable for future geothermal investigation. The adopted method might be applied to identify significant prospective geothermal energy locations in other regions of Egypt and East Africa.
Keywords: magnetic data, SRTM, depth to basement, remote sensing, GIS, geothermal gradient, heat flow, thermal conductivity
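A minimal arithmetic sketch of how a geothermal gradient and the corresponding heat flow follow from a bottom-hole temperature reading; the surface temperature, depth, and conductivity below are assumed values, not the study's well data.

```python
# Assumed inputs for one hypothetical well.
surface_temp_c = 25.0   # mean annual surface temperature (assumption)
bht_c = 100.7           # bottom-hole temperature in °C
depth_km = 2.5          # measurement depth in km (assumption)
conductivity = 2.0      # thermal conductivity in W·m⁻¹·K⁻¹ (assumption)

gradient_c_per_km = (bht_c - surface_temp_c) / depth_km
# Heat flow = gradient (K/m) x conductivity (W·m⁻¹·K⁻¹), reported in mW·m⁻².
heat_flow_mw_m2 = gradient_c_per_km / 1000.0 * conductivity * 1000.0

print(f"gradient: {gradient_c_per_km:.1f} °C/km, heat flow: {heat_flow_mw_m2:.1f} mW/m²")
```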
Procedia PDF Downloads 117
34912 Forecasting Unemployment Rate in Selected European Countries Using Smoothing Methods
Authors: Ksenija Dumičić, Anita Čeh Časni, Berislav Žmuk
Abstract:
The aim of this paper is to select the most accurate forecasting method for predicting the future values of the unemployment rate in selected European countries. In order to do so, several forecasting techniques adequate for forecasting time series with a trend component were selected, namely double exponential smoothing (also known as Holt's method) and the Holt-Winters' method, which accounts for trend and seasonality. The results of the empirical analysis showed that the optimal model for forecasting the unemployment rate in Greece was the Holt-Winters' additive method. In the case of Spain, according to MAPE, the optimal model was the double exponential smoothing model. Furthermore, for Croatia and Italy the best forecasting model for the unemployment rate was the Holt-Winters' multiplicative model, whereas in the case of Portugal the best model to forecast the unemployment rate was the double exponential smoothing model. Our findings are in line with European Commission unemployment rate estimates.
Keywords: European Union countries, exponential smoothing methods, forecast accuracy, unemployment rate
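For readers unfamiliar with double exponential smoothing, the sketch below implements Holt's method directly; the smoothing constants and the unemployment figures are illustrative assumptions, not the series or parameters used in the paper.

```python
def holt_forecast(series, alpha=0.5, beta=0.3, horizon=4):
    """Double exponential smoothing (Holt's method) for a series with a trend."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]

# Illustrative quarterly unemployment rates (%), not actual Eurostat data.
unemployment = [24.1, 24.9, 25.7, 26.4, 26.9, 27.3, 27.4, 27.5]
print(holt_forecast(unemployment, horizon=4))
```

The Holt-Winters' variants add a third smoothing equation for the seasonal component (additive or multiplicative), which is the distinction the country-by-country comparison above relies on.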
Procedia PDF Downloads 369
34911 Reclamation of Molding Sand: A Chemical Approach to Recycle Waste Foundry Sand
Authors: Mohd Moiz Khan, S. M. Mahajani, G. N. Jadhav
Abstract:
Waste foundry sand (total clay content 15%) contains toxic heavy metals and particulate matter, which make dumping of waste sand an environmental and health hazard. Disposal of waste foundry sand (WFS) remains one of the substantial challenges faced by Indian foundries nowadays. To cope with this issue, a chemical method was used to reclaim WFS. A stirred tank reactor was used for chemical reclamation. Experiments were performed to reduce the total clay content from 15% to as low as 0.9% in chemical reclamation. Although this method was found to be effective for WFS reclamation, it may face a challenge due to its possibly high operating cost. The reclaimed sand was found to be satisfactory in terms of sand qualities such as total clay (0.9%), active clay (0.3%), acid demand value (ADV) (2.6%), loss on ignition (LOI) (3%), grain fineness number (GFN) (56), and compressive strength (60 kPa). The experimental data generated on the chemical reactor under different conditions are further used to optimize the design and operating parameters (rotation speed, sand to acidic solution ratio, acid concentration, temperature and time) for the best performance. The use of reclaimed sand within the foundry would improve the economics and efficiency of the process and reduce environmental concerns.
Keywords: chemical reclamation, clay content, environmental concerns, recycle, waste foundry sand
Procedia PDF Downloads 147
34910 Stochastic Multicast Routing Protocol for Flying Ad-Hoc Networks
Authors: Hyunsun Lee, Yi Zhu
Abstract:
A wireless ad-hoc network is a decentralized type of temporary machine-to-machine connection that is spontaneous or impromptu, so that it does not rely on any fixed infrastructure or centralized administration. Unmanned aerial vehicles (UAVs), also called drones, have recently become more accessible and widely utilized in military and civilian domains such as surveillance, search and detection missions, traffic monitoring, remote filming, and product delivery, to name a few. The communication between these UAVs becomes possible and is materialized through Flying Ad-hoc Networks (FANETs). However, due to the high mobility of UAVs, which may cause different types of transmission interference, it is vital to design robust routing protocols for FANETs. In this talk, a multicast routing method based on a modified stochastic branching process is proposed. The stochastic branching process is often used to describe an early stage of an infectious disease outbreak, and the reproductive number in the process is used to classify the outbreak as a major or minor outbreak. The reproductive number regulating the local transmission rate is adapted and modified for flying ad-hoc network communication. The performance of the proposed routing method is compared with other well-known methods, such as the flooding method and the gossip method, based on three measures: average reachability, average node usage and average branching factor. The proposed routing method achieves average reachability very close to the flooding method, average node usage close to the gossip method, and an outstanding average branching factor among the methods. It can be concluded that the proposed multicast routing scheme is more efficient than well-known routing schemes such as flooding and gossip, while it maintains high performance.
Keywords: Flying Ad-hoc Networks, Multicast Routing, Stochastic Branching Process, Unmanned Aerial Vehicles
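A toy simulation of the branching-process idea: each UAV that receives the multicast message forwards it to a random number of new neighbours whose mean is the reproduction number R. The Poisson offspring distribution and the parameter values are assumptions made for illustration only.

```python
import numpy as np

def branching_multicast(reproduction_number=1.8, generations=6, seed=1):
    """Simulate message spread where every newly informed node forwards the
    message to a Poisson(R)-distributed number of new neighbours."""
    rng = np.random.default_rng(seed)
    frontier, reached, history = 1, 1, [1]
    for _ in range(generations):
        frontier = int(rng.poisson(reproduction_number, size=frontier).sum())
        reached += frontier
        history.append(frontier)
        if frontier == 0:   # the spread dies out (a "minor outbreak")
            break
    return reached, history

print(branching_multicast(reproduction_number=1.8))  # tends to keep spreading
print(branching_multicast(reproduction_number=0.6))  # tends to die out quickly
```

Regulating R per node is, loosely, what lets such a protocol trade reachability against node usage between the flooding and gossip extremes.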
Procedia PDF Downloads 123
34909 Studying on Pile Seismic Operation with Numerical Method by Using FLAC 3D Software
Authors: Hossein Motaghedi, Kaveh Arkani, Siavash Salamatpoor
Abstract:
Piles are usually important for the safe and economical design of tall and heavy structures, and for this purpose the response of a single pile under dynamic load is very important. The factors that influence single-pile response are the pile geometry, the soil properties, and the applied loads. In this study, the finite difference numerical method, implemented in the FLAC 3D software, is used to evaluate single-pile behavior under the peak ground acceleration (PGA) of the El Centro earthquake record in California (1940). The results of these models are compared with the experimental results of other researchers, and it is seen that they approximately coincide with the experimental data. For example, the maximum moment and the displacement at the top of the pile correspond to the experimental results of previous researchers. Furthermore, this paper tries to evaluate the effective interaction properties between soil and pile. The results show that increasing the pile diameter decreases the pile-top displacement, whereas increasing the pile length increases the top displacement. Also, increasing the pile-to-soil stiffness ratio increases the moment produced in the pile body; longer piles interact more with the soil and have higher inertia. These results can directly help to optimize the design of pile dimensions.
Keywords: pile seismic response, interaction between soil and pile, numerical analysis, FLAC 3D
Procedia PDF Downloads 389
34908 Dynamic Response around Inclusions in Infinitely Inhomogeneous Media
Authors: Jinlai Bian, Zailin Yang, Guanxixi Jiang, Xinzhu Li
Abstract:
The problem of elastic wave propagation in an inhomogeneous medium has always been a classic problem. Because earthquakes occur frequently and cause large economic losses and casualties, this paper studies the dynamic response around a circular inclusion in the whole space with an inhomogeneous modulus, with the aim of preventing and reducing earthquake damage. The inhomogeneity of the medium is reflected in the variation of its shear modulus with spatial position, while the density is constant; this method can be used to solve the problem of underground buried pipelines. Stress concentration phenomena are common in aerospace and earthquake engineering, and the dynamic stress concentration factor (DSCF) is one of the main factors leading to material damage. One of the important applications of the theory of elastic dynamics is to determine the stress concentration in bodies with discontinuities such as cracks, holes, and inclusions. At present, the methods include the wave function expansion method, the integral transformation method, the integral equation method, and so on. Based on the complex function method, the Helmholtz equation with variable coefficients is standardized by using the conformal transformation method and the wave function expansion method, and the displacement and stress fields in the whole space with circular inclusions are solved in the complex coordinate system. The unknown coefficients are obtained from the boundary conditions, and the correctness of the method is verified by comparison with existing results. Owing to the suitability of complex variable function theory for conformal transformation, the method can be extended to study inclusion problems of arbitrary shape. By solving the dynamic stress concentration factor around the inclusions, the influence of the inhomogeneity parameters of the medium and of the wavenumber ratio of the inclusions to the matrix on the dynamic stress concentration factor is analyzed. The research results can provide some reference value for the evaluation of nondestructive testing (NDT), oil exploration, seismic monitoring, and soil-structure interaction.
Keywords: circular inclusions, complex variable function, dynamic stress concentration factor (DSCF), inhomogeneous medium
Procedia PDF Downloads 135
34907 Validation of Visibility Data from Road Weather Information Systems by Comparing Three Data Resources: Case Study in Ohio
Authors: Fan Ye
Abstract:
Adverse weather conditions, particularly those with low visibility, are critical to driving tasks. However, the direct relationship between visibility distances and traffic flow/roadway safety is uncertain due to the limited availability of visibility data. The recent growth in the deployment of Road Weather Information Systems (RWIS) makes segment-specific visibility information available, which can be integrated with other Intelligent Transportation Systems, such as automated warning systems and variable speed limits, to improve mobility and safety. Before applying the RWIS visibility measurements in traffic studies and operations, it is critical to validate the data. Therefore, an attempt was made in this paper to examine the validity and viability of RWIS visibility data by comparing visibility measurements among RWIS, airport weather stations, and weather information recorded by police in crash reports, based on Ohio data. The results indicated that RWIS visibility measurements were significantly different from airport visibility data in Ohio, but no conclusion regarding the reliability of RWIS visibility could be drawn, given that no verified ground truth was available for the comparisons. It was suggested that more objective methods are needed to validate the RWIS visibility measurements, such as continuous in-field measurements associated with various weather events using calibrated visibility sensors.
Keywords: RWIS, visibility distance, low visibility, adverse weather
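One simple way to compare two co-located visibility sources, as the abstract does at a much larger scale, is a paired test on matched observations; the readings below are fabricated for illustration, not the Ohio data, and the paired t-test is only one of several comparison methods that could be used.

```python
import numpy as np
from scipy import stats

# Hypothetical co-located visibility readings in miles.
rwis = np.array([0.8, 1.2, 2.5, 0.4, 3.0, 1.0, 0.6, 2.2])
airport = np.array([1.0, 1.5, 2.4, 0.7, 3.2, 1.4, 0.9, 2.0])

t_stat, p_value = stats.ttest_rel(rwis, airport)
print(f"mean difference: {np.mean(rwis - airport):.2f} mi, p = {p_value:.3f}")
```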
Procedia PDF Downloads 251
34906 Design and Simulation of All Optical Fiber to the Home Network
Authors: Rahul Malhotra
Abstract:
Fiber-based access networks can deliver performance that can support the increasing demand for high-speed connections. One of the new technologies that have emerged in recent years is the Passive Optical Network. This paper aims to show the simultaneous delivery of triple play services (data, voice and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.
Keywords: BER, PON, TDMPON, GPON, CWDM, OLT, ONT
Procedia PDF Downloads 556
34905 Investigating Elastica and Post Buckling Behavior Columns Using the Modified Newmark Method
Authors: Seyed Amin Vakili, Sahar Sadat Vakili, Seyed Ehsan Vakili, Nader Abdoli Yazdi
Abstract:
The purpose of this article is to analyze the finite displacement of columns by applying the Modified Newmark Method. The research is performed on columns subjected to compressive axial load; therefore, the non-linearity of the geometry is also considered. If the considered strut is perfect, the governing differential equation contains a branching point in the solution path. Investigation of the Elastica is part of generalizing the developed method. It presents the ability of the Modified Newmark Method to treat non-linear differential equations derived from elastic strut stability problems. The method not only provides an approximate polynomial solution for the Elastica problems but can also recognize the branching point and the stable solution. This investigation deals with the post-buckling response of elastic, pin-ended columns subjected to central or equally eccentric axial loads.
Keywords: columns, structural modeling, structures & structural stability, loads
Procedia PDF Downloads 314
34904 Genetic Variation among the Wild and Hatchery Raised Populations of Labeo rohita Revealed by RAPD Markers
Authors: Fayyaz Rasool, Shakeela Parveen
Abstract:
Studies on the genetic diversity of Labeo rohita using molecular markers were carried out to investigate the genetic structure with RAPD markers and the levels of polymorphism and similarity amongst different groups of five populations of wild and farmed types. The samples were collected from five different locations as representatives of wild and hatchery-raised populations. RAPD data were analyzed with Jaccard's coefficient following the unweighted pair group method with arithmetic mean (UPGMA) for hierarchical clustering of similar groups on the basis of similarity amongst the genotypes, and the dendrogram generated divided the randomly selected individuals of the five populations into three classes/clusters. The variance decomposition for the optimal classification gave 52.11% within-class variation and 47.89% between-class variation. Principal component analysis (PCA) for grouping the different genotypes from the different environmental conditions was done by the Spearman Varimax rotation method for bi-plot generation of the co-occurrence of the same genotypes with similar genetic properties and primer specificity; it clearly indicated that the increase in the number of factors or components was correlated with a decrease in eigenvalues. Based on the Kaiser criterion of eigenvalues greater than one, the first two main factors accounted for 58.177% of the cumulative variability.
Keywords: variation, clustering, PCA, wild, hatchery, RAPD, Labeo rohita
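The Jaccard/UPGMA portion of the analysis can be sketched with standard scientific Python tools; the 0/1 band matrix below is random and purely illustrative, and UPGMA corresponds to average linkage in scipy.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# Hypothetical RAPD band presence/absence matrix (individuals x bands).
rng = np.random.default_rng(3)
bands = rng.integers(0, 2, size=(10, 25)).astype(bool)

# Jaccard distances between genotypes, then UPGMA (average linkage) clustering.
dist = pdist(bands, metric="jaccard")
tree = linkage(dist, method="average")
clusters = fcluster(tree, t=3, criterion="maxclust")
print(clusters)   # cluster label for each individual
```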
Procedia PDF Downloads 449
34903 Quintic Spline Solution of Fourth-Order Parabolic Equations Arising in Beam Theory
Authors: Reza Mohammadi, Mahdieh Sahebi
Abstract:
We develop a method based on the polynomial quintic spline for the numerical solution of a fourth-order non-homogeneous parabolic partial differential equation with variable coefficient. By using the polynomial quintic spline at off-step points in space and finite differences in the time direction, we obtain two three-level implicit methods. Stability analysis of the presented method has been carried out. We solve four test problems numerically to validate the derived method. Numerical comparison with other methods shows the superiority of the presented scheme.
Keywords: fourth-order parabolic equation, variable coefficient, polynomial quintic spline, off-step points
Procedia PDF Downloads 352
34902 Comparative Performance of Retting Methods on Quality Jute Fibre Production and Water Pollution for Environmental Safety
Authors: A. K. M. Zakir Hossain, Faruk-Ul Islam, Muhammad Alamgir Chowdhury, Kazi Morshed Alam, Md. Rashidul Islam, Muhammad Humayun Kabir, Noshin Ara Tunazzina, Taufiqur Rahman, Md. Ashik Mia, Ashaduzzaman Sagar
Abstract:
The jute retting process is one of the key factors for excellent jute fibre production as well as for maintaining water quality. The traditional method of jute retting is time-consuming and hampers fish cultivation by polluting the water body. Therefore, a low-cost, time-saving, environment-friendly, and improved technique is essential for jute retting to overcome this problem. Thus, the study focused on comparing the extent of water pollution and the fibre quality of two retting systems, i.e., traditional retting practices versus an improved retting method (macha retting), by assessing different physico-chemical and microbiological properties of water and fibre quality parameters. Water samples were collected from the top and bottom of the retting place at the early, mid, and final stages of retting from four districts of Bangladesh, viz., Gaibandha, Kurigram, Lalmonirhat, and Rangpur. Different physico-chemical parameters of the water samples, viz., pH, dissolved oxygen (DO), conductivity (CD), total dissolved solids (TDS), hardness, calcium, magnesium, carbonate, bicarbonate, chloride, phosphorus and sulphur content, were measured. Irrespective of location, the DO of the final-stage retting water samples was very low compared to the mid and early stages, and the DO of the traditional jute retting method was significantly lower than that of the improved macha method. The pH of the water samples was slightly more acidic in the traditional retting method than in the improved macha method. Other physico-chemical parameters of the water samples were found to be higher in the traditional method than in improved macha retting at all stages of retting. Bacterial species were isolated from the collected water samples following the dilution plate technique. Microbiological results revealed that water samples from the improved macha method contained more bacterial species presumed to be involved in jute retting than water samples from the traditional retting method. The bacterial species were then identified by sequencing of the 16S rDNA. Most of the bacterial species identified belong to the genera Pseudomonas, Bacillus, Pectobacterium, and Stenotrophomonas. In addition, the tensile strength of the jute fibre was tested, and the results revealed that the improved macha method gave higher mechanical strength than the traditional method in most of the locations. The overall results indicate that the water and fibre quality were better in the improved macha retting method than in the traditional method. Therefore, the time-saving and cost-friendly improved macha retting method can be widely adopted for the jute retting process to obtain quality jute fibre and to keep the environment clean and safe.
Keywords: jute retting methods, physico-chemical parameters, retting microbes, tensile strength, water quality
Procedia PDF Downloads 158
34901 Generative Adversarial Network for Bidirectional Mappings between Retinal Fundus Images and Vessel Segmented Images
Authors: Haoqi Gao, Koichi Ogawara
Abstract:
Retinal vascular segmentation of color fundus images is the basis of ophthalmic computer-aided diagnosis and large-scale disease screening systems. Early screening of fundus diseases has great value for clinical medical diagnosis. The traditional methods depend on the experience of the doctor, which is time-consuming, labor-intensive, and inefficient. Furthermore, medical images are scarce and fraught with legal concerns regarding patient privacy. In this paper, we propose a new Generative Adversarial Network based on CycleGAN for retinal fundus images. This method can generate not only synthetic fundus images but also the corresponding segmentation masks, which has considerable application value and poses challenges in computer vision and computer graphics. In the results, we evaluate our proposed method both quantitatively and qualitatively. For generated segmented images, our method achieves a Dice coefficient of 0.81 and a PR of 0.89 on the DRIVE dataset. For generated synthetic fundus images, we use a "Toy Experiment" to verify the state-of-the-art performance of our method.
Keywords: retinal vascular segmentation, generative adversarial network, CycleGAN, fundus images
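Since the reported segmentation quality is summarized by the Dice coefficient, a short reference implementation may help; the two toy masks are illustrative and unrelated to the DRIVE dataset.

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice overlap between a predicted and a reference binary vessel mask."""
    pred, target = np.asarray(pred, bool), np.asarray(target, bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

pred = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 1, 1], [0, 0, 1, 0]])
target = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 0, 0]])
print(round(float(dice_coefficient(pred, target)), 3))
```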
Procedia PDF Downloads 144
34900 Study of Stability of a Slope by the Soil Nailed Technique
Authors: Abdelhak Soudani
Abstract:
Using the limit equilibrium method in the geotechnical field is very important for large projects. This work contributes to the understanding and analysis of unstable slopes stabilized by the soil nailing technique, using the GEO-SLOPE software, whose calculation is based on the limit equilibrium method. To achieve our objective, we began with a review of the literature on landslides and slope stabilization techniques. Then, we presented a real slope likely to slip, encountered during the construction of the East-West Highway (M5 stretch between Khemis Miliana and Hoceinia), and applied the soil nailing reinforcement technique to it. The analysis is followed by a parametric study, which shows the impact of the given or chosen parameters on various outcomes. Another method of reinforcement (the use of micro-piles) has been suggested for improving the stability of the slope.
Keywords: slope stability, strengthening, slip, soil nail, GEO-SLOPE
Procedia PDF Downloads 466
34899 A New Approach for Improving Accuracy of Multi Label Stream Data
Authors: Kunal Shah, Swati Patel
Abstract:
Many real-world problems involve data which can be considered as multi-label data streams. Efficient methods exist for multi-label classification in non-streaming scenarios. However, learning in evolving streaming scenarios is more challenging, as the learners must be able to adapt to change using limited time and memory. Classification is used to predict the class of an unseen instance as accurately as possible. Multi-label classification is a variant of single-label classification where a set of labels is associated with a single instance. Multi-label classification is used by modern applications such as text classification, functional genomics, image classification, music categorization, etc. This paper introduces the task of multi-label classification, methods for multi-label classification, and evaluation measures for multi-label classification. Also, a comparative analysis of multi-label classification methods was done on various data sets, first on the basis of theoretical study and then on the basis of simulation.
Keywords: binary relevance, concept drift, data stream mining, MLSC, multiple window with buffer
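The binary relevance keyword refers to training one classifier per label; a minimal streaming-friendly sketch using incremental scikit-learn models is shown below, with a fabricated multi-label chunk standing in for a real stream.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

class BinaryRelevance:
    """One incremental classifier per label, updated chunk by chunk with
    partial_fit so it can follow an evolving data stream."""
    def __init__(self, n_labels):
        self.models = [SGDClassifier(loss="log_loss") for _ in range(n_labels)]

    def partial_fit(self, X, Y):
        for j, model in enumerate(self.models):
            model.partial_fit(X, Y[:, j], classes=[0, 1])

    def predict(self, X):
        return np.column_stack([m.predict(X) for m in self.models])

# Toy chunk of a multi-label stream with 3 labels per instance.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Y = (X[:, :3] > 0).astype(int)

br = BinaryRelevance(n_labels=3)
br.partial_fit(X, Y)
print(br.predict(X[:5]))
```

Handling concept drift, e.g. with the multiple-windows-with-buffer scheme named in the keywords, would sit on top of this by deciding when to retrain or reweight the per-label models.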
Procedia PDF Downloads 584
34898 Modeling of Large Elasto-Plastic Deformations by the Coupled FE-EFGM
Authors: Azher Jameel, Ghulam Ashraf Harmain
Abstract:
In recent years, enriched techniques such as the extended finite element method, the element free Galerkin method, and the coupled finite element-element free Galerkin method have found wide application in modeling different types of discontinuities produced by cracks, contact surfaces, and bi-material interfaces. The extended finite element method faces severe mesh distortion issues while modeling large deformation problems. The element free Galerkin method does not have mesh distortion issues, but it is computationally more demanding than the finite element method. The coupled FE-EFGM proves to be an efficient numerical tool for modeling large deformation problems as it exploits the advantages of both FEM and EFGM. The present paper employs the coupled FE-EFGM to model large elastoplastic deformations in bi-material engineering components. The large deformation occurring in the domain has been modeled by using the total Lagrangian approach. The non-linear elastoplastic behavior of the material has been represented by the Ramberg-Osgood model. Elastic predictor-plastic corrector algorithms are used for the evaluation of stresses during large deformation. Finally, several numerical problems are solved by the coupled FE-EFGM to illustrate its applicability, efficiency and accuracy in modeling large elastoplastic deformations in bi-material samples. The results obtained by the proposed technique are compared with the results obtained by XFEM and EFGM. A remarkable agreement was observed between the results obtained by the three techniques.
Keywords: XFEM, EFGM, coupled FE-EFGM, level sets, large deformation
Procedia PDF Downloads 447
34897 Secure Cryptographic Operations on SIM Card for Mobile Financial Services
Authors: Kerem Ok, Serafettin Senturk, Serdar Aktas, Cem Cevikbas
Abstract:
Mobile technology is very popular nowadays and it provides a digital world where users can experience many value-added services. Service providers are also eager to offer diverse value-added services to users, such as digital identity, mobile financial services and so on. In this context, the security of data storage in smartphones and the security of communication between the smartphone and service provider are critical for the success of these services. In order to provide the required security functions, the SIM card is one acceptable alternative. Since SIM cards include a Secure Element, they are able to store sensitive data, create cryptographically secure keys, and encrypt and decrypt data. In this paper, we design and implement a SIM and a smartphone framework that uses a SIM card for secure key generation, key storage, data encryption, data decryption and digital signing for mobile financial services. Our frameworks show that the SIM card can be used as a controlled Secure Element to provide required security functions for popular e-services such as mobile financial services.
Keywords: SIM card, mobile financial services, cryptography, secure data storage
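The cryptographic operations listed above (key generation, signing, encryption, decryption) are illustrated below with a desktop Python library; in the paper's framework these run inside the SIM's Secure Element, so this is only a functional sketch of the operations, not of the SIM interface, and the payment payloads are hypothetical.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Key generation and digital signing of a (hypothetical) payment order.
private_key = ec.generate_private_key(ec.SECP256R1())
payment_order = b"transfer 100 EUR to account X"
signature = private_key.sign(payment_order, ec.ECDSA(hashes.SHA256()))
private_key.public_key().verify(signature, payment_order, ec.ECDSA(hashes.SHA256()))

# Symmetric encryption and decryption of sensitive data at rest.
key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"sensitive wallet data", None)
print(AESGCM(key).decrypt(nonce, ciphertext, None))
```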
Procedia PDF Downloads 312
34896 Synthetic Data-Driven Prediction Using GANs and LSTMs for Smart Traffic Management
Authors: Srinivas Peri, Siva Abhishek Sirivella, Tejaswini Kallakuri, Uzair Ahmad
Abstract:
Smart cities and intelligent transportation systems rely heavily on effective traffic management and infrastructure planning. This research tackles the data scarcity challenge by generating realistically synthetic traffic data from the PeMS-Bay dataset, enhancing predictive modeling accuracy and reliability. Advanced techniques like TimeGAN and GaussianCopula are utilized to create synthetic data that mimics the statistical and structural characteristics of real-world traffic. The future integration of Spatial-Temporal Generative Adversarial Networks (ST-GAN) is anticipated to capture both spatial and temporal correlations, further improving data quality and realism. Each synthetic data generation model's performance is evaluated against real-world data to identify the most effective models for accurately replicating traffic patterns. Long Short-Term Memory (LSTM) networks are employed to model and predict complex temporal dependencies within traffic patterns. This holistic approach aims to identify areas with low vehicle counts, reveal underlying traffic issues, and guide targeted infrastructure interventions. By combining GAN-based synthetic data generation with LSTM-based traffic modeling, this study facilitates data-driven decision-making that improves urban mobility, safety, and the overall efficiency of city planning initiatives.
Keywords: GAN, long short-term memory (LSTM), synthetic data generation, traffic management
Procedia PDF Downloads 14
34895 An Adaptive Decomposition for the Variability Analysis of Observation Time Series in Geophysics
Authors: Olivier Delage, Thierry Portafaix, Hassan Bencherif, Guillaume Guimbretiere
Abstract:
Most observation data sequences in geophysics can be interpreted as resulting from the interaction of several physical processes at several time and space scales. As a consequence, measurement time series in geophysics often have characteristics of non-linearity and non-stationarity, thereby exhibit strong fluctuations at all time scales, and require a time-frequency representation to analyze their variability. Empirical Mode Decomposition (EMD) is a relatively new technique that is part of a more general signal processing method called the Hilbert-Huang transform. This analysis method turns out to be particularly suitable for non-linear and non-stationary signals; it consists in decomposing a signal in an auto-adaptive way into a sum of oscillating components named IMFs (Intrinsic Mode Functions), thereby acting as a bank of bandpass filters. The advantages of the EMD technique are that it is entirely data driven and that it provides the principal variability modes of the dynamics represented by the original time series. However, the main limiting factor is the frequency resolution, which may give rise to the mode-mixing phenomenon, where the spectral contents of some IMFs overlap each other. To overcome this problem, J. Gilles proposed an alternative entitled "Empirical Wavelet Transform" (EWT), which consists in building a bank of filters from the segmentation of the Fourier spectrum of the original signal. The method is based on the idea utilized in the construction of both Littlewood-Paley and Meyer's wavelets. The heart of the method lies in the segmentation of the Fourier spectrum based on local maxima detection in order to obtain a set of non-overlapping segments. Because it is linked to the Fourier spectrum, the frequency resolution provided by EWT is higher than that provided by EMD and therefore allows the mode-mixing problem to be overcome. On the other hand, while the EWT technique is able to detect the frequencies involved in the fluctuations of the original time series, it does not allow the detected frequencies to be associated with a specific mode of variability, as the EMD technique does. Because EMD is closer to the observation of physical phenomena than EWT, we propose here a new technique called EAWD (Empirical Adaptive Wavelet Decomposition), based on coupling the EMD and EWT techniques and using the spectral density content of the IMFs to optimize the segmentation of the Fourier spectrum required by EWT. In this study, the EMD and EWT techniques are described, and then the EAWD technique is presented. A comparison of the results obtained respectively by the EMD, EWT and EAWD techniques on time series of ozone total columns recorded at Reunion Island over the 1978-2019 period is discussed. This study was carried out as part of the SOLSTYCE project, dedicated to the characterization and modeling of the underlying dynamics of time series issued from complex systems in atmospheric sciences.
Keywords: adaptive filtering, empirical mode decomposition, empirical wavelet transform, filter banks, mode-mixing, non-linear and non-stationary time series, wavelet
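A rough sketch of the first half of the EAWD idea: decompose a signal into IMFs and read off each IMF's dominant frequency, which could then guide the spectrum segmentation. The third-party PyEMD package and the synthetic signal are assumptions; the authors do not name their implementation.

```python
import numpy as np
from PyEMD import EMD  # from the "EMD-signal" package

# Synthetic two-tone signal standing in for an ozone total-column series.
t = np.linspace(0, 10, 1000)
signal = np.sin(2 * np.pi * 0.5 * t) + 0.5 * np.sin(2 * np.pi * 3.0 * t)

imfs = EMD().emd(signal, t)  # auto-adaptive decomposition into IMFs

# Dominant frequency of each IMF; EAWD would use these to place the
# boundaries of the empirical wavelet filter bank.
for k, imf in enumerate(imfs):
    spectrum = np.abs(np.fft.rfft(imf))
    freqs = np.fft.rfftfreq(len(imf), d=t[1] - t[0])
    print(f"IMF {k}: dominant frequency ~ {freqs[spectrum.argmax()]:.2f} Hz")
```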
Procedia PDF Downloads 137
34894 Nature of Forest Fragmentation Owing to Human Population along Elevation Gradient in Different Countries in Hindu Kush Himalaya Mountains
Authors: Pulakesh Das, Mukunda Dev Behera, Manchiraju Sri Ramachandra Murthy
Abstract:
Large numbers of people living in and around the Hindu Kush Himalaya (HKH) region depend on this diverse mountainous region for ecosystem services. Following the global trend, this region is also experiencing rapid population growth and rising demand for timber and agricultural land. The eight countries sharing the HKH region have different forest resource utilization and conservation policies that exert varying forces on the forest ecosystem. This has created variable spatial and altitudinal gradients in the rate of deforestation and the corresponding forest patch fragmentation. The quantitative relationship between fragmentation and demography has not been established before for the HKH region along the elevation gradient. The current study was carried out to relate the overall and country-specific nature of landscape fragmentation along the altitudinal gradient to the demography of each sharing country. We used tree canopy cover data derived from Landsat imagery to analyze the deforestation and afforestation rates and the corresponding landscape fragmentation observed during 2000-2010. The area-weighted mean radius of gyration (AMN radius of gyration) was computed owing to its advantage as a spatial indicator of fragmentation over non-spatial fragmentation indices. Using the subtraction method, the change in fragmentation during 2000-2010 was computed. Using the tree canopy cover data as a surrogate of forest cover, the highest forest loss was observed in Myanmar, followed by China, India, Bangladesh, Nepal, Pakistan, Bhutan, and Afghanistan. However, the sequence of fragmentation was different: the maximum fragmentation was observed in Myanmar, followed by India, China, Bangladesh, and Bhutan, whereas an increase in fragmentation was seen in the sequence Nepal, Pakistan, and Afghanistan. Using the SRTM-derived DEM, we observed a higher rate of fragmentation up to 2400 m, which corroborated the high human population there in 2000 and 2010. To derive the nature of fragmentation along the altitudinal gradient, the Statistica software was used, where a user-defined function was utilized for regression applying the Gauss-Newton estimation method with 50 iterations. We observed an overall logarithmic decrease in fragmentation change (area-weighted mean radius of gyration), forest cover loss and population growth during 2000-2010 along the elevation gradient, with very high R² values (0.889, 0.895, and 0.944, respectively). The observed negative logarithmic function, with the major contribution in the lower elevation range, suggests gap-filling afforestation at lower altitudes to enhance forest patch connectivity. Our findings on the pattern of forest fragmentation and human population across the elevation gradient in the HKH region will have policy-level implications for the different nations and would help in characterizing hotspots of change. The availability of free satellite-derived data products on forest cover and DEM, gridded demography data, and geospatial tools helped in the quick evaluation of forest fragmentation vis-à-vis the human impact pattern along the elevation gradient in the HKH.
Keywords: area-weighted mean radius of gyration, fragmentation, human impact, tree canopy cover
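The regression step can be reproduced in outline with a simple negative-logarithmic fit; the elevation bins and fragmentation-change values below are invented for illustration, and scipy's default least-squares routine stands in for the Gauss-Newton procedure used in Statistica.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_model(elevation, a, b):
    """Negative logarithmic trend of fragmentation change with elevation."""
    return a - b * np.log(elevation)

# Illustrative elevation bins (m) and fragmentation-change values.
elevation = np.array([200.0, 600, 1000, 1400, 1800, 2200, 2600, 3000])
frag_change = np.array([9.1, 6.8, 5.9, 5.1, 4.6, 4.3, 4.0, 3.8])

params, _ = curve_fit(log_model, elevation, frag_change)
pred = log_model(elevation, *params)
r2 = 1 - np.sum((frag_change - pred) ** 2) / np.sum((frag_change - frag_change.mean()) ** 2)
print("fitted (a, b):", params, " R^2:", round(r2, 3))
```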
Procedia PDF Downloads 215
34893 Multiobjective Optimization of a Pharmaceutical Formulation Using Regression Method
Authors: J. Satya Eswari, Ch. Venkateswarlu
Abstract:
The formulation of a commercial pharmaceutical product involves several composition factors and response characteristics. When the formulation is required to satisfy multiple conflicting response characteristics, an optimal solution calls for an efficient multiobjective optimization technique. In this work, a regression model is combined with a non-dominated sorting differential evolution (NSDE) involving Naïve & Slow and ε-constraint techniques to derive different multiobjective optimization strategies, which are then evaluated by means of a trapidil pharmaceutical formulation. The analysis of the results shows the effectiveness of the strategy that combines the regression model and NSDE with the integration of both the Naïve & Slow and ε-constraint techniques for Pareto optimization of the trapidil formulation. With this strategy, the optimal formulation at pH = 6.8 is obtained with the decision variables of microcrystalline cellulose, hydroxypropyl methylcellulose and compression pressure. The corresponding response characteristics of rate constant and release order are also noted. The comparison of these results with the experimental data and with those of other multiple regression model based multiobjective evolutionary optimization strategies signifies the better performance of this strategy for the optimal trapidil formulation.
Keywords: pharmaceutical formulation, multiple regression model, response surface method, radial basis function network, differential evolution, multiobjective optimization
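The non-dominated sorting at the core of NSDE separates candidate formulations into Pareto fronts; a minimal sketch of extracting the first front is given below, with invented conflicting (to-be-minimized) responses standing in for the trapidil objectives.

```python
import numpy as np

def pareto_front(objectives):
    """Indices of non-dominated points, all objectives to be minimized."""
    objectives = np.asarray(objectives, dtype=float)
    n = len(objectives)
    dominated = np.zeros(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(objectives[j] <= objectives[i]) \
                      and np.any(objectives[j] < objectives[i]):
                dominated[i] = True
                break
    return np.where(~dominated)[0]

# Toy candidate formulations scored on two conflicting responses.
candidates = [[0.9, 2.1], [1.2, 1.5], [0.8, 2.4], [1.5, 1.4], [1.1, 1.9], [1.3, 2.2]]
print(pareto_front(candidates))  # the last candidate is dominated
```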
Procedia PDF Downloads 409
34892 A Conjugate Gradient Method for Large Scale Unconstrained Optimization
Authors: Mohammed Belloufi, Rachid Benzine, Badreddine Sellami
Abstract:
Conjugate gradient methods are useful for solving large-scale optimization problems in scientific and engineering computation, characterized by the simplicity of their iteration and their low memory requirements. It is well known that the search direction plays a main role in the line search method. In this paper, we propose a search direction with the Wolfe line search technique for solving unconstrained optimization problems. Under the above line searches and some assumptions, the global convergence properties of the given methods are discussed. Numerical results and comparisons with other CG methods are given.
Keywords: unconstrained optimization, conjugate gradient method, strong Wolfe line search, global convergence
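A minimal sketch of a conjugate gradient iteration driven by a Wolfe line search: the Fletcher-Reeves update is used here purely for illustration (it is not the specific direction proposed in the paper), and scipy.optimize.line_search, which enforces the strong Wolfe conditions, supplies the step length.

```python
import numpy as np
from scipy.optimize import line_search

def cg_wolfe(f, grad, x0, max_iter=500, tol=1e-6):
    """Conjugate gradient with a Fletcher-Reeves update and Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:        # line search failed: restart along -gradient
            d, alpha = -g, 1e-4
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x

# Test on the Rosenbrock function.
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                           200 * (x[1] - x[0] ** 2)])
print(cg_wolfe(f, grad, [-1.2, 1.0]))
```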
Procedia PDF Downloads 422
34891 Unconventional Explorers: Gen Z Travelers Redefining the Travel Experience
Authors: M. Panidou, F. Kilipiris, E. Christou, K. Alexandris
Abstract:
This study investigates the travel preferences of Generation Z (born between 1996 and 2012), focusing on their inclination towards unique and unconventional travel experiences, their prioritization of authentic cultural immersion and local experiences over traditional tourist attractions, and the value they place on flexibility and spontaneity in travel plans. By examining these aspects, the research aims to provide insights into the preferences and behaviors of Generation Z travelers, contributing to a better understanding of their travel choices and informing the tourism industry in catering to their needs and desires. Secondary data was gathered from academic literature and industry reports to offer a thorough study of the topic. A quantitative method was used, and primary data was collected through an online questionnaire. The study's sample consisted of one hundred Greek people between the ages of eighteen and twenty-seven. SPSS software was used to assist in the analysis of the data. The findings of the research showed that Gen Z is attracted to unusual and distinctive travel experiences, prioritizes genuine cultural immersion over typical tourist attractions, and highly values flexibility in travel decision-making. This research contributes to a deeper understanding of how Gen Z travelers are reshaping the travel industry. Travel companies, marketers, and destination management organizations will find the findings useful in adjusting their products to suit this influential demographic's changing demands and preferences. Considering the limitations of the sample size, future studies could expand the sample to include individuals from different cultural backgrounds for a more comprehensive understanding.
Keywords: cultural immersion, flexibility, generation Z, travel preferences, unique experiences
Procedia PDF Downloads 60