J. Palacin

Universitat de Lleida, Lérida, Catalonia, Spain

Publications (55) · 52.05 Total impact

  • ABSTRACT: This paper presents an automatic method for counting red grapes from high-resolution images of vineyards taken under artificial lighting at night. The proposed method is based on detecting the specular reflection peaks from the spherical surface of the grapes. These intensity peaks are detected by means of a morphological peak detector based on the definition of one central point and several radial points. The morphological condition applied is that the intensity of the central point must be higher than that of all the radial points. The grape counting results obtained in different occlusion conditions were compared with a manual labeling procedure. On average, the percentage of extremely occluded grapes (occlusion higher than 75%) in the clusters was 33%, whereas the average counting error obtained with the proposed automatic method was −14%, with only 7% false positives, confirming that this proposal can count even highly occluded grapes.
    Computers and Electronics in Agriculture 10/2014; 108:105–111. · 1.77 Impact Factor
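    The following is a minimal sketch, in Python with NumPy and not the authors' code, of the morphological peak test described in the entry above: a pixel is accepted as a specular-reflection peak when its intensity exceeds that of every point sampled on a surrounding circle. The radius, the number of radial samples and the minimum intensity are illustrative assumptions.

        # Sketch of the morphological peak test: center must outshine all radial samples.
        import numpy as np

        def detect_specular_peaks(gray, radius=5, n_radial=8, min_intensity=128):
            """Return (row, col) candidates where the center exceeds all radial samples."""
            h, w = gray.shape
            angles = np.linspace(0.0, 2.0 * np.pi, n_radial, endpoint=False)
            dr = np.round(radius * np.sin(angles)).astype(int)
            dc = np.round(radius * np.cos(angles)).astype(int)
            peaks = []
            for r in range(radius, h - radius):
                for c in range(radius, w - radius):
                    center = gray[r, c]
                    if center < min_intensity:
                        continue  # too dark to be a specular reflection
                    if all(center > gray[r + i, c + j] for i, j in zip(dr, dc)):
                        peaks.append((r, c))
            return peaks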
  • ABSTRACT: This paper proposes the use of an autonomous assistant mobile robot to monitor the environmental conditions of a large indoor area and develop an ambient intelligence application. The mobile robot uses high-performance embedded sensors to collect and geo-reference environmental information such as ambient temperature, air velocity and orientation, and gas concentration. The data collected with the assistant mobile robot are analyzed in order to detect unusual measurements or discrepancies and to develop focused corrective ambient actions. This paper shows an example of measurements performed in a research facility, which enabled the detection and localization of an uncomfortable temperature profile inside one of its offices. The ambient intelligence application was developed by performing localized ambient measurements, which were then analyzed in order to propose ambient actuations that correct the uncomfortable temperature profile.
    Sensors 01/2014; 14(4):6045-6055. · 2.05 Impact Factor
  • ABSTRACT: Using an autonomous mobile robot to locate gas leaks and monitor air quality in indoor environments is a promising way to avoid risky human operations. However, these are challenging tasks due to the chaotic propagation of the gas profile caused by uncontrolled air flows. This paper proposes the localization of an acetone gas leak in a 44 m-long indoor corridor with a mobile robot equipped with a PID sensor. This paper assesses the influence of the mobile robot velocity and of the relative height of the PID sensor on the profile of the measurements. The results show a weak influence of the robot velocity and a strong influence of the relative height of the PID sensor. An estimate of the gas-leak location is also obtained by computing the center of mass of the highest gas concentrations.
    Sensor Letters 01/2014; 12. · 0.52 Impact Factor
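    As a rough illustration of the leak-location estimate mentioned in the entry above (not the paper's code), the sketch below computes the concentration-weighted center of mass of the positions with the highest gas readings; the 90th-percentile threshold is an assumption for the example.

        # Center of mass of the highest gas concentrations along the robot path.
        import numpy as np

        def estimate_leak_position(positions, concentrations, percentile=90):
            """positions: (N, 2) array of robot x,y samples; concentrations: (N,) PID readings."""
            positions = np.asarray(positions, dtype=float)
            concentrations = np.asarray(concentrations, dtype=float)
            threshold = np.percentile(concentrations, percentile)
            mask = concentrations >= threshold
            weights = concentrations[mask]
            # Concentration-weighted centroid of the highest readings.
            return (positions[mask] * weights[:, None]).sum(axis=0) / weights.sum()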
  • ABSTRACT: This paper presents an image processing method for automatic, individual, in-line nectarine variety verification in a fruit-packing line. The method is based on feature histogram vectors obtained by concatenating the histograms computed from different color layers of a circular central area of the skin of each nectarine processed. The verification procedure requires the definition of a small dataset with the feature histogram vectors corresponding to some manually selected reference nectarines whose skin clearly identifies the variety being processed. The in-line variety verification of each nectarine processed is then done by computing its current feature histogram vector and comparing it with the reference dataset. This paper experimentally compares different alternatives for computing the feature histogram vectors and two methods for feature comparison and variety verification. The experimental validation consists of the automatic in-line processing of nectarine samples from different mixed varieties. The results show an 86% success rate in the case of an expert human operator and 100% when using feature histogram vectors computed in the Rg (red and gray) or YR̄ (luminance and normalized red) intensity color layers and correlation to compare the feature vectors.
    Computers and Electronics in Agriculture 01/2014; 102:112–119. · 1.77 Impact Factor
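    A minimal sketch of the verification idea described in the entry above, not the paper's implementation: per-layer histograms of the central circular patch are concatenated into a feature vector, and the fruit is accepted when the correlation against any reference vector exceeds a threshold. The bin count and the acceptance threshold are assumptions.

        # Concatenated color-layer histograms used as a variety "fingerprint".
        import numpy as np

        def feature_histogram_vector(patch_layers, bins=64):
            """patch_layers: list of 2-D arrays (e.g. red and gray layers of the central patch)."""
            hists = []
            for layer in patch_layers:
                h, _ = np.histogram(layer, bins=bins, range=(0, 255), density=True)
                hists.append(h)
            return np.concatenate(hists)

        def same_variety(candidate_vec, reference_vecs, min_corr=0.9):
            """Correlation-based comparison against the reference dataset."""
            corrs = [np.corrcoef(candidate_vec, ref)[0, 1] for ref in reference_vecs]
            return max(corrs) >= min_corr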
  • ABSTRACT: This paper proposes the development of an automatic fruit harvesting system that combines a robotic arm with a low-cost stereovision camera placed in its gripper tool. The stereovision camera is used to estimate the size, distance and position of the fruits, whereas the robotic arm is used to mechanically pick up the fruits. The low-cost stereovision system has been tested in laboratory conditions with a small reference object, an apple and a pear at 10 different intermediate distances from the camera. The average distance error was from 4% to 5%, and the average diameter error was up to 30% in the case of the small object and in a range from 2% to 6% in the case of the pear and the apple. The stereovision system has been attached to the gripper tool in order to obtain the relative distance, orientation and size of the fruit. The harvesting stage requires the initial fruit location, the computation of the inverse kinematics of the robotic arm in order to place the gripper tool in front of the fruit, and a final pickup approach that iteratively adjusts the vertical and horizontal position of the gripper tool in a closed visual loop. The complete system has been tested in controlled laboratory conditions with uniform illumination applied to the fruits. As future work, this system will be tested and improved in conventional outdoor farming conditions.
    Sensors 01/2014; 14(7):11557-11579.
  • ABSTRACT: This work proposes the detection of red peaches in orchard images based on the definition of different linear color models in the RGB vector color space. The classification and segmentation of the image pixels are then performed by comparing the color distance from each pixel to the previously defined linear color models. The proposed methodology has been tested with images obtained in a real orchard under natural light. The peach variety in the orchard was the paraguayo (Prunus persica var. platycarpa) peach with red skin. The segmentation results showed that the area of the red peaches in the images was detected with an average error of 11.6%; 19.7% in the case of bright illumination; 8.2% in the case of low illumination; 8.6% for occlusion up to 33%; 12.2% in the case of occlusion between 34% and 66%; and 23% for occlusion above 66%. Finally, a methodology was proposed to estimate the diameter of the fruits based on an ellipsoidal fitting. A first diameter was obtained by using all the contour pixels and a second diameter was obtained by rejecting some pixels of the contour. Comparing the two diameter estimates enables a rough estimate of the fruit occlusion percentage range.
    Sensors 01/2012; 12(6):7701-18. · 2.05 Impact Factor
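    The sketch below illustrates pixel classification by distance to a linear color model, as described in the entry above; it is not the authors' exact models. Each model is represented here as a 3-D line in RGB space defined by two sample colors, and a pixel is labeled as peach when its distance to the closest line falls below a threshold (the line endpoints and the threshold are assumptions).

        # Distance from an RGB pixel to a 3-D line (linear color model) and segmentation.
        import numpy as np

        def point_to_line_distance(p, a, b):
            """Euclidean distance from RGB point p to the 3-D line through points a and b."""
            p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
            d = b - a
            return np.linalg.norm(np.cross(p - a, d)) / np.linalg.norm(d)

        def segment_red_peach(image, color_lines, max_dist=30.0):
            """image: (H, W, 3) RGB array; color_lines: list of (a, b) RGB point pairs."""
            h, w, _ = image.shape
            mask = np.zeros((h, w), dtype=bool)
            for r in range(h):
                for c in range(w):
                    dists = [point_to_line_distance(image[r, c], a, b) for a, b in color_lines]
                    mask[r, c] = min(dists) < max_dist
            return mask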
  • ABSTRACT: This work proposes the development of an embedded real-time fruit detection system for future automatic fruit harvesting. The proposed embedded system is based on an ARM Cortex-M4 (STM32F407VGT6) processor and an Omnivision OV7670 color camera. The future goal of this embedded vision system will be to control a robotized arm to automatically select and pick some fruit directly from the tree. The complete embedded system has been designed to be placed directly in the gripper tool of the future robotized harvesting arm. The embedded system will be able to perform real-time fruit detection and tracking by using a three-dimensional look-up table (LUT) defined in the RGB color space and optimized for fruit picking. Additionally, two different methodologies for creating optimized 3D LUTs, based on existing linear color models and on fruit histograms, were implemented in this work and compared for the case of red peaches. The resulting system is able to acquire general and zoomed orchard images and to update the relative tracking information of a red peach in the tree ten times per second.
    Sensors 01/2012; 12(10):14129-43. · 2.05 Impact Factor
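    A rough sketch of the 3-D RGB look-up-table idea described in the entry above, not the embedded firmware: the LUT is built off-line by marking the RGB cells that match the target fruit color and is then queried per pixel in constant time. The 32x32x32 quantization (5 bits per channel) is an assumption for the example.

        # Build and query a boolean 3-D RGB look-up table for fruit pixel classification.
        import numpy as np

        BITS = 5
        LUT = np.zeros((1 << BITS,) * 3, dtype=bool)  # 32 x 32 x 32 cells

        def lut_index(r, g, b):
            shift = 8 - BITS
            return r >> shift, g >> shift, b >> shift

        def train_lut(fruit_pixels):
            """fruit_pixels: iterable of (r, g, b) samples taken from labeled fruit regions."""
            for r, g, b in fruit_pixels:
                LUT[lut_index(r, g, b)] = True

        def is_fruit_pixel(r, g, b):
            """Constant-time classification suitable for a real-time embedded loop."""
            return LUT[lut_index(r, g, b)]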
  • ABSTRACT: This work presents a preliminary study of an automatic color-based nectarine variety classification system. This system is planned to operate in a fruit-packing line in order to help increase the quality of an industrial post-harvesting process by verifying the nectarine variety. The proposal is based on the computation of the histogram of the hue, saturation, and value (HSV) skin color description of the nectarines analyzed. The histogram is used as a fingerprint to classify the nectarine varieties against a reference database of target varieties. Six different varieties have been analyzed in this preliminary study: one yellowish and five reddish. Results show a classification success of 70% for the yellowish variety, whereas two reddish varieties show a very low classification success of 20%. These preliminary results evidence the complexity of the classification problem, caused by the skin color similarity between varieties, and the associated production quality problem, as different varieties may differ in taste and/or flesh color.
    Conference Record - IEEE Instrumentation and Measurement Technology Conference 01/2012;
  • ABSTRACT: This work presents a preliminary study of pupil detection and tracking with low-cost optical flow sensors. In this proposal, the low-cost optical flow sensor originally designed for the optical computer mouse is used in a pupil detection and tracking device. The application is based on the use of this sensor as an imaging device combined with a microcontroller. In the future, the tracking results will be compared with the automatic measurements performed by the sensor. First tracking results show promising performance and confirm the versatility of this kind of low-cost optical flow sensor.
    Conference Record - IEEE Instrumentation and Measurement Technology Conference 01/2012;
  • ABSTRACT: This paper presents the use of an external fixed two-dimensional laser scanner to detect cylindrical targets attached to moving devices, such as a mobile robot. The proposal is based on detecting circular markers in the raw data provided by the laser scanner by applying an algorithm for outlier avoidance and a least-squares circular fitting. Several experiments have been carried out to empirically validate the proposal with different cylindrical targets in order to estimate the location and tracking errors achieved, which are generally less than 20 mm in the area covered by the laser sensor. As a result of the validation experiments, several error maps have been obtained in order to give an estimate of the uncertainty of any computed location. This proposal has been validated with a medium-sized mobile robot with an attached cylindrical target (diameter 200 mm). The trajectory of the mobile robot was estimated with an average location error of less than 15 mm, and the real location error in each individual circular fitting was similar to the error estimated with the obtained error maps. The radial range covered in this validation experiment was up to 10 m, a value that depends on the radius of the cylindrical target and the radial density of the distance range points provided by the laser scanner, but this range can be increased by combining the information of additional external laser scanners.
    Sensors 01/2012; 12(12):16482-97. · 2.05 Impact Factor
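    The sketch below shows one common least-squares circular fit of 2-D laser points (the algebraic Kasa fit), illustrating the kind of fitting described in the entry above; the paper's outlier-avoidance algorithm is reduced here to a simple residual threshold, and the units and threshold value are assumptions.

        # Algebraic least-squares circle fit: solves x^2 + y^2 + D*x + E*y + F = 0.
        import numpy as np

        def fit_circle(points):
            """points: (N, 2) array of x,y laser hits on the cylindrical target."""
            pts = np.asarray(points, dtype=float)
            x, y = pts[:, 0], pts[:, 1]
            A = np.column_stack([x, y, np.ones_like(x)])
            b = -(x ** 2 + y ** 2)
            (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
            cx, cy = -D / 2.0, -E / 2.0
            radius = np.sqrt(cx ** 2 + cy ** 2 - F)
            return (cx, cy), radius

        def fit_circle_robust(points, max_residual=0.02):
            """Refit after discarding points far from the first fitted circle (units assumed in meters)."""
            (cx, cy), r = fit_circle(points)
            pts = np.asarray(points, dtype=float)
            residuals = np.abs(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy) - r)
            return fit_circle(pts[residuals < max_residual])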
  • ABSTRACT: In this paper, a two-degree-of-freedom (DoF) turtle hydrofoil mechanism is implemented and tested in a water channel as a propulsion system for an Autonomous Underwater Vehicle (AUV). The experiments carried out yielded an optimal empirical value of the angle of attack for the turtle hydrofoil, which is compared with the theoretical value. The hydrofoil path used in the tests was a linear displacement, and the optimization of this path will be analyzed in future work.
    OCEANS, 2011 IEEE - Spain; 07/2011
  • ABSTRACT: This work describes the analysis of different walking paths registered using a Light Detection And Ranging (LIDAR) laser range sensor in order to measure oscillating trajectories during unsupervised walking. The estimates of the gait and trajectory parameters were obtained with a terrestrial LIDAR placed 100 mm above the ground with the scanning plane parallel to the floor, measuring the trajectory of the legs without attaching any markers or modifying the floor. Three different long walking experiments were performed to test the proposed measurement system with straight and oscillating trajectories. The main advantages of the proposed system are the possibility of measuring several steps to obtain average gait parameters and the minimal infrastructure required. This measurement system enables the development of new ambulatory applications based on the analysis of the gait and the trajectory during a walk.
    Sensors 01/2011; 11(5):5071-86. · 2.05 Impact Factor
  • ABSTRACT: This paper presents the design and implementation of a turtle hydrofoil for an Autonomous Underwater Vehicle (AUV). The final design of the AUV must have navigation performance similar to that of a turtle, which has also been the biomimetic inspiration for the design of the hydrofoil and propulsion system. The hydrofoil design is based on a National Advisory Committee for Aeronautics (NACA) 0014 hydrodynamic profile. During the design stage, four different propulsion systems were compared in terms of propulsion path, compactness, sealing and required power. The final implementation is based on a ball-and-socket mechanism because it is very compact and provides three degrees of freedom (DoF) to the hydrofoil with very few restrictions on the propulsion path. The propulsion obtained with the final implementation of the hydrofoil has been empirically evaluated in a water channel by comparing different motion strategies. The results obtained confirm that the proposed turtle hydrofoil, controlled with a three-DoF mechanism, generates propulsion that can be used in the future implementation of the planned AUV.
    Sensors 01/2011; 11(12):11168-87. · 2.05 Impact Factor
  • ABSTRACT: In this paper, a low-cost optical flow sensor is combined with an external laser device to measure surface displacements and mechanical oscillations. The measurement system is based on applying coherent light to a diffuser surface and using an optical flow sensor to analyze the reflected and transmitted light to estimate the displacement of the surface or the laser spot. This work is focused on the characterization of this measurement system, in which the optical flow sensor can be placed at different angles and distances from the diffuser surface. The results have shown that the displacement of the diffuser surface is badly estimated when the optical mouse sensor is placed in front of the diffuser surface (angular orientation >150°), while the highest sensitivity is obtained when the sensor is located behind the diffuser surface and on the axis of the laser source (angular orientation 0°). In this case, the coefficient of determination of the measured displacement, R², was very high (>0.99), with a relative error of less than 1.29%. Increasing the distance between the surface and the sensor also increased the sensitivity, which grows linearly (R² = 0.99). Finally, this measurement setup was proposed to measure very low frequency mechanical oscillations applied to the laser device, up to 0.01 Hz in this work. The results have shown that increasing the distance between the surface and the optical flow sensor also increases the sensitivity and the measurement range.
    Sensors 01/2011; 11(12):11856-70. · 2.05 Impact Factor
  • ABSTRACT: The use of terrestrial LIDARs in agriculture enables the measurement of structural parameters of orchards, such as the volume of the trees. The sequence of two-dimensional scans performed with a LIDAR attached to a tractor can be interpreted as the three-dimensional silhouette of the trees of the grove and used to estimate their volume. In this work, the sensitivity of the tree volume estimates to different error sources in the estimated spatial trajectory of the LIDAR is analyzed. Tests with pear trees have demonstrated that the volume estimation is very sensitive to errors in the determination of the distance from the LIDAR to the center of the trees (with errors up to 30% for a distance error of 50 mm) and in the determination of the orientation angle of the LIDAR (with errors up to 30% for misalignments of 2°). Therefore, any experimental procedure for tree volume estimation based on a motorized terrestrial LIDAR scanner must include additional devices or procedures to control, or to estimate and correct, these error sources.
    Agricultural and Forest Meteorology 10/2010; 150(11):1420-1427. · 3.89 Impact Factor
  • ABSTRACT: In this paper, the floor-cleaning coverage performance of some domestic mobile robots is measured, analyzed and modeled. Results obtained in a reduced scenario show that floor-cleaning coverage is complete in all cases if the path-planning exploration algorithm has some random component. Additionally, the evolution of the area cleaned by the mobile robot, expressed in a distance domain, has an exponential shape that can be modeled with a single exponential, where the amplitude defines the maximum cleaning coverage achieved and the time constant defines the dynamic evolution of the coverage. Both parameters are robot-dependent and can be estimated if the area of the room is known; floor-cleaning coverage can then be predicted and over-cleaning minimized.
    Robotics and Autonomous Systems 01/2010; 58:37-45.
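    As an illustration of the single-exponential coverage model described in the entry above, the sketch below assumes coverage(d) = A * (1 - exp(-d / tau)), where d is the traveled distance, A the maximum coverage and tau the distance constant, and fits both parameters with SciPy; parameter names and the initial guess are illustrative.

        # Fit the single-exponential cleaning-coverage model to measured data.
        import numpy as np
        from scipy.optimize import curve_fit

        def coverage_model(distance, amplitude, tau):
            return amplitude * (1.0 - np.exp(-distance / tau))

        def fit_coverage(distance_samples, coverage_samples):
            """Estimate (amplitude, tau) from measured coverage vs. traveled distance."""
            p0 = (max(coverage_samples), np.mean(distance_samples))  # rough initial guess
            params, _ = curve_fit(coverage_model, distance_samples, coverage_samples, p0=p0)
            return params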
  • ABSTRACT: In this paper, the inexpensive optical sensor of a common computer mouse is proposed to measure yarn diameter. This measurement allows progressive control of the yarn production processes in the textile industry. The optical mouse sensor is an inexpensive device that includes all parts of a complete vision system compacted into a reduced size: illumination, lens, and CMOS optical sensor. The selected CMOS sensor has a two-dimensional array of 30×30 pixels that can be used to obtain the diameter at different positions of the yarn with a resolution of 60 μm. Additionally, the sensor can work at very high frame rates, allowing the development of real-time monitoring applications.
    Procedia Engineering 01/2010; 5:236-239.
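    A minimal sketch of a diameter estimate from one 30x30 frame of the mouse sensor, assuming the yarn appears as a dark band on a brighter background; this is not the paper's algorithm, and the threshold and the per-pixel pitch (taken here as the stated 60 μm resolution) are assumptions.

        # Estimate yarn diameter by counting dark pixels per row of a 30x30 frame.
        import numpy as np

        PIXEL_PITCH_UM = 60.0  # assumed size covered by one sensor pixel

        def yarn_diameter_um(frame, threshold=100):
            """frame: (30, 30) array of 8-bit intensities; returns mean diameter in micrometers."""
            dark = frame < threshold        # pixels covered by the yarn
            widths = dark.sum(axis=1)       # yarn width (in pixels) on every row
            widths = widths[widths > 0]     # ignore rows where the yarn is absent
            return float(widths.mean()) * PIXEL_PITCH_UM if widths.size else 0.0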
  • ABSTRACT: This work proposes the creation of a bioinspired electronic white cane for blind people using the whiskers principle for short-range navigation and exploration. Whiskers are coarse hairs on an animal's face that tell the animal, through the nerves of the skin, that it has touched something. In this work, the raw data acquired from a small terrestrial LIDAR and a tri-axial accelerometer are converted into tactile information using several electromagnetic devices configured as a tactile belt. The LIDAR and the accelerometer are attached to the user's forearm and connected with a wire to the control unit placed on the belt. Early validation experiments carried out in the laboratory are promising in terms of usability and description of the environment.
    Sensors 01/2010; 10(12):11322-39. · 2.05 Impact Factor
  • RASI. 01/2010; 7:39-44.
  • ABSTRACT: In this paper, the optical mouse sensor is proposed as the main sensor of an absolute rotary encoder. The image acquired by the optical sensor is used to read a positioning binary code printed on an internal rotary surface in order to obtain the absolute rotary position. The acquisition of a two-dimensional image matrix is more robust than the single-reading-point approach used in other absolute rotary encoders (with only one pixel), where the position must be coded using random sequence codes. Additionally, the low cost of the optical sensor facilitates the development of inexpensive absolute rotary encoders.
    Sensors and Actuators A: Physical 01/2010;
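    An illustrative sketch of reading an absolute position code from one image row of the mouse sensor, not the paper's decoder: the row is split into bit cells and each cell is thresholded to recover the printed binary code. The number of bits, the threshold and the bit order are assumptions.

        # Decode a printed binary position code from one row of sensor pixels.
        import numpy as np

        def decode_position(row_pixels, n_bits=8, threshold=128):
            """row_pixels: 1-D array of intensities across the printed code track."""
            cells = np.array_split(np.asarray(row_pixels), n_bits)
            bits = [1 if np.mean(c) > threshold else 0 for c in cells]
            position = 0
            for bit in bits:                # most-significant bit first (assumed)
                position = (position << 1) | bit
            return position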