Modelling vision angles of optical camera zoom using image processing
algorithm
Heba Kh. Abbas 1, Haidar J. Mohamad 2, Anwar H. Al-Saleh 3 and Ali A. Al-Zuky 2
1 Department of Physics, College of Science for Women, University of Baghdad, Baghdad, Iraq
2 Department of Physics, College of Science, Mustansiriyah University, Baghdad, Iraq
3 Department of Computer Science, College of Science, Mustansiriyah University, Baghdad, Iraq
physics_heba@yahoo.com
haidar.mohamad@uomustansiriyah.edu.iq
anwar.h.m@uomustansiriyah.edu.iq
prof.alialzuky@uomustansiriyah.edu.iq
Abstract
The challenge of adding a new feature to the optical properties of a camera is addressed in this paper. Image processing makes it possible to enhance the quality, resolution, and details of images quickly. Within this paper, the introduced algorithm is used to determine the vision angles (horizontal and vertical) of a digital camera (Nikon). The quality and efficiency of the camera are evaluated by determining the spatial resolution of the captured images. A scale-factor algorithm is also used. A mathematical model is introduced to describe the camera vision angle; this model works with different zoom degrees and different camera-object distances. Thus, a general mathematical equation is obtained which describes the real image. Comparisons between the real and resulting images show strong agreement with a small error rate.
Keywords: digital camera algorithm, scale factor, spatial resolution, field of view.
1. INTRODUCTION
In older cameras, images were captured chemically on film and then printed on photographic paper, which means the imaging process was a chemical reaction [1]. The presence of an electronic processor inside the camera allows many operations to be performed on the captured image, such as editing and deletion, and enables the recording of short videos [2, 3].
Many studies of the camera field of view have been performed because it is considered an important parameter in object detection [4, 5], and the field of view of satellite remote sensing instruments, especially instruments in space, can change during operation, where comparison of dimensions has revealed small differences [6]. Previous studies did not test the quality of cameras in terms of their vertical and horizontal vision angles. Therefore, this study focuses on determining the horizontal and vertical angles (θh and θw), as shown in Fig. 3, of a Nikon digital camera using different optical zoom degrees (Z), and then estimating the best mathematical function to represent the relationship between the camera vision angles and the object distance for different zoom degrees. Similar work was done previously, against which the data computed from the introduced model can be compared [7]. The objective of this study is to suggest a mathematical model for estimating the vertical and horizontal vision angles of a digital camera from the known dimensions of an object. This model is estimated by changing the optical zoom of the camera and the distance between the camera and the object. This study shows that there is a chance to add new information to any camera specification.
2. CAMERA MAIN PARAMETERS
Different digital cameras have different sensor sizes. The width and height of the sensor are the keys to determining the field of vision. The relation between sensor size and focal length determines the angle of vision (AOV) and the field of vision (FOV). A wide angle of vision captures a larger real area, while a smaller angle covers a smaller real area. The camera vision angle can be changed by changing the camera lens focal length [8, 9].
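As a rough illustration of this relation (not part of the original paper), the angle of vision for one sensor dimension d and focal length f can be written as AOV = 2 tan⁻¹(d / 2f). The short Python sketch below assumes nominal DX-format sensor dimensions of about 23.5 x 15.6 mm, which are an assumption rather than a value quoted in this paper:

```python
import math

def angle_of_vision(sensor_dim_mm, focal_length_mm):
    """Angle of vision (degrees) for one sensor dimension at a given focal length."""
    return 2.0 * math.degrees(math.atan(sensor_dim_mm / (2.0 * focal_length_mm)))

# Assumed nominal DX-format sensor size (not quoted in the paper)
SENSOR_W_MM, SENSOR_H_MM = 23.5, 15.6

for f in (18, 24, 35, 45, 55):  # the zoom degrees used later in this study
    print(f"f = {f:2d} mm: "
          f"horizontal AOV = {angle_of_vision(SENSOR_W_MM, f):5.1f} deg, "
          f"vertical AOV = {angle_of_vision(SENSOR_H_MM, f):5.1f} deg")
```

As expected from the formula, shorter focal lengths (lower zoom degrees) give wider angles of vision.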
The field of vision is the extent that the lens covers at a certain distance. It can be classified into the angular field of vision (AFOV), which is exactly the vision angle, and the linear field of vision (LFOV), which is expressed in units of distance and requires knowledge of the distance from the lens to the subject [10].
Camera resolution can be defined as the ability of the optical system to record accurate details by distinguishing between two adjacent spatial, spectral, radiometric, or temporal signals. It also describes the details of the image: the higher the clarity, the more detailed the picture will be. Many discrete components, including the viewing system, the recording system, and the camera lenses, each have an impact on the resolution of the system, in addition to the environment, which plays an important role in the photography process [11, 12].
Camera zoom is the process of changing (increasing or decreasing) the spatial resolution of the captured images. There are two types of camera zoom: the first is optical zoom, which changes the image spatial resolution by changing the distance between the camera lens and the camera sensor; the second is digital zoom, where the spatial resolution is changed by changing the image size [13-15].
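As an aside, digital zoom is commonly realised as a centre crop followed by resampling back to the original pixel count, which is what changes the effective spatial resolution. A minimal sketch, assuming the Pillow library (not used by the authors, whose implementation was in MATLAB):

```python
from PIL import Image  # Pillow is an assumption; the paper's code was written in MATLAB

def digital_zoom(img, factor):
    """Emulate digital zoom: crop the central 1/factor region and resample it
    back to the original pixel size, lowering the effective spatial resolution."""
    w, h = img.size
    cw, ch = int(w / factor), int(h / factor)
    left, top = (w - cw) // 2, (h - ch) // 2
    cropped = img.crop((left, top, left + cw, top + ch))
    return cropped.resize((w, h), Image.BILINEAR)

# Hypothetical usage with a placeholder file name:
# zoomed = digital_zoom(Image.open("scene.jpg"), factor=2.0)
```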
3. EXPERIMENTAL AND ALGORITHM
The camera vision angle represents a very important property of a camera. The camera vision angles at a constant zoom degree are measured by calculating the projected area of the scene object in the 2D image plane, since any zoom change in the camera settings changes the projected area covered by a real object in the image plane [9].
The camera angle, or shooting angle, is the angle of view of the scene that will be recorded or imaged. The study is based on the Nikon digital camera (D3200 SLR) illustrated in Fig. 1, with its specifications listed in Table 1 [15].
Table 1: Nikon camera technical specifications [15]

Image device:       24.2 MP DX-format CMOS sensor
Image resolution:   3008 x 2000 (6.0 MP, 3:2)
Focal length lens:  Nikon AF-S DX 18-55 mm f/3.5-5.6G VR lens, 27-83 mm
ISO sensitivity:    Auto, 100-6400 in steps of 1 EV; expandable to 12800
LCD screen:         3" 921k-dot LCD
Fig. 1: Nikon digital camera [15]
The Nikon digital camera is used to develop the mathematical model, which can later be tested on all types of digital cameras, because its details and specifications are well known and it is widely used. Hopefully, a new feature can be added to the characteristics of digital cameras.
In this study, MATLAB is used to build the algorithms implemented within this paper, and TableCurve 2D (v5.01) software is used to determine the best fitting model. Fig. 2 shows a mural painting (test image) with dimensions 9.7 x 13.5 cm, placed orthogonal to the camera axis. The captured images, at a fixed resolution of 24.2 megapixels, are recorded at different zoom degrees (Z = 18, 24, 35, 45, and 55 mm) and different distances (D = 1 m to 10 m, with a step size of 1 m). The dimensions of the field of view in cm are then estimated from the computed dimensions of the mural painting. Algorithm 1 converts the dimensions from pixels to cm and finally obtains the vertical and horizontal vision angles (θh and θw).
Fig. 2: The scene image (mural painting).
Algorithm 1: Compute camera vision angles
Input:
1. The captured image (img).
2. The test image dimension (Lreal = 13.5 cm); then manually select the top and bottom right corners (x1, y1) and (x2, y2), respectively.
Output:
1. The object's length in pixels (Lpixel).
2. The scale factor (SF).
Start Algorithm
1. Measure the length of the scene object (the plane object) in the image plane, in pixels, by selecting with the mouse the points (x1, y1) and (x2, y2) at the ends of the plane object in the image, and compute the length using:

   Lpixel = sqrt((x2 - x1)² + (y2 - y1)²)                                (1)

2. Calculate the scale factor:

   SF = Lreal (cm) / Lpixel (pixel)                                      (2)

3. Convert the image dimensions (h and w) from pixels into centimetres using the scale factor (SF):

   H = SF × h                                                            (3a)
   W = SF × w                                                            (3b)

4. Calculate the vertical and horizontal vision angles based on the vertical and horizontal image dimensions at distance (D) for each zoom degree (Z) according to the following relationships:

   θh = 2 × tan⁻¹(H / 2D)                                                (4a)
   θw = 2 × tan⁻¹(W / 2D)                                                (4b)

where D in equations (4a) and (4b) represents the distance between the camera and the object, and θh and θw are the vision angles shown in Fig. 3.
End Algorithm
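The paper implements Algorithm 1 in MATLAB; the Python sketch below reproduces the same steps, equations (1)-(4), purely for illustration. The selected corner coordinates are hypothetical placeholders, not measurements reported in the paper:

```python
import math

def vision_angles(x1, y1, x2, y2, L_real_cm, img_h_px, img_w_px, D_cm):
    """Steps 1-4 of Algorithm 1: pixel length (eq. 1), scale factor (eq. 2),
    image dimensions in cm (eq. 3), and vision angles in degrees (eq. 4)."""
    L_pixel = math.hypot(x2 - x1, y2 - y1)                       # eq. (1)
    SF = L_real_cm / L_pixel                                     # eq. (2), cm per pixel
    H = SF * img_h_px                                            # eq. (3a)
    W = SF * img_w_px                                            # eq. (3b)
    theta_h = 2.0 * math.degrees(math.atan(H / (2.0 * D_cm)))    # eq. (4a)
    theta_w = 2.0 * math.degrees(math.atan(W / (2.0 * D_cm)))    # eq. (4b)
    return theta_h, theta_w

# Illustrative call: hypothetical corner points on a 3008 x 2000 image,
# L_real = 13.5 cm (the mural dimension) and D = 1 m.
print(vision_angles(x1=2600, y1=820, x2=2600, y2=1173,
                    L_real_cm=13.5, img_h_px=2000, img_w_px=3008, D_cm=100.0))
```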
Fig. 3: Angles of the camera vision for the mural image.
4. RESULTS AND DISCUSSION
a. Results
Fig. 4 shows the camera vision angles (θw and θh) as a function of the zoom degree (Z) for different distances (D).
Fig. 4: The relationship between (Z) and (a) the horizontal vision angle, (b) the vertical vision angle.
Fig. 4 shows that the angles of view (θw and θh) decrease as the zoom degree (Z) increases, i.e. the camera vision angles have an inverse relationship with zoom (Z).
The relation between the ratio (R = θh/θw) and the zoom degree (Z) is extracted for different distances (D), as shown in Fig. 5. The behaviour of R is almost constant at about 0.69 for all zoom values and all distances, because the ratio between the vertical and horizontal angles is the same in all cases.
Fig. 5: The relationship between the ratio R and the zoom degree (Z).
Tables 2 and 3 list the values of the measured vision angles (θw and θh), respectively.
Table 2: The vertical camera vision angle θw (degrees) for each zoom degree Z (mm) and distance D (m)

D (m)     Z = 18   Z = 24   Z = 35   Z = 45   Z = 55
1         60.99    51.20    35.83    28.34    23.78
2         60.99    48.33    36.04    28.21    23.50
3         58.00    52.76    36.70    28.34    23.88
4         57.06    49.02    36.93    29.03    21.68
5         64.28    48.68    39.85    30.68    28.68
6         60.99    52.76    37.39    29.61    25.10
7         59.97    43.98    34.99    29.03    24.07
8         60.99    45.15    35.62    29.61    24.68
9         59.46    51.58    38.83    31.49    23.59
10        58.90    43.98    34.88    28.82    24.89
Average   60.17    48.75    36.71    29.32    26.58
Table 3: The horizontal camera vision angle θh (degrees) for each zoom degree Z (mm) and distance D (m)

D (m)     Z = 18   Z = 24   Z = 35   Z = 45   Z = 55
1         42.77    35.34    24.26    19.06    15.94
2         42.77    33.22    24.41    18.97    15.75
3         40.46    36.50    24.87    19.06    17.00
4         39.75    33.73    25.03    19.54    16.55
5         45.34    33.48    27.10    20.67    15.62
6         42.77    36.50    25.63    19.93    16.84
7         41.97    30.06    23.67    19.54    16.14
8         42.77    30.90    24.10    19.93    16.55
9         41.59    35.62    26.38    21.23    15.81
10        41.21    30.06    23.59    19.93    16.69
Average   42.14    33.54    24.88    19.73    16.19
b. Estimated model
The results at different zoom degrees and different distances are fitted using the TableCurve 2D software to obtain the best mathematical model for the vision angles. The best fitting equation between the camera vision angles and the zoom degree (Z) is:
   θi = ai + bi / Z                                                      (5)

where θi is the vision angle, ai and bi are constants determined by the type of digital camera used, and i is h or w. The data computed from the introduced model are compared with [7]; the parameters differ according to the camera model, but the main model is the same.
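The paper performs these fits with TableCurve 2D. As a cross-check only, the same y = a + b/x fit can be reproduced with SciPy (an assumption, not the authors' tool) on the average θw values of Table 2:

```python
import numpy as np
from scipy.optimize import curve_fit

# Zoom degrees (mm) and the average theta_w values taken from Table 2
Z = np.array([18.0, 24.0, 35.0, 45.0, 55.0])
theta_w_avg = np.array([60.17, 48.75, 36.71, 29.32, 26.58])

def model(z, a, b):
    """Equation (5): theta = a + b / Z."""
    return a + b / z

(a_w, b_w), _ = curve_fit(model, Z, theta_w_avg)
print(f"a_w = {a_w:.2f}, b_w = {b_w:.2f}")
```

The coefficients obtained this way are of the same order as the averages reported in Table 4 (8.40 and 951.38), though not identical, since Table 4 averages ten per-distance fits rather than fitting the averaged angles.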
The inverse proportionality between the viewing angles (θw and θh) and the zoom degree (Z), for each distance value (D), is shown in Fig. 6 and Fig. 7.
Fig. 6: The inverse proportionality of the camera vision angle (θw) with the zoom degree Z for different distances (D) for the Nikon camera.
Fig. 7: The inverse proportionality of the camera vision angle (θh) with the zoom degree Z for different distances (D) for the Nikon camera.
A built-in curve-fitting function of the TableCurve software extracts the fitting coefficients (ai and bi) plotted in Figs. 6 and 7. The correlation values (r²) between the data and their fitting curves are listed in Tables 4 and 5 for the vertical and horizontal vision angles, respectively, for different distances (D). The average values of the fitting coefficients (ai) and (bi) are given at the end of each table.
Table 4: The constant coefficients (aw, bw) of the vertical camera angle (θw) for each distance (D)

D (m)     aw       bw         r²
1         6.225    1017.130   0.999
2         6.298    996.400    0.999
3         8.220    954.000    0.996
4         10.220   876.180    0.993
5         9.960    1008.000   0.999
6         8.355    987.363    0.993
7         7.859    919.250    0.997
8         8.150    934.560    0.995
9         9.907    935.340    0.992
10        8.870    885.580    0.994
Average   8.400    951.380
Table 5: The constant coefficients (ah, bh) of the horizontal camera angle (θh) for each distance (D)

D (m)     ah      bh        r²
1         3.08    733.87    0.998
2         3.15    718.31    0.999
3         6.03    628.33    0.995
4         6.98    630.48    0.994
5         4.37    635.20    0.999
6         4.54    714.16    0.994
7         4.28    661.67    0.996
8         4.43    674.70    0.994
9         5.71    674.29    0.993
10        3.08    636.63    0.992
Average   4.56    670.76
These average values are then used to determine the best fitting models, as follows:

   θw = aw + bw / Z                                                      (6)
   θw = 8.40 + 951.38 / Z                                                (7)
   θh = ah + bh / Z                                                      (8)
   θh = 4.56 + 670.76 / Z                                                (9)
Tables 6 and 7 list the estimated vision angles (θwe and θhe) obtained from the empirical models according to equations (7) and (9), respectively.
Table 6: The estimated vertical camera vision angle θwe (degrees) for each zoom degree Z (mm) and distance D (m)

D (m)     Z = 18    Z = 24    Z = 35    Z = 45     Z = 55
1         62.730    48.600    35.280    28.827     24.710
2         61.653    47.814    34.766    28.440     24.410
3         61.220    47.970    35.477    29.420     25.560
4         58.896    46.727    35.253    29.690     26.150
5         65.960    51.960    38.760    32.360     28.280
6         63.208    49.495    36.560    30.296     26.307
7         58.920    46.160    34.123    28.2867    24.572
8         60.070    47.090    34.850    28.910     25.140
9         61.870    48.879    36.630    30.690     26.910
10        58.060    45.760    34.172    28.549     24.970
Average   62.730    48.600    35.280    28.827     24.710
Table 7: The estimated horizontal camera vision angle θhe (degrees) for each zoom degree Z (mm) and distance D (m)

D (m)     Z = 18   Z = 24   Z = 35   Z = 45   Z = 55
1         43.85    33.66    24.05    19.39    16.42
2         43.05    33.07    23.67    19.11    16.21
3         40.93    32.21    23.98    19.99    17.45
4         42.00    33.25    24.99    20.99    18.44
5         45.65    30.83    26.51    18.48    15.91
6         44.21    34.29    24.94    20.41    17.52
7         41.03    31.84    23.18    18.98    16.31
8         41.91    32.54    23.70    19.42    16.69
9         43.17    33.80    24.97    20.69    15.96
10        38.45    29.61    21.27    17.23    14.65
Average   43.85    33.66    24.05    19.39    16.42
The two resulting empirical models (θwe and θhe) have been tested for different D and Z values to compute the camera vision angles from equations (7) and (9), respectively. The comparison between the real practical values (θw and θh) and the estimated values (θwe and θhe) is shown in Tables 8 and 9; a good agreement between the real and estimated values, with low absolute errors, can be noted. Therefore, the introduced method rests on a mathematical background and matches the real-world dimensions. The difficulty lies in matching the shooting angles of the used camera, understanding how it works, and designing a model.
Table 8: Comparison between (θw) obtained theoretically and practically, and the percentage error between them, at distances (D)

D (m)   Z (mm)   θw      θwe     Absolute error (%)
1       18       60.99   62.73   2
1       24       51.20   48.60   5
1       35       35.83   35.28   1
1       45       28.34   28.83   1
1       55       23.78   24.71   3
3       18       58.00   61.22   5
3       24       52.76   47.97   9
3       35       36.70   35.48   3
3       45       28.34   29.42   3
3       55       23.88   25.56   12
5       18       64.28   65.96   2
5       24       48.68   51.96   6
5       35       39.85   38.76   2
5       45       30.68   32.36   5
5       55       28.68   28.28   1
7       18       59.97   58.92   1
7       24       43.98   46.16   4
7       35       34.99   34.12   2
7       45       29.03   28.28   2
7       55       24.07   24.57   2
9       18       59.46   61.87   3
9       24       51.58   48.88   5
9       35       38.83   36.63   6
9       45       31.49   30.69   2
9       55       26.59   26.91   1
Table 9: Comparison between (θh) obtained theoretically and practically, and the percentage error between them, at distances (D)

D (m)   Z (mm)   θh      θhe     Absolute error (%)
1       18       42.77   43.85   2.0
1       24       35.34   33.66   4.0
1       35       24.26   24.05   0.8
1       45       19.06   19.39   1.0
1       55       15.94   16.42   2.0
3       18       40.46   40.93   1.0
3       24       36.50   32.21   13.0
3       35       24.87   23.98   3.0
3       45       19.06   19.99   4.0
3       55       17.00   17.45   2.0
5       18       45.35   45.65   0.6
5       24       33.48   30.83   8.0
5       35       27.10   26.51   2.0
5       45       20.67   18.48   11.0
5       55       15.62   15.91   1.0
7       18       41.97   41.03   2.0
7       24       30.06   31.84   5.0
7       35       23.67   23.18   2.0
7       45       19.54   18.98   2.0
7       55       16.14   16.31   1.0
9       18       41.59   43.17   3.0
9       24       35.62   33.80   5.0
9       35       26.38   24.97   5.0
9       45       21.23   20.69   2.0
9       55       15.81   15.96   0.9
5. CONCLUSION
The modelling of the vision angles and their relationship with the optical zoom is achieved using an image processing algorithm. The resulting mathematical model can be used for different types of camera; the only difference will be the parameters a and b. These parameters can be estimated easily for different optical zoom degrees and different distances between the camera and the test image. The ratio between θh and θw (R ≈ 0.69) is constant over a long range of distances (D) and zoom degrees (Z), which means that this feature represents one of the parameters that can be relied on for the Nikon camera. As a result, it is a good way to determine camera quality or specifications. It is now possible to know this feature for this type of camera or for other types.
REFERENCES
[1] Shirmohammadi S and Ferrero A 2014 Camera as the instrument: the rising trend of vision based measurement IEEE Instrumentation & Measurement Magazine 17 41-47
[2] Filipe S and Alexandre L A 2013 From the human visual system to the computational models of visual attention: a survey Artif. Intell. Rev. doi:10.1007/s10462-012-9385-4
[3] Stirman J N, Smith I T, Kudenov M W and Smith S L 2014 Wide-field-of-view, twin-region two-photon imaging across extended cortical networks bioRxiv doi:10.1101/011320
[4] Ragan E D, Bowman D A, Kopper R, Stinson C, Scerbo S and McMahan R P 2015 Effects of field of view and visual complexity on virtual reality training effectiveness for a visual scanning task IEEE Transactions on Visualization & Computer Graphics 21 794-807
[5] Wilding D, Pozzi P, Soloviev O, Vdovin G, Sheppard C J and Verhaegen M 2016 Pupil filters for extending the field-of-view in light-sheet microscopy Optics Letters 41 1205-8
[6] Sihler H, Lübcke P, Lang R, Beirle S, de Graaf M, Hörmann C, Lampel J, Penning de Vries M, Remmers J, Trollope E, Wang Y and Wagner T 2017 In-operation field-of-view retrieval (IFR) for satellite and ground-based DOAS-type instruments applying coincident high-resolution imager data Atmos. Meas. Tech. 10 881-903
[7] Abbas H Kh, Al-Saleh A H M and Al-Zuky A A D 2015 Estimate mathematical model to calculate the view angle depending on the camera zoom International Journal of Scientific & Engineering Research 6 497-504
[8] Carr D 2017 Angle of view vs. field of view. Is there a difference and does it even matter? (https://shuttermuse.com/angle-of-view-vs-field-of-view-fov-aov/)
[9] Glaholt M G 2016 Field of view requirements for night vision devices (DRDC Toronto Research Centre)
[10] Campbell J B and Wynne R H 2011 Introduction to Remote Sensing 5th edition (The Guilford Press)
[11] Pajdla T 2011 Stereo geometries of non-central cameras (Czech Technical University in Prague)
[12] Seitz S M 1997 Image based transformation of viewpoint and scene appearance PhD thesis (Computer Sciences Department, University of Wisconsin-Madison)
[13] Lu M-C, Hsu C-C and Lu Y-Y 2010 Distance and angle measurement of distant objects on an oblique plane based on pixel variation of CCD images Instrumentation and Measurement Technology Conference (I2MTC), 2010 IEEE pp 318-22
[14] Kouskouridas R, Gasteratos A and Badekas E 2012 Evaluation of two-part algorithms for objects' depth estimation IET Computer Vision 6 70-8
[15] https://www.imaging-resource.com/PRODS/nikon-d3200/nikon-d3200DAT.HTM